Safe and secure?
Graham Burdis: Over the years, we have hosted a number of cybersecurity roundtable discussions and while the level of sophistication in terms of attacks has heightened, the one thing that has remained the same is directors’ discomfort about IT and cybersecurity. Has the nexus between IT, cybersecurity and the board gotten worse?
Richard Watson: My perspective, on Australia in particular, is that boards have woken up to the cyber issue, particularly over the last 24 months. I think that has been driven largely, as you say, by fear and uncertainty about whether they’re directly responsible and accountable for cybersecurity and what the implications for them as individuals may be. I also think this is heightened by the mainstream media. Directors are now asking questions such as: “Are we secure?” I think some of the questions they’re asking are the wrong questions, but at least they’re asking questions. The dialogue has started and there is engagement there, but the level of understanding on boards is probably still behind where the actual industry challenges really lie.
Jinan Budge: I agree. I think it has gotten better, not worse. It has certainly intensified and it has intensified for a number of reasons. One of those reasons is certainly the media. I remember that in 2008, we used to have to jump up and down to talk about the subject of security with senior executives and with the board. That’s absolutely no longer the case. The questions are now coming from the top – so there has certainly been that change.
I also agree with Richard that it’s not always the right questions that are being asked, and there’s a huge opportunity in bridging that gap and communicating more effectively. In fact, I think it is the biggest gap. Globally, hiring has started to change to reflect the closing of that gap. It’s no longer always people with a traditional technical security background who are being hired into these positions; those days are done and dusted. There are people from communications, change management, program management and other diverse backgrounds, which reflects the need to close the gap. More importantly, we need more people with leadership skills.
Christoph Strizik: I think there is a recognition that it goes beyond IT. It’s not just an IT problem. If you look at cyber risk, it is essentially a driver of all your existing risks. If you are building brakes for cars, you need to build them to a certain safety standard and determine whether a cyber risk driver can impact the safety of your brakes. This is not about IT; this is about safety and operational risk management, and therefore the skillset you need is less about IT and more about broader risk management and stakeholder engagement.
Graham Burdis: Are directors aware of the changing needs?
Sarv Girn: I think boards are now very aware that cybersecurity is a key matter in terms of the risk it presents, and for some, it is an opportunity to generate greater revenue if their strong credentials in this area are promoted to customers. In terms of the education of directors, I think there is too much scaremongering going on. While there are many threats out there, and the consequences of these can be material to an organisation, as an industry we need to first focus on having the right information framework that allows directors to ask the right questions and spot the red flags. This is akin to directors being able to exercise their fiduciary responsibilities using the well-defined financial framework that accountants and auditors have embedded into corporate governance. In addition, legal obligations are also well defined in a framework of reporting requirements. Financial and legal frameworks have been around for years, decades, if not centuries in some instances.
Unfortunately, in the cybersecurity world there isn’t a definition of what is good and how to go about measuring and reporting this at a board or organisation level. I think it’s about time a body, whether it is a standards body or a governance body, focuses on establishing the right framework, with the necessary reporting to the board, that makes the health of a company’s security crystal clear.
Christoph Strizik: It’s a good point. The cybersecurity industry is the fastest growing industry at the moment and security vendors are contacting directors to arrange meetings, but they’re all coming in with fear, uncertainty and doubt (FUD). It is not a good approach to build a business case based on FUD. We need to be coming back to what’s right for the organisation.
Richard Watson: I think some of the questions directors are asking feed that FUD because they’re asking simple, basic questions about numbers of attacks and security. It might be better to ask questions about how to respond to a cyber incident and what the company’s risk appetite for cyber is. Boards need to accept that they are going to have some breaches, but the issue is which ones will they accept and which ones will they not accept – what are the threats out there and what are their competitors doing? There’s a more pragmatic set of questions that can be asked, but that comes with a better understanding of the topic than a simple facts-and-figures approach about viruses or incidents.
Sarv Girn: Having a pragmatic set of questions is exactly where it should start. In our risk appetite, we have a statement on cyber that identifies our risk appetite and willingness to accept an incident. Behind this is a framework that enables articulation of other aspects related to operational risk, such as the availability of systems and recovery times. A concept of tiering is used to identify technology assets in terms of their importance to the bank’s daily operations. This framework then allows the right questions to be asked.
It’s about having a risk-based approach, and I think the discussion in a boardroom needs to focus on a range of frank questions. What is the risk appetite around cyber? How many incidents are acceptable and for how long? How many customer records are you willing to lose? These are hard questions, but they need to be considered by the executive and board at some stage.
Graham Burdis: Legal frameworks have already been published, but the set of rules and frameworks you are talking about still needs to be established. There is also a disclosure element to that as well. There seems to be a reluctance to disclose within boards and organisations – globally, not just in Australia. And because there is no disclosure, it is very hard to find out how many cyber attacks a company may have suffered. They don’t want to discuss it. Surely this continues to be an issue.
Christoph Strizik: I think you have to be mindful of what you’re trying to achieve. The customer always comes first. If you have a customer data breach, that’s quite serious and can lead to harm, and you should really tell your customers about that. But if the website is being impacted and falling over, depending on the website, there might be a lower impact on customers and there is little value in reporting.
I think companies need to be looking at what they are trying to achieve with any reporting. All companies need to submit reports to the ASX and outline the key risks under the ASX principle of risk management. Any organisation that has identified cyber risk as a key risk would probably disclose that. While these reports may not tell you how many incidents a company has had, what they will tell you is that the organisation identifies cyber risk as a key risk and a strategy and plan is in place.
Perhaps there is one aspect to add to this, and that is if you look at the existing frameworks you mentioned earlier, you can use each of them and adopt a cyber angle. You can apply all the frameworks that exist already; you don’t need something separate.
Jinan Budge: I agree with you 100 per cent and in fact, I would worry about introducing yet another standard and framework. I think the key thing that has worked very successfully is customising any of these frameworks to a particular organisation, its risk profile and understanding the threat landscape for the particular organisation. It’s also very important to embed security in existing governance committees within organisations if possible, rather than creating new ones.
Sarv Girn: Until the point where cybersecurity is more formally built into corporate governance, there are many existing industry standards that can be utilised as a starting point. The trick is to be clear on which standards a company will align with, comply with, or aim to exceed. Companies also need a conscious approach to measuring and tracking compliance. Just pick the one or two standards that are relevant to your organisation and make them the baseline in your risk profile against which you provide assurance.
Graham Burdis: Would that be a good starting point for directors?
Sarv Girn: Yes. For example, we started off as a minimum complying with the Top 4 of the Top 35 ASD cybersecurity strategies, which provide protection against over 80 per cent of known attacks in the current climate, and very soon we will have full compliance with the remaining strategies. In addition, we have embraced the ISO standards as well. We have just got independent accreditation that our internet gateways are now ISO 27001 compliant. A focus on key standards allows us to provide greater assurance on the maturity of cyber controls.
Richard Watson: This is where risk appetite comes in. You have to get the board to a place where they understand that the business cannot be 100 per cent cyber proof. It just doesn’t exist and it’s impossible to achieve. Instead, businesses need to look at what’s most important in terms of protection and what needs a higher level of control. That way you get differential investment. From there, they can then work on the response. This year’s Census is a classic example where the response was not prepared. Similarly, the response by UK telco TalkTalk to a customer breach last year made it one of the poster-child incidents of cyber attacks in recent times. It wasn’t handled in the most effective way, which resulted in focus and attention being heaped upon it, as well as customer attrition. Other telcos have had similar types of incidents but have handled them effectively and discreetly, meaning they went almost unreported. I think focusing on response is key, and that requires an acceptance that things are going to happen. Companies need to think in those terms.
Kelvin McGrath: Boards are used to making lots of risk-based decisions and everyone has talked about how framing that appetite needs to be within the same frameworks that you currently have now. I definitely agree with that.
But I wonder if we need to go a little bit further on what Sarv said about having standards that are adhered to. Maybe Australia as a whole needs to say: if I’m going to invest in a top 100 company, I’d expect that company to have a cybersecurity level of X.
The severity of what can happen with cybersecurity has dramatically increased over recent years and maybe it is time to introduce a standard. It wasn’t time 10 years ago, it might have been time five years ago, it probably was time two years ago. It’s definitely time now. We need to stand up as a country and say that when you invest in a top 100 company here are the standards that are required.
Jinan Budge: I do not support the position of having to comply with a particular standard. This is simply because I have been in this profession for 18 years and I have seen a very big difference between companies ticking a box (for example, by getting certification to a standard such as ISO 27001 or 27002) versus actually achieving risk-appropriate security. Ticking a standards compliance box does not necessarily equal security.
Kelvin McGrath: What I’m suggesting is that we learn from some of the other disciplines. Look at what’s happened with environmental reporting and the way we approach environmental matters; this has been elevated and now directors consider environmental reporting when making risk-based decisions. There are standards we need to adhere to in order to be a proper Australian company; surely security can do the same thing.
Richard Watson: I think that’s the way they’re going in the US and I would expect over the next five years for that sort of logic to also manifest itself here in Australia. As you say, now environmental reporting is part of your financial statements and therefore your audit. I think cyber should be too.
Graham Burdis: I have the figures from a survey conducted at the Black Hat USA 2016 conference. Respondents to the survey said that the priorities set by the business are not the priorities considered most important by security professionals. This seems to illustrate a dysfunction; we’re saying that they should be complying and there should be standards, but the professionals think the focus should be elsewhere. How do you resolve that dysfunction?
Sarv Girn: That’s where it comes down to choosing the right standard that’s fit for purpose for your business. It may not be ISO, it may not be ASD or the government standards; it may just be PCI that relates to the payments industry. Some of the standards are actually very technical in nature; they are not just about governance, process and reporting. Choosing the right ones must be done very carefully.
Richard Watson: I think the other thing that might help address the current disconnect you described is a risk-based approach. I think the danger of a standards-based approach is that you are spraying your energy and effort far and wide to achieve a standard instead of actually focusing on the real risks that your organisation faces based on the threat intelligence that you have, or the risk appetite that you have, or the valuable information that you have.
Clearly, standards form a part of how you understand a control environment and the level of maturity of an organisation, but they don’t really tell you where to focus your energy and differentiate your investment. I think it should be a combination of taking a standards approach but then overlaying a more pragmatic view of risk on top of that.
Jinan Budge: I’m going to add something else to both perspectives. We need to have a different way of hiring; a way of diversifying our hiring processes within cybersecurity. We desperately need people with diverse backgrounds and qualities such as governance, risk management, communications, conflict resolution, stakeholder management. We need these as well as the more traditional technical and operational skillsets such as incident handling, penetration testing, vulnerability management and patching.
Vicki Anderson: Referring to what Graham said, I would not say it is a “dysfunction”; however, the desire to comply with security standards needs to be carefully balanced with an organisation’s risk appetite as set by the board. While security standards set a benchmark, they cannot be adopted to the detriment of the organisation’s goals.
In my organisation we have over 300,000 students on our network, which presents an interesting security challenge. The risk appetite has to be very clearly articulated as it is a little different from most corporates, in that security or cyber safety is defined by the potential impact on students.
The risk framework, which defines that risk appetite, is set by the board and managed by the audit and risk committee (ARC). Cybersecurity is managed through the CEO and the CIO. No one person has carriage of cybersecurity; it is a responsibility in everyone’s position description. Every incident that exceeds a particular size is monitored by the ARC and reported to the board.
One of the things that we have worked very hard on is connection. IT staff are very aware of the technical impacts of cybersecurity, but it helps when they understand how important their diligence is to the organisation. The chair of the ARC provides that connection through communications at IT staff meetings, or more broadly through video, articulating the appetite for risk and the key factors. She doesn’t come down to the IT shop floor and manage the staff; she’s a director and keeps her distance, but she provides that connection.
Richard Watson: I think that really reinforces the point around helping people understand the impact of what they are doing and why it is important in terms of making cultural change stick.
Christoph Strizik: Perhaps this is where we can learn from other industries and plan a bit better. For example, if we look at the UK, in 2013 voluntary reporting on cyber governance was introduced for the top 350 companies. These companies are being asked to report on how cyber is governed and this is measured on a number of dimensions.
I think it’s a good first step to understanding how well cybersecurity is managed in an organisation. I believe the Australian government is in the process of establishing something similar.
Richard Watson: That is actually being rolled out here. It’s being done over the next six months, with a view to unveiling the first set of these results in March. It’s highly aligned to the UK model.
Christoph Strizik: Risk-based planning is key as well. Organisations produce so much information every day that you can’t protect all of it equally. What you need to find out is which one per cent of information assets, if compromised, would significantly harm the organisation or its stakeholders. Once you identify and focus on these information assets, then you know exactly where to apply the standards and controls that you mentioned before. I guess the concern I have with some of the guidance out there is that it focuses mostly on technical controls. You have to take a holistic approach and look at the business process controls as well as people controls, such as educating your workforce on safe and unsafe behaviours when working with information or technology assets.
Graham Burdis: One of the challenges is that statistics are difficult to use. It has been reported that 48 per cent of people who received cyber training were not able to spot a cyber attack within one month after the training. I think boards have got to be careful of this “warm shower” effect, where they feel everything is fixed but it’s not.
Jinan Budge: That’s why you have to be so careful about any tick-the-box exercises. In my professional experience and judgement, while important, computer-based training for cybersecurity can be one of those tick-the-box exercises. As passionate as I am about educating, influencing and engaging, training can be one of the least effective ways to create the behavioural changes that are desperately required in cybersecurity. We should not be investing important time and energy into that particular conversation. Rather, we should focus our attention on broader engagement, educational and influence campaigns that create an impact, that are personal and which actually achieve behavioural change objectives.
Christoph Strizik: It is important but I don’t think it solves the problem.
Sarv Girn: I don’t think it’s a case of don’t do the user training. It is about also having an ongoing focus on the themes in that training. For example, most training materials will focus on the syntax of your password, such as being eight characters long with numbers and letters. But you need to follow it through with ongoing awareness. For example, a monthly or quarterly dashboard that says IT security was able to guess a number of passwords across specific areas of your organisation. That creates awareness, that creates a culture, and that creates a bit of competitiveness, as people don’t want to be on that list of individuals with an easy-to-guess password. The annual training is then brought to life throughout the year.
Christoph Strizik: Passwords and typical security questions are ineffective. People pick passwords that can be easily hacked and information such as: “What is your mother’s maiden name?” can be easily found out on social media.
Graham Burdis: Do things like dual verification help? The mobile phone is ubiquitous across the globe and we all have one unique identifier – our mobile phone number.
Richard Watson: It increases the security profile, but it gets in the way of the customer experience, particularly when people are travelling overseas and have to pay $3 to receive a text message.
Christoph Strizik: I call it adaptive security. Depending on the transaction and the risk context, you ramp up your security if needed. If someone is making a transaction that appears unusual, then maybe you have to use the fingerprint on your phone or a one-time code sent via SMS to authenticate yourself. The customer experience or usability versus the security of an app or website needs to be carefully balanced and adaptive security helps with that.
Graham Burdis: How does a board know when they have reached a point where spending more money isn’t going to create a return? At what stage do they feel comfortable about making that decision?
Kelvin McGrath: If it’s a risk-based approach it is about the law of diminishing returns. If you want to get to 99 per cent success rate, then that will cost you a lot more than getting to 80 per cent. It’s the same as all other risk-based decisions that a board makes, so I don’t see the conversation as any different.
Christoph Strizik: I guess the other aspect is using security investment to enable your organisation to consume emerging technologies safely. Rather than fearing the cloud, apply controls that monitor whether important assets are leaving the organisation or are perhaps going to unsafe places in the cloud. Solutions in this space are emerging but you have to be prepared to work with start-ups and innovators in the same way that your business wants to innovate by adopting emerging technologies. Use security as an enabler that supports the business journey.
Graham Burdis: Is this approach filtering down to the SME market?
Christoph Strizik: I think that is the elephant in the room. In Australia, more than half of our gross domestic product (GDP) is from small-to-medium businesses and more than 80 per cent of those are online. They are users of internet technology, but they have limited cybersecurity awareness or capability. When they have incidents, it means they are out of business because they haven’t got the resilience or the cash flow strength to sustain these impacts. There needs to be more support provided to those organisations.
It’s interesting, because most people think that the cloud is not secure. I would almost argue if you are a small business and you are not going into the cloud you are missing out on security features that you wouldn’t be able to afford yourself. So there’s actually a strong argument for businesses to leverage some of these modern technologies.
Richard Watson: I think the other feature of small business is obviously that the human element becomes much more important because they haven’t got the dollars to invest in technology controls. Australia is unique in that the share of GDP that small businesses create in this country is very high relative to other first-world economies. Those small businesses are often serving large businesses as well.
Graham Burdis: So how do directors of larger companies manage that supply chain risk?
Vicki Anderson: When there are multiple suppliers in the IT delivery chain you need to ensure that your vendor has the appropriate security controls to satisfy your risk audit and that their suppliers also have appropriate security controls. Ideally this would be determined during the tender process, however this is not always the case, so controls must be drafted to allow for periodic review or audit.
Unfortunately, I have found with our small suppliers that although the contract stipulates periodic control reviews, when it comes to performing the audit, not only are they unaware of the compliance standards, but often they have never been asked to provide evidence of controls.
While compliance is not pre-eminent, certification recognising that a supplier has achieved a certain level of cybersecurity compliance could provide confidence in multi-vendor supply chain agreements. Rather than imposing additional compliance activities, perhaps the annual external audit could be leveraged to achieve this confidence.
One of the things that does concern me is that there seems to have been a recent rise in the number of “security professionals” peddling FUD about sensational issues that divert the board’s attention from real and present risks.
Jinan Budge: This goes back to the issue of the people we hire and develop in the profession. We must change our hiring and people development processes. The more we focus only on technical training for security, the more at risk we are as a profession of becoming irrelevant. This is absolutely one of the biggest issues facing our profession. Our hiring and people development has to be diversified. We have to bring different people with different skillsets into the profession. That will invite different thinking and alternative ways of working and presenting.
Kelvin McGrath: The people we’re bringing through and educating shouldn’t just be science, technology, engineering and maths (STEM) graduates. We need additional disciplines to actually go into security as well.
Jinan Budge: Globally that’s what’s happening. The chief information security officers that are being recruited in global organisations are no longer necessarily from security backgrounds and long may that continue. I can’t wait until it starts happening here in Australia.
Richard Watson: I think that’s something that a lot of organisations in Australia are exploring at the moment; who is accountable for cyber risk and who is responsible for operating controls? The person accountable for cyber risk often now reports directly to the CEO. We’re also seeing cybersecurity converge with other areas of security, such as physical security, IT security and fraud. Accountability for these risks is often combined under one chief security officer type role, largely because they’re all inter-related.
Christoph Strizik: I think the fundamental point is that you can never be 100 per cent secure.
Vicki Anderson: I think a board can’t rely 100 per cent on their internal security capability. The board should engage external consulting organisations periodically to gain a view of what’s over the hill. Often you don’t know what you don’t know, but you have to be really careful not to engage an organisation that merely reinforces your own fears.
Christoph Strizik: You’ve touched on something that we really need to think about. The fact is we can’t do this by ourselves. It needs collaboration across industries and government.
Richard Watson MAICD
LEAD PARTNER ANZ CYBER SECURITY RISK MANAGEMENT, EY
Sarv Girn FAICD
CHIEF INFORMATION OFFICER, RESERVE BANK OF AUSTRALIA
Christoph Strizik AAICD
HEAD OF IT RISK AND CYBER SECURITY, ORIGIN ENERGY
Vicki Anderson MAICD
CHIEF INFORMATION OFFICER, CENET
Kelvin McGrath GAICD
FOUNDER, MEETING QUALITY
Jinan Budge
CYBER SECURITY STRATEGY AND ENGAGEMENT LEADER, QANTAS