Following on from Part II: Intro, this report will establish the importance of humans in our systems and explore how we can integrate zero-trust principles into operational security (opsec).
Task
Create a ten-minute presentation that builds on the previous presentation and:
- Explain the information environment
- Outline threats to the information environment
- Explain how the effects of attacks can impact operations
- Introduce opsec and outline its benefits
- Introduce zero-trust and look at how its principles could be applied to opsec
Presentation
In this presentation we explore the human element of cybersecurity.
Transcript
Slide 1
Thanks for joining me. Last time we went through an overview of the project, so I thought I’d do a quick recap before we get into today’s presentation.
- We learnt that 68% of Australian Government Reportable Data Breaches – as defined in the Privacy Act – and, more generally, 88% of cyber incidents were caused by human error.
- We established that security in the way we work can assist in reducing human error and that operational security is an applicable approach.
- We introduced zero-trust and I proposed that the “zero-trust culture” called for in the Australian cybersecurity strategy could be achieved by applying zero-trust principles to operational security.
- With that in mind, the phrase “zero-trust culture” could send a negative message to most people – without the explanation I gave last time, it sounds like a place where staff don’t trust each other – so we’ll park the phrase for now.
- We’ll look at the causes of Australian Government data breaches from a human factors perspective.
- We’ll introduce a few models used in cyber and information security.
- Then we’ll bring this together to establish the importance of the human element in cybersecurity.
Today we are going to deep dive into the human element of cybersecurity.
Let’s get into it.
Slide 2
Let’s have a closer look at the reportable data breaches affecting the Australian Government over the last reporting period. We’ll leave out system errors since there weren’t any – well done IT teams!
- There were 26 human error incidents: 13 cases of personal information being sent to the wrong recipient, 11 of unauthorised disclosure, and 2 of paperwork or storage devices being lost.
- Of the 12 malicious or criminal incidents: 8 were social engineering – that’s when a bad actor manipulates staff into giving them what they want; 1 was a cyber incident – things like hacking, ransomware and stolen credentials; 1 involved stolen paperwork or storage devices; and 2 were incidents of rogue employees or insiders.
This is interesting, because a significant number of the malicious attacks actually come back to human factors. Let’s dig into this.
Slide 3
There were two incidents involving insiders. There are four types of insider, so let’s have a closer look at them:
- Accidental, often due to: a process or procedure failure (OAIC, 2024); rushing and performance pressure (Prabhu & Thompson, 2021, pp. 606–607); and social engineering (Prabhu & Thompson, 2021, p. 608), which we’ve just introduced.
- Negligent, people flout rules not realising that their actions pose a risk.
- Mischievous, people flout rules, understanding the risk but without malicious intent.
- Malicious, people flout rules, with malicious intent.
These cover the spectrum of human involvement in incidents, and we can use them to reclassify the government data breaches through a human factors lens, giving us:
- 34 accidental insiders,
- 2 malicious insiders.
That’s 36 of 38 – 95% – of reportable incidents by the Australian Government that are due to the human factors of staff.
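The reclassification above can be sketched as a small tally. The category counts come from the OAIC figures quoted earlier in this presentation; the mapping of each category to an insider type is this project’s own framing, not an OAIC classification.

```python
# Tally the reported breach categories and reclassify them through a
# human factors lens. The insider-type mapping is an assumption made
# for illustration, following the framing used in this presentation.
breaches = {
    "PI sent to wrong recipient": (13, "accidental"),
    "Unauthorised disclosure": (11, "accidental"),
    "Lost paperwork/storage device": (2, "accidental"),
    "Social engineering": (8, "accidental"),  # staff manipulated, not malicious
    "Cyber incident": (1, "external"),        # hacking, ransomware, stolen credentials
    "Stolen paperwork/storage device": (1, "external"),
    "Rogue employee / insider": (2, "malicious"),
}

# Sum the counts per insider type.
totals: dict[str, int] = {}
for count, insider_type in breaches.values():
    totals[insider_type] = totals.get(insider_type, 0) + count

total = sum(count for count, _ in breaches.values())
human = totals["accidental"] + totals["malicious"]
print(totals)                                    # {'accidental': 34, 'external': 2, 'malicious': 2}
print(f"{human}/{total} = {human / total:.0%}")  # 36/38 = 95%
```

Strictly, 36/38 is 94.7%, which rounds to the 95% figure used throughout this presentation.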
As an organisation, we understand human factors in a safety context, and we publish material on it, such as the ‘Safety behaviours human factors engineers resource guide’ (CASA, 2013, p. 13). It stresses the importance of authorised processes and of standard operating procedures so that staff can perform consistently to standard. To me, this sounds like the “approach to work” we covered last time – there are parallels to this project – but we’re going to consider them in a cybersecurity context. Let’s have a closer look.
Slide 4
Now that we’ve established the role of human factors in data breaches, let’s look at the importance of humans from a security perspective. I’ll start with the information environment. It is a broad space where we are all connected by our use of technology (UNSW, 2022). It includes three dimensions:
- The Cognitive dimension, where human decision making takes place. It can be impacted by intangibles such as morale, team cohesion, public opinion and situational awareness (UNSW, 2022).
- The Informational dimension is data-centric (U.S. Government Accountability Office, 2022, p. 1). This is where information is collected, processed, stored and disseminated. It links the physical and cognitive dimensions and is also where automated decision making takes place (UNSW, 2022).
- And the Physical dimension, which is the tangible, real world (U.S. Government Accountability Office, 2022, p. 1). This is where the information environment overlaps with the physical world (UNSW, 2022), and it contains us – the human beings – as well as office facilities, newspapers, books, networking and communication infrastructure, computers and devices (U.S. Government Accountability Office, 2022, p. 1).
These dimensions are interrelated and can affect one another. For example, compromised data – the informational dimension – can impact decision making that relies on that data – the cognitive dimension.
Let’s explore how we can approach this challenge.
Slide 5
Last time, I mentioned that cybersecurity has become a socio-technical problem. What does this mean?
A socio-technical system is a system that has integrated human and technical elements (Davis et al., 2014, p. 171) – it’s the entire system, not just one of its dimensions.
This is relevant because it’s been found that optimising one dimension of a socio-technical system can negatively affect overall system performance (Malatji et al., 2019, p. 241). This has been the case with cybersecurity at a broad level for some time: the focus has been on the technology rather than the whole system when solving our information security problems (Prabhu & Thompson, 2021, p. 604). This may explain why cyber attacks are becoming increasingly successful (Malatji et al., 2019, p. 241).
Let’s look at the elements of this system.
Slide 6
The pillars of People, Process and Technology have a range of applications in organisations, including cybersecurity, where they can serve as the elements of the socio-technical system we just covered. The “people” part of cybersecurity is the most common threat vector, and it is also the most challenging to handle (Couretas, 2018, p. 104). We just covered that the focus of cybersecurity is usually on technology, but popular standards such as ISO27001 and NIST-CSF – which both guide CASA – do contain items relating to people. These include:
- General staff responsibilities,
- Information security awareness, education and training, and
- Disciplinary processes
Does this balance the technology focus I mentioned earlier? These items all relate to people, but more to people’s relationship with IT systems and adherence to rules – they don’t consider the people themselves. Do they help people? I think they mostly help organisations to be compliant.
I also think they shift the responsibility to staff, adding to their cognitive load. Don’t get me wrong, cybersecurity is everyone’s responsibility – you may have read my recent article on Horace on this topic – I’m saying everyone can play their part, without us adding to staff’s cognitive load.
Slide 7
Before we look at how we can create a workforce that is responsible for cybersecurity, without additional cognitive load, let’s look at why we should. Let’s consider our most recent APS survey where:
- 33% of CASA staff are burnt out by their work,
- 79% find it stressful,
- 70% find their work emotionally demanding, and
- only 50% think CASA is concerned with their wellbeing.
The last thing these people need is something else to consider while doing their job. So we need to find a way to engage our people in cybersecurity without adding to the stress and burnout they already feel.
Recognising the human factors in cybersecurity, what can we do about it? Our Security Handbook has good processes and guidance to follow, but I’m talking about a deeper level than that: embedding security into the way we work, so that just by following processes or procedures to complete a task, our staff are acting in a secure manner – acknowledging there may be a learning curve initially.
We can improve our security posture without adding to staff’s cognitive load.
This is where operational security comes in. This is easy to talk about in theory, but how do we make this implementable? In the upcoming report, we’ll explore how zero-trust principles can be applied to our operational security to give us a structured and implementable approach so that we can mitigate human factors while maintaining our just culture and not blaming people when things go wrong.
Slide 8
Ok, we’ve covered a few topics today as we explored the importance of humans in cybersecurity:
- We looked at Australian government data breaches, and found that 95% had a significant human factors element.
- We looked at a few models relevant to cybersecurity, including:
- the information environment, with the interconnected cognitive, informational and physical dimensions
- We looked at socio-technical systems, and the recognition that the whole system needs to be treated, and that focusing on one dimension can reduce the overall system performance
- and we covered the three pillars of People, Process and Technology
- Lastly we discussed the need to consider how to engage people in cybersecurity without adding to their cognitive load
What we’ve done today is demonstrate that human errors in cybersecurity occur in a complex, interconnected environment. I could have just relied on research in the upcoming report that confirms the importance of human factors in cybersecurity, but I think it’s important to demonstrate that it’s not as simple as saying: “it’s people’s fault, we’ll just write policy and give them training”. Yes, people make mistakes – by accident or induced by bad actors – but these are often caused by systemic issues, and mitigating them requires changes to the way the system operates.
Slide 9
Before I wrap up, I’d like to add that we are on trend with this project. There is a shift toward understanding how human factors impact cybersecurity (Prabhu & Thompson, 2021, p. 604), both in academic literature – evidenced in my references – and also within organisations, including federal government – for example the Department of Parliamentary Services was recently seeking Directors who among other things are responsible for “embedding consistent cyber hygiene” and development of “human-centric cyber defence programs”.
So we won’t be alone, I think we are looking at this at the right time.
Our goal is to improve the security posture of our people without adding to their cognitive load – hopefully leaving them more space for engagement. I’m looking forward to exploring this in the upcoming report.
Any questions or thoughts?
References
CASA. (2013). Safety behaviours human factors engineers resource guide. CASA. https://www.casa.gov.au/sites/default/files/2021-06/safety-behaviours-human-factors-engineers-resource-guide.pdf
Couretas, J. M. (2018). An introduction to cyber modeling and simulation. John Wiley & Sons.
Malatji, M., Von Solms, S., & Marnewick, A. (2019). Socio-technical systems cybersecurity framework. Information & Computer Security, 27(2), 233–272. https://doi.org/10.1108/ics-03-2018-0031
OAIC. (2024, February 22). Notifiable data breaches report: July to December 2023. OAIC. https://www.oaic.gov.au/privacy/notifiable-data-breaches/notifiable-data-breaches-publications/notifiable-data-breaches-report-july-to-december-2023
Prabhu, S., & Thompson, N. (2021). A primer on insider threats in cybersecurity. Information Security Journal: A Global Perspective, 31(5), 602–611. https://doi.org/10.1080/19393555.2021.1971802
UNSW. (2022, April 8). 1.2 The information environment. Cyber Operations. https://moodle.telt.unsw.edu.au/mod/page/view.php?id=4523357
U.S. Government Accountability Office. (2022, September). GAO-22-104714, information environment: Opportunities and threats to DOD’s national security mission. https://www.gao.gov/assets/gao-22-104714.pdf