Cyber Security

Luke Hally

Zero-Trust Culture IV: Capstone Report

September 21, 2024

Following on from Part III: Presentation, this report will establish the importance of humans in our systems and explore how we can integrate zero-trust principles into operational security (opsec).

Task

Write a report that investigates the concepts raised in the previous presentations which:

  • Explains the importance of the human element of cybersecurity
  • Provides detailed background on zero-trust and operational security
  • Demonstrates integration of zero-trust principles into operational security
  • Provides a clear explanation of the need to expand the scope of organisational information security and how the above point can assist

Report


Note: with the exception of “socio-technical”, this report replaces the word “technical” with “technological” throughout. This is due to the organisational meaning of “technical” within the stakeholder organisation, where it refers to staff with specific operational knowledge, experience or roles.

1. Introduction

In our previous presentations we explored the ‘zero-trust culture’ called for in the Australian Cyber Security Strategy, concluding that systemic changes were needed to mitigate human factors in cybersecurity. This report will explore these topics in more detail to build an understanding of a human-centred cybersecurity culture.

Cybersecurity is often viewed as a technological problem (Nosworthy, 2000) with an emphasis on technological solutions (Prabhu & Thompson, 2021, p. 604). However, it is more of a people problem (Rainer et al., 2007, as cited in White, 2009). The human element is the threat vector that is both most common and most difficult to mitigate (Couretas, 2018, p. 104), evidenced by 95% of Australian Government notifiable data breaches having a significant human factors element (Hally, 2024b). Because these systems are socio-technical in nature, with integrated human and technological elements (Davis et al., 2014, p. 171), this technological focus may explain why cyber attacks are becoming increasingly successful: optimisation of one socio-technical element can have a negative impact on the overall system (Malatji et al., 2019, p. 241). To improve the system, we need to treat the whole of it, i.e., both the human and the technological elements (Prabhu & Thompson, 2021, p. 604).

As discussed previously, traditional approaches to the human element, which include policy, training and awareness, may shift the responsibility for cybersecurity onto staff. This could add to their cognitive load and, considering the stakeholder’s recent APS Census results (CASA, 2023b) which indicated that staff are overburdened, may counter our commitment to the “psychological health and safety of workers” (CASA, 2022). This report will explore this challenge by establishing the importance of humans in our systems and providing an overview of operational security (opsec) and zero-trust, before discussing how to develop a human-centred cybersecurity culture, where culture is “the way we do things around here” (Lundy & Cowling, 1996, as cited in Corradini, 2020, p. 64).

2. Importance of the human element

In this section, we will investigate the unique roles and attributes of humans within systems and the vulnerabilities they introduce so that we can better understand and mitigate them.

In considering the human element of cybersecurity, there is alignment with another socio-technical system from which we can learn: Cyber-Physical-Human Systems (CPHS) (see Appendix A). These are interconnected systems of physical systems, computing and communication systems, and humans (Sowe et al., 2016, as cited in Annaswamy et al., 2023, p. xxvii), with a focus on interactions between the cyber-physical elements and humans (Annaswamy et al., 2023, p. xxvii).

Humans are a critical part of CPHS (Huang & Zhu, 2023, p. 1), fulfilling a number of roles: system controller, system user, consumer of output, and system input (NIST, 2017, p. 4). CPHS can potentially enhance the human capabilities of “sensing, decision-making, and action” (NIST, 2017, p. 17). However, it is yet to be determined whether this will lead to higher levels of “responsibility and decision-making of the employees or towards higher technological control” (Fantini et al., 2020, p. 1).

This is an important question, not only because it will inform how human vulnerabilities are treated but also because it has societal implications. Recognising this, ethical frameworks apply, and the universalisability test – applying a maxim universally to see if it has a desirable outcome – is suitable, our maxim being “machines should control humans in the workplace”. If machines control humans, humans may be viewed as units of production, which may impinge on the autonomy or dignity of people as moral agents. It could also potentially breach the human rights to privacy, to freedom from arbitrary surveillance, and to just and favourable conditions of work (United Nations, 2009), among others. We would not want a world like this; as Fantini et al. state, we want a world where technology is “the means for workers to continue to work instead of being replaced” (2020, p. 2), i.e., where people are in charge and “essential to the functioning of the socio-technical system” (Zimmermann and Renaud 2019, as cited in Corradini, 2020, p. 60).

Having established this, we need to have regard for people’s dignity and autonomy as we consider how to treat their unique cognitive vulnerabilities (Huang & Zhu, 2023, p. v). These vulnerabilities fall into two categories:

  • Acquired vulnerabilities stem from a lack of awareness or knowledge and from non-compliance with rules; they “can be mitigated by security training, education, and incentive programs” (Huang & Zhu, 2023, p. 57).
  • Innate vulnerabilities involve attention, risk perception, and decision-making. They can be exploited through cognitive overload (Huang & Zhu, 2023, p. 50) as well as through the incorrect belief that risky behaviour is not risky (Huang & Zhu, 2023, p. 54), i.e., risk ignorance. Innate vulnerabilities cannot usually be mitigated by the usual approach of training or security education (Huang & Zhu, 2023, p. 49).

Innate vulnerabilities pose an interesting challenge, both because the accepted approach of training and awareness is ineffective against them and because their triggers can be internal or external, as outlined in Table 1.

Table 1: Examples of innate human vulnerability triggers

| Trigger | Cognitive overload | Risk ignorance |
| --- | --- | --- |
| External (Adversary) | Inundating a public-facing inbox with applications/enquiries which appear legitimate, taking up staff time and energy, reducing morale and impacting process availability. | Scammers impersonating government agencies set up to protect victims from scammers (Maguire, 2024), leading victims to think they are doing the right thing while being scammed. |
| Internal (Organisational) | Processes or procedures which have extraneous steps/approvals or are confusing; multiple or constantly changing points of contact; fluid communication channels (e.g., an undefined mix of email, chat, phone). Can lead to cognitive overload due to the need to focus across multiple areas. | Incorrect, incomplete or missing documentation, advice or direction; mistakes in previous process stages. Leads to people doing things the wrong way without realising. |

In order to mitigate innate vulnerabilities, Huang & Zhu recommend using AI. While this approach may have merit in particular use cases, it does not inform a cybersecurity culture. According to Alvero (2021), staff’s core responsibility regarding cybersecurity is to follow organisational policies and procedures, but as already noted, this approach can add to the cognitive load of already overwhelmed staff. This author agrees with Allhoff & Henschke that “security demands must be balanced against human capacities and workflow in order to achieve functional benefits” (2018, p. 60); to build and support a cybersecurity culture, we need to find a way to engage our people that considers “the need of taking care of cognitive, emotional and social aspects” (Corradini, 2020, p. 60). This brings us back to our initial premise of embedding security into the way we work, and it is where opsec comes in.

3. Overview of operational security (opsec)

Opsec became an area of interest during the Vietnam War, when the U.S. military realised that the enemy was gaining knowledge of operations despite lacking technological or intelligence capability (Fruhlinger, 2019). The enemy was gaining operational knowledge by observing predictable behaviour and taking advantage of poor discipline in handling communications (National Counterintelligence And Security Center, 2023a). Following this realisation, opsec was defined and implemented (Fruhlinger, 2019). In 1988 it became a requirement for any U.S. government department or agency with a national security mission, and in 2021 the requirement to implement opsec was extended to all executive branch departments and agencies (Space Operations Command, 2023).

In parallel to military and security applications, opsec has matured into a proven security discipline, used more broadly to prevent the collection, analysis and exploitation of information (National Counterintelligence And Security Center, 2023a). It enables organisations to assess both technological and non-technological processes (Fortinet, 2021) and thereby improve their security posture (National Counterintelligence And Security Center, 2023a).

Its implementation involves five steps (NIST, 2020a), illustrated in the sketch after this list:

  1. identification of critical information
  2. analysis of threats
  3. analysis of vulnerabilities
  4. assessment of risks 
  5. application of appropriate countermeasures. 
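
To make the cycle concrete, the following is a minimal Python sketch of the five steps. The Exposure fields, the 0.0-1.0 likelihood/impact scales, the 0.3 risk threshold and the example exposures are this author's illustrative assumptions, not part of NIST's definition.

```python
from dataclasses import dataclass

@dataclass
class Exposure:
    critical_info: str   # step 1: the identified critical information
    threat: str          # step 2: who or what might exploit it
    vulnerability: str   # step 3: how it could be exposed
    likelihood: float    # step 4 input, on an assumed 0.0-1.0 scale
    impact: float        # step 4 input, on an assumed 0.0-1.0 scale

def risk_score(e: Exposure) -> float:
    """Step 4: a simple likelihood x impact risk score."""
    return e.likelihood * e.impact

def apply_countermeasures(exposures: list[Exposure], threshold: float = 0.3) -> list[Exposure]:
    """Step 5: flag exposures whose risk exceeds our (assumed) risk appetite."""
    return [e for e in exposures if risk_score(e) >= threshold]

exposures = [
    Exposure("staff rosters on a shared drive", "social engineer",
             "over-broad read access", likelihood=0.6, impact=0.7),
    Exposure("public inbox triage procedure", "adversary flooding the inbox",
             "no filtering of apparently legitimate enquiries",
             likelihood=0.4, impact=0.5),
]

for e in apply_countermeasures(exposures):
    print(f"Needs countermeasure: {e.critical_info} (risk {risk_score(e):.2f})")
```

The point of the sketch is that the output of steps 1-4 is an explicit, ranked picture of where critical information is exposed, so that step 5 can target countermeasures rather than apply blanket controls.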

Recalling the significant role of humans, and informed by the innate vulnerabilities outlined in Section 2, this risk-based process should help us mitigate those vulnerabilities. Opsec is embedded into business-as-usual activities on a day-to-day basis (Robins, 2017), which means it can help secure “the way we do things around here” (Lundy & Cowling, 1996, as cited in Corradini, 2020, p. 64) by tending to the human aspect of our socio-technical system without adding cognitive load. By balancing our treatment of the human and technological elements of our socio-technical system, it should increase overall cybersecurity, but it leaves the question: what are “appropriate countermeasures”? In the next section, we will explore the application of zero-trust principles in an attempt to inform an answer.

4. Overview of zero-trust

The prevalent perimeter approach to cybersecurity relies on creating a trust boundary between the untrusted outside world and the trusted internal network of an organisation. As we covered in our introduction presentation, once users are inside the perimeter, trust is often implied, permitting them, or adversaries, free movement within that perimeter. In an effort to prevent this movement, zero-trust focuses on “thorough and continuous verification” to create a secure environment on a network that is assumed to be breached (Sarkar et al., 2022, p. 9).

Let’s first review a brief history of zero-trust. Although “zero-trust” is a relatively recent phenomenon, viewed by some as a buzzword (Gill, 2023), it is a paradigm that has been developing since at least 2004:

  • 2004: the Jericho Forum introduced the concept of “de-perimeterization” (Open Group, 2007).
  • 2007: the Defense Information Systems Agency (DISA) published a software-defined perimeter called ‘black core’ (Lorenzin, 2022).
  • 2009: White made the case that security should not be treated as a fortress (White, 2009).
  • 2010: the term ‘zero-trust’ was coined by John Kindervag in Forrester Research’s ‘No More Chewy Centers: The Zero Trust Model Of Information Security’ (Sarkar et al., 2022, p. 13). Kindervag proposed verifiable identity, rather than network location, as “the core criterion for access” (Lorenzin, 2022).
  • 2020: the National Institute of Standards and Technology (NIST) published ‘SP 800-207 Zero-Trust Architecture’ as a framework for establishing zero-trust architecture (Lorenzin, 2022).

This has a range of implications from a technological perspective, and a number of frameworks and maturity models exist to aid organisations in its adoption, including NIST’s ‘Zero Trust Architecture’ (NIST, 2020c) and the Cybersecurity and Infrastructure Security Agency (CISA) ‘Zero Trust Maturity Model’ (CISA, 2022).

Zero-trust also represents a major cultural change for organisations (Department of Defense (USA), 2022), with non-technological and organisational considerations which may require changes to “business processes, architecture and operations” (Australian Signals Directorate, 2022), as well as a need to explicitly authenticate and authorise all users and workflows (NIST, 2020c, p. 5). The inclusion of cultural change and “all users and workflows” by definition brings non-technological processes into the scope of zero-trust adoption, confirming a potential application of its principles to opsec.
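
To illustrate what “thorough and continuous verification” of every user and workflow can look like, here is a minimal Python sketch of a per-request authorisation check. Everything in it (the AccessRequest fields, the entitlement store, the user and resource names) is a hypothetical illustration, not a description of any particular product or of CASA’s systems.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_id: str
    mfa_verified: bool      # identity verified for this session
    device_compliant: bool  # e.g., managed, patched device
    resource: str

# Hypothetical entitlement store: which user may reach which resource.
ENTITLEMENTS = {("astaff", "licensing-db")}

def authorise(req: AccessRequest) -> bool:
    """Evaluate each request on its own merits: network location confers
    no trust, because the network is assumed to be breached."""
    if not req.mfa_verified:      # explicitly verify identity, every request
        return False
    if not req.device_compliant:  # explicitly verify device posture, every request
        return False
    # Least privilege: access requires an explicit per-resource entitlement.
    return (req.user_id, req.resource) in ENTITLEMENTS

# Being "inside" the perimeter never suffices; every session re-runs the checks.
print(authorise(AccessRequest("astaff", True, True, "licensing-db")))   # True
print(authorise(AccessRequest("astaff", True, False, "licensing-db")))  # False: non-compliant device
```

The same pattern applies to non-technological workflows: each handover or approval is verified afresh rather than trusted because the previous step occurred.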

5. Discussion

The goal of this project has evolved into identifying how to build a human-centred cybersecurity culture which can embed cybersecurity in the way we do things without adding extraneous cognitive load to our people as they fulfil their roles. This was based on the premise that zero-trust principles applied to opsec could inform the way we do things so as to mitigate human error. A complete solution to the challenge of human factors in cybersecurity is beyond the scope of this project. However, the application of opsec to building a cybersecurity culture is promising, with its risk-based approach being embedded in day-to-day operations.

While assessing the compatibility of opsec and zero-trust principles, alignment was observed at two levels: at a high level, both are risk-based approaches with aligned implementation steps (see Appendix B); and at a more granular level, through an initial analysis of the application of zero-trust principles to opsec. This analysis, along with examples of implementable actions, is presented in Appendix C; while far from exhaustive, it demonstrates that zero-trust principles can be applied to opsec to inform our question from Section 3 of “what are ‘appropriate countermeasures’?”. However, applying zero-trust principles to opsec is not a panacea for human factors in cybersecurity: it lacks defence-in-depth and leaves areas such as the risk assessment largely uncovered. This author sees potential in complementing zero-trust’s focus on continuous verification with principles such as separation of duties and dual control, while tools such as cyber kill chains and MITRE ATT&CK could inform the threat, vulnerability and risk elements of opsec. It is anticipated that developing an approach to implementable operational security will require further effort, to be outlined in the upcoming strategy proposal, and that it may generate both tangible and intangible benefits.

Tangible benefits could span monetary, business continuity, productivity and workforce planning. Considering that many cyber crimes are unrecognised and unreported (Voce & Morgan, 2023, p. 9), estimating the costs of future cyber attacks is a somewhat arbitrary exercise. However, it can set a baseline against which to make decisions. We can estimate the cost to CASA of human-factors-driven data breaches at AU$353,104 over a six-month period. This was calculated using Equation 1, given:

  • AU$3.9M is the average cost of a public sector data breach (IBM, 2023, p. 13)
  • 36 notifiable data breaches with human factors causes were reported by the Federal Government in the last six-month reporting period (OAIC, 2024)
  • 350,300 Federal Government employees (Australian Bureau of Statistics, 2023)
  • 881 CASA staff (Transparency Portal, 2023).

Equation 1: Estimating CASA cost of data breaches caused by human error

cost = AU$3,900,000 × 36 × (881 ÷ 350,300) ≈ AU$353,104
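
As a quick check of the arithmetic: the estimate scales the federal breach count by CASA’s share of the federal workforce, then prices each expected breach at the sector average. A small Python sketch reproducing the figure:

```python
# Reproducing Equation 1 with the figures cited above.
avg_breach_cost_aud = 3_900_000   # average public sector breach cost (IBM, 2023)
human_factor_breaches = 36        # federal NDBs with human factors causes (OAIC, 2024)
federal_staff = 350_300           # Federal Government employees (ABS, 2023)
casa_staff = 881                  # CASA staff (Transparency Portal, 2023)

# Scale the federal breach count by CASA's share of the federal workforce,
# then price each expected breach at the sector average cost.
expected_casa_breaches = human_factor_breaches * casa_staff / federal_staff
estimated_cost = expected_casa_breaches * avg_breach_cost_aud

print(f"Expected CASA breaches per six months: {expected_casa_breaches:.3f}")  # ~0.091
print(f"Estimated six-month cost: AU${estimated_cost:,.0f}")                   # AU$353,104
```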

As well as being among the most costly, breaches caused by human factors are also among the longest to resolve (IBM, 2023, pp. 20–21). Mitigating them may therefore provide savings and improved business continuity, further enhanced by the risk assessment required for opsec, which identifies redundancies for use in the event of a disruption. Productivity improvements have been observed following the optimisation of socio-technical systems (Walker et al., 2007, p. 627), as well as from the reduction of human error and from the documentation and potential automation (Oberle, 2023, p. 589) which may result from implementing opsec. There could also be benefits for workforce planning in applying an aligned paradigm across all information security, whether technological or non-technological: a common language could improve working relationships, cross-team collaboration, flexibility in staff deployment and coordination during incident response.

Intangible benefits could include the development of a cybersecurity culture as well as strategic alignment and benefits. In creating a cybersecurity culture, CASA may become an exemplar to industry and other agencies. Good cyber-hygiene habits may also spill into staff’s private lives, reducing their chances of being scammed and contributing to a more secure society. Building a cybersecurity culture in a manner that minimises additional load on staff may contribute to mitigating two of CASA’s strategic risk areas outlined in our corporate plan (CASA, 2023a, p. 24):

  • CASA fails to meet work health and safety obligations
  • CASA is unable to prevent and respond to a cybersecurity event

PSPF Direction 002-2024 requires government entities to “conduct a technology asset stocktake on all internet-facing systems or services” (Dept Home Affairs, 2024, p. 1). Undertaking an opsec analysis would help ensure thorough adherence to this direction by identifying human-connected systems that were previously excluded from system audits.

Finally, we may also see broader strategic benefits: competitive advantage, which for government agencies can include innovation, reputation, organisational relationships and strategic assets (Matthews & Shulman, 2005); feedback about problems and about stakeholder risks; improved due diligence; reduced costs; and growing trust with partners and industry (Ezingeard et al., 2004; Zhang & Suhong, 2006, as cited in White, 2009).

6. Conclusion

During the presentations preceding this report, we questioned the concept of the ‘zero-trust culture’ called for in the Australian Cyber Security Strategy, explored the concept of organisational culture, recognised the prevalence of human factors in cybersecurity, introduced opsec, and finally proposed that applying zero-trust principles to opsec could deliver the ‘zero-trust culture’ which may mitigate human factors.

Following further investigation into the human factors of cybersecurity and organisational culture, this turned into a quest to identify how to build a human-centred cybersecurity culture. In the process, we established the importance of humans in cybersecurity and that there has been a lack of focus on this human element, potentially at the expense of overall cybersecurity. Opsec could be a positive step towards this culture because it is embedded into business-as-usual, securing “the way we do things around here”, which is a critical element of culture (Lundy & Cowling, 1996, as cited in Corradini, 2020, p. 64). Given that the scope of zero-trust includes non-technological considerations such as cultural change, business processes and operations, we see an alignment with opsec which could help make opsec implementable.

While the demonstrated alignment of zero-trust principles and opsec has yielded tangible implementation actions, this author believes further work is required to generate defence-in-depth from an opsec implementation. Areas of further investigation include:

  • Analysis of cybersecurity principles for their suitability to inform opsec controls
  • Use of tools such as cyber kill chains and MITRE ATT&CK to inform the threat, vulnerability and risk elements of opsec
  • Investigation of Huang & Zhu’s “Kill Chain of Cognitive Attacks” (2023, p. 12), verifying it against existing kill chains and extending it to include organisational factors which trigger human-factor-driven cyber incidents.

The need for a focus on the human element of cybersecurity was a recurring theme during research; anecdotally, however, the author noted that the proposed solutions tended to be applications of technology, such as AI, to ‘solve’ the human problem. The author believes there is a further research opportunity in the area of human-centred cybersecurity culture, covering its development, its implementation and its benefits, and that existing research across a number of fields could inform this work.

The strategy proposal to follow will outline how CASA can begin our journey to build a human-centred cybersecurity culture by implementing, for lack of a better phrase, cybersecurity-aligned opsec. This will support existing training, awareness and proven technological controls by mitigating the hitherto neglected innate human cognitive vulnerabilities. In addition to the benefits outlined in this report, doing so will align with our 2023-24 corporate plan by increasing confidence in the management of our strategic risk area of cybersecurity (CASA, 2023a, p. 24) and by increasing our contribution to standards for cybersecurity (CASA, 2023a, p. 23).

References

Allhoff, F., & Henschke, A. (2018). The Internet of Things: Foundational ethical issues. Internet of Things, 1–2, 55–66. https://doi.org/10.1016/j.iot.2018.08.005

Alvero, K. (2021, August 27). What Role do Humans Play in Ensuring Cybersecurity? ISACA. https://www.isaca.org/resources/isaca-journal/issues/2021/volume-5/what-role-do-humans-play-in-ensuring-cybersecurity

Annaswamy, A. M., Khargonekar, P. P., Lamnabhi-Lagarrigue, F., & Spurgeon, S. K. (2023). Cyber-Physical-Human systems: Fundamentals and applications. John Wiley & Sons.

Annaswamy, A. M., & Yildiz, Y. (2021). Cyber-Physical-Human systems. In Encyclopedia of Systems and Control (pp. 497–508). Springer International Publishing. http://dx.doi.org/10.1007/978-3-030-44184-5_100113

Australian Bureau of Statistics. (2023, November 9). Public sector employment and earnings, 2022-23 financial year. Australian Bureau of Statistics. https://www.abs.gov.au/statistics/labour/employment-and-unemployment/public-sector-employment-and-earnings/latest-release

Australian Signals Directorate. (2022, July 29). Gateway security guidance package: Gateway technology guides. Cyber.Gov.Au. https://www.cyber.gov.au/resources-business-and-government/maintaining-devices-and-systems/system-hardening-and-administration/gateway-hardening/gateway-security-guidance-package-gateway-technology-guides

Brandt, J. (2021, November 1). Operational security: A business imperative. ISACA. https://www.isaca.org/resources/news-and-trends/industry-news/2021/operational-security-a-business-imperative

CASA. (2013). Safety behaviours: Human factors engineers resource guide. CASA. https://www.casa.gov.au/sites/default/files/2021-06/safety-behaviours-human-factors-engineers-resource-guide.pdf

CASA. (2022, March 23). Work health and safety policy statement. Civil Aviation Safety Authority. https://www.casa.gov.au/about-us/reporting-and-accountability/work-health-and-safety-policy-statement

CASA. (2023a). Corporate plan 2023–24. https://www.casa.gov.au/sites/default/files/2023-07/casa-corporate-plan-2023-24.pdf

CASA. (2023b, November 30). Australian Public Service Census 2023. Civil Aviation Safety Authority. https://www.casa.gov.au/resources-and-education/publications-and-resources/corporate-publications/australian-public-service-census#CASACensusActionPlan2023

CISA. (2022, January 19). Zero trust maturity model. Cybersecurity and Infrastructure Security Agency CISA. https://www.cisa.gov/zero-trust-maturity-model

Corradini, I. (2020). Building a cybersecurity culture in organizations. Springer International Publishing. http://dx.doi.org/10.1007/978-3-030-43999-6

Couretas, J. M. (2018). An introduction to cyber modeling and simulation. John Wiley & Sons.

Danet, D. (2023). Cognitive Security: Facing Cognitive Operations in Hybrid Warfare. 22nd European Conference on Cyber Warfare and Security.

Davis, M. C., Challenger, R., Jayewardene, D. N. W., & Clegg, C. W. (2014). Advancing socio-technical systems thinking: A call for bravery. Applied Ergonomics, 45(2), 171–180. https://doi.org/10.1016/j.apergo.2013.02.009

Defence Science and Technology Group. (2020, April 29). Information warfare. DST. https://www.dst.defence.gov.au/strategy/star-shots/information-warfare

Department of Defense (USA). (2022, October 21). DoD Zero Trust Strategy. https://dodcio.defense.gov/Portals/0/Documents/Library/DoD-ZTStrategy.pdf

Department of Parliamentary Services. (2024, July). Department of Parliamentary Services hiring Director, Cyber Intelligence and Assurance in Canberra, Australian Capital Territory, Australia. LinkedIn. https://au.linkedin.com/jobs/view/director-cyber-intelligence-and-assurance-at-department-of-parliamentary-services-3972825768

Dept Home Affairs. (2023a). 2023-2030 Australian cyber security strategy discussion paper. https://www.homeaffairs.gov.au/reports-and-pubs/files/2023-2030_australian_cyber_security_strategy_discussion_paper.pdf

Dept Home Affairs. (2023b). 2023-2030 Australian cyber security strategy. https://www.homeaffairs.gov.au/cyber-security-subsite/files/2023-cyber-security-strategy.pdf

Dept Home Affairs. (2023c). 2023–2030 Australian Cyber Security Strategy ACTION PLAN. https://www.homeaffairs.gov.au/cyber-security-subsite/files/2023-cyber-security-strategy-action-plan.pdf

Dept Home Affairs. (2024, July 5). PSPF Direction 002-2024. Protective Security. https://www.protectivesecurity.gov.au/system/files/2024-07/PSPF%20Direction%20002-2024.pdf

Dudley-Nicholson, J. (2023, November 29). Criminals target government with record cyber attacks. The Mandarin. https://www.themandarin.com.au/235644-criminals-target-government-with-record-cyber-attacks/

Fantini, P., Pinzone, M., & Taisch, M. (2020). Placing the operator at the centre of Industry 4.0 design: Modelling and assessing human activities within cyber-physical systems. Computers & Industrial Engineering, 139, 105058. https://doi.org/10.1016/j.cie.2018.01.025

Fortinet. (2021, February 17). What is OPSEC in cybersecurity? Fortinet. https://www.fortinet.com/resources/cyberglossary/operational-security

Fruhlinger, J. (2019, May 8). What is OPSEC? How operations security protects critical information. CSO Online. https://www.csoonline.com/article/567199/what-is-opsec-a-process-for-protecting-critical-information.html

Gill, J. (2023, April 24). Zero Trust is the Pentagon’s new cyber buzzword. It might not have stopped the Discord leaks. Breaking Defense. https://breakingdefense.com/2023/04/zero-trust-is-the-pentagons-new-cyber-buzzword-it-might-not-have-stopped-the-discord-leaks/

Hally, L. (2024a). Building a zero-trust culture. https://youtu.be/irYuBctWZ9A

Hally, L. (2024b). Human element of cybersecurity. https://www.youtube.com/watch?v=dFpzVdDkEZo

Hancock, J. (2022). Psychology of human error 2022. Tessian. https://f.hubspotusercontent20.net/hubfs/1670277/%5BCollateral%5D%20Tessian-Research-Reports/%5BTessian%20Research%5D%20Psychology%20of%20Human%20Error%202022.pdf?__hstc=170273983.4b8f21278d1f55eb244857b20a902ab7.1720433379581.1720433379581.1720433379581.1&__hssc=170273983.1.1720433379581&__hsfp=1336845441&hsCtaTracking=8cc3440e-eb09-43bc-962f-51403868c8e0%7C37fefaef-476b-4bea-b78b-6a469a28172a

Huang, L., & Zhu, Q. (2023). Cognitive security: A system-scientific approach. Springer Nature.

IBM. (2023). Cost of a Data Breach Report 2023. IBM. https://www.ibm.com/downloads/cas/E3G5JMBP

Klauzner, I., & Pisani, A. (2023). Trends in and characteristics of cybercrime in NSW. In Crime and justice statistics bureau brief. https://www.bocsar.nsw.gov.au/Publications/BB/BB165-Report-Cybercrime-in-NSW.pdf

Lorenzin, L. (2022, April 20). A brief(er) history of zero trust: Major milestones in rethinking enterprise security. CXO Revolutionaries. https://www.zscaler.com/cxorevolutionaries/insights/briefer-history-zero-trust-major-milestones-rethinking-enterprise-security

Maguire, D. (2024, July 11). Scammers are using our scam awareness against us — so watch out for unsolicited offers of help. ABC News. https://www.abc.net.au/news/2024-07-12/scammers-impersonating-scamwatch/104052282

Malatji, M., Von Solms, S., & Marnewick, A. (2019). Socio-technical systems cybersecurity framework. Information & Computer Security, 27(2), 233–272. https://doi.org/10.1108/ics-03-2018-0031

Matthews, J., & Shulman, A. D. (2005). Competitive advantage in public-sector organizations: Explaining the public good/sustainable competitive advantage paradox. Journal of Business Research, 58(2), 232–240. https://doi.org/10.1016/s0148-2963(02)00498-8

Microsoft. (2023). Zero trust model – Modern security architecture. Microsoft Security. https://www.microsoft.com/en-au/security/business/zero-trust

National Counterintelligence And Security Center. (2023a). Enterprise Risk Mitigation Blueprint for Non-Intelligence Agencies. https://www.dni.gov/files/NCSC/documents/products/Risk_Mitigation_Web_2023.pdf

National Counterintelligence And Security Center. (2023b). Understanding OPSEC. https://www.dni.gov/files/NCSC/documents/nittf/Understanding_OPSEC_Bulletin_1.pdf

NIST. (2016, May 24). Cybersecurity framework. CSRC. https://csrc.nist.gov/Projects/cybersecurity-framework/Filters#/csf/filters

NIST. (2017). Framework for cyber-physical systems: Volume 1, overview. National Institute of Standards and Technology. http://dx.doi.org/10.6028/nist.sp.1500-201

NIST. (2020a, April 20). operations security (OPSEC) – Glossary. CSRC. https://csrc.nist.gov/glossary/term/operations_security

NIST. (2020b, August 10). Zero trust architecture. NIST. https://www.nist.gov/publications/zero-trust-architecture

NIST. (2020c). Zero trust architecture. National Institute of Standards and Technology. https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-207.pdf

NIST. (2024). The NIST cybersecurity framework (CSF) 2.0. National Institute of Standards and Technology. http://dx.doi.org/10.6028/nist.cswp.29

OAIC. (2024, February 22). Notifiable data breaches report: July to December 2023. OAIC. https://www.oaic.gov.au/privacy/notifiable-data-breaches/notifiable-data-breaches-publications/notifiable-data-breaches-report-july-to-december-2023

Oberle, L. J. (2023). How to build responsive service processes in German banks: The role of process documentation and the myth of automation. Business Process Management Journal, 29(2), 578–596. https://doi.org/10.1108/bpmj-11-2022-0573

Office Of The Director Of National Intelligence. (2021). National Operations Security Program. https://www.dni.gov/files/NCSC/documents/nittf/NCSC_Memo_National_Security_Operations_Security.pdf

Open Group. (2007). Jericho Forum™ Commandments.

Prabhu, S., & Thompson, N. (2021). A primer on insider threats in cybersecurity. Information Security Journal: A Global Perspective, 31(5), 602–611. https://doi.org/10.1080/19393555.2021.1971802

Robins, C. (2017, July 25). Operations security. Marine Corps Installations East. https://www.mcieast.marines.mil/opsec/

Sarkar, S., Choudhary, G., Shandilya, S. K., Hussain, A., & Kim, H. (2022). Security of zero trust networks in cloud computing: A comparative review. Sustainability, 14(18), 11213. https://doi.org/10.3390/su141811213

Space Operations Command. (2023, January 5). OPSEC history: From ancient origins to modern challenges. Space Operations Command (SpOC). https://www.spoc.spaceforce.mil/News/Article-Display/Article/3260002/opsec-history-from-ancient-origins-to-modern-challenges

State Services Authority. (2013). Organisational culture. https://vpsc.vic.gov.au/wp-content/uploads/2015/03/Organisational-Culture_Web.pdf

The Hoover Institution. (2020). Global engagement: Rethinking risk in the research enterprise. https://www.hoover.org/sites/default/files/research/docs/tiffert_globalengagement_full_0818.pdf

Transparency Portal. (2023). Civil Aviation Safety Authority Annual Report 2022-23. Transparency Portal. https://www.transparency.gov.au/publications/infrastructure-transport-cities-and-regional-development/civil-aviation-safety-authority/civil-aviation-safety-authority-annual-report-2022-23/part-4%3A-people/people-management

UNSW. (2022, April 8). 1.2 The information environment. Cyber Operations. https://moodle.telt.unsw.edu.au/mod/page/view.php?id=4523357

U.S. Government Accountability Office. (2022). GAO-22-104714, information environment: Opportunities and threats to DOD’s national security mission. https://www.gao.gov/assets/gao-22-104714.pdf

Voce, I., & Morgan, A. (2023). Cybercrime in Australia 2023. Australian Institute of Criminology. http://dx.doi.org/10.52922/sr77031

Walker, G. H., Stanton, N. A., Jenkins, D., Salmon, P., Young, M., & Aujla, A. (2007). Sociotechnical theory and NEC system design. In Engineering Psychology and Cognitive Ergonomics (pp. 619–628). Springer Berlin Heidelberg. http://dx.doi.org/10.1007/978-3-540-73331-7_68

White, G. L. (2024). Security literacy model for strategic, tactical, & operational management levels. Information Security Journal: A Global Perspective, 1–9. https://doi.org/10.1080/19393555.2024.2307632

Appendix A: Alignment of CPHS definition

This table maps the National Institute of Standards and Technology (NIST) Cyber-Physical System (CPS) definition (NIST, 2017, p. 4) to CASA, as a demonstration of the appropriateness of CPHS for assessing the human element at CASA.

| CPS definition element | CASA alignment |
| --- | --- |
| Computation | As per CPS definition |
| Communication | As per CPS definition |
| Sensing | As we move towards a zero-trust environment, our devices will act as sensors for our authentication, sensing attributes such as location, device health, biometrics and behaviours (CISA, 2022). |
| Actuation | While not a physical actuator, the outputs of our operations include authorisations and instruments which actuate events in the physical world. |
| Physical systems | This can include printers or displays which could be used to breach confidentiality. Extending on our ‘actuation’, it could also include aircraft and other physical systems we approve for operation in industry. |
| Fulfil time-sensitive functions | Our operations run to schedules and service level agreements. |
| Varying degrees of interaction with the environment, including human interaction | As per ‘sensing’, ‘actuation’ and ‘physical systems’. We also have humans involved throughout process life cycles. |

Appendix B: Mapping zero-trust and opsec implementation steps

| opsec (NIST, 2020a) | Zero-trust (NIST, 2020c, p. 37) |
| --- | --- |
| Identification of critical information | Assessment (system inventory, user inventory, business process review) |
| Analysis of threats | Risk assessment and policy development |
| Analysis of vulnerabilities | Risk assessment and policy development |
| Assessment of risks | Risk assessment and policy development |
| Application of appropriate countermeasures | Deployment |

Appendix C: Mapping zero-trust principles to opsec

Initial mapping of opsec steps to NIST zero-trust tenets, with example implementable actions. The actual actions will be influenced by the use of technology to achieve the desired outcome, which may be informed by maturity models such as the Global Engagement Maturity Model (The Hoover Institution, 2020, p. 130). This will be elaborated in the upcoming strategy proposal.

| Row | opsec (NIST, 2020a) | Applicable zero-trust tenet (NIST, 2020c, pp. 6-7) | Example implementable action |
| --- | --- | --- | --- |
| 1 | Identification of critical information | Resources include all data sources, services and devices. | Leverage the data inventory work using ‘Valued Data Item’ classification to identify critical information; note that resources will also include any workflows, processes, devices or systems which the Valued Data Item is used in or by. Document the people with access requirements, detailing when, privilege level and purpose, informed and verified by the data custodian/steward. |
| 2-4 | Analysis of threats; analysis of vulnerabilities; assessment of risks | These steps relate to cybersecurity risk assessment. While zero-trust principles do not speak directly to this, risk management frameworks are compatible with zero-trust (NIST, 2020c, p. 32). | |
| 5 | Application of appropriate countermeasures | Resource access is granted on a per-session, per-resource and least-privilege basis. | People are granted just enough privilege to complete the task. Once the task is completed, their access is revoked. This is verified after granting/revoking (see the sketch after this table). |
| 6 | Application of appropriate countermeasures | Dynamic policy determines resource access, using identity, application/service and device data points, and potentially other behavioural and environmental attributes. | Leverage ITB zero-trust project deliverables for this. |
| 7 | Application of appropriate countermeasures | The integrity and security posture of all assets are measured. | Document/log assets including their value, controls, asset change history, lifecycle audit trail and risk assessment. |
| 8 | Application of appropriate countermeasures | All communication is secured regardless of whether it is internal or external. | Use encrypted channels for digital communication; leverage the information from row 1 to ensure communications are to the correct person; enable delivery receipts or require recipients to confirm receipt of communications. |
| 9 | Application of appropriate countermeasures | Before resource access is granted, dynamic authentication and authorisation are strictly enforced. | Leverage the information from row 1 to determine asset access requirements and eligibility. |
| 10 | Application of appropriate countermeasures | As much information as possible is collected about assets, network infrastructure and communications to inform security improvements. | Review the logs from row 7 to inform security improvements. |
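
As a companion to row 5, the following is a minimal Python sketch of per-session, per-resource, least-privilege access with verified revocation. The in-memory session store, token format, expiry window and names are illustrative assumptions; a real implementation would sit behind the organisation’s identity provider and policy engine.

```python
import secrets
from datetime import datetime, timedelta, timezone

# Hypothetical in-memory session store, standing in for a real policy engine.
_sessions: dict = {}

def grant(user: str, resource: str, minutes: int = 30) -> str:
    """Grant per-session, per-resource access with a short expiry (least privilege)."""
    token = secrets.token_urlsafe(16)
    _sessions[token] = {
        "user": user,
        "resource": resource,
        "expires": datetime.now(timezone.utc) + timedelta(minutes=minutes),
    }
    return token

def is_active(token: str, resource: str) -> bool:
    """Verify the token grants access to this resource and has not expired."""
    s = _sessions.get(token)
    return bool(s and s["resource"] == resource
                and datetime.now(timezone.utc) < s["expires"])

def revoke(token: str) -> None:
    """Revoke on task completion, then verify the revocation took effect."""
    _sessions.pop(token, None)
    assert token not in _sessions, "revocation must be verified, not assumed"

token = grant("astaff", "valued-data-item")
print(is_active(token, "valued-data-item"))  # True while the task runs
revoke(token)
print(is_active(token, "valued-data-item"))  # False once the task is complete
```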
