Cybersecurity Culture: A Definition and Maturity Model For Critical Infrastructure

“If you get the culture right, most of the other stuff will just take care of itself.” – Tony Hsieh, CEO of Zappos

Abstract

As the world becomes more interconnected, cybersecurity continues to be a growing concern in critical infrastructure environments. Neither policies and procedures nor technical solutions can consider all possible scenarios; however, an environment where cybersecurity is considered the norm and all users are engaged in protecting systems may be able to mitigate these limitations. Organizational culture has been proposed as a means by which desired behavioural outcomes can be achieved. We investigate current applications of Schein’s (1985) three-part model to information security culture and safety culture, and propose a new definition of a culture of cybersecurity. Leveraging these insights and drawing on the concept of maturity models, we investigate whether it is possible to draft a cybersecurity culture maturity model. If successful, this would represent a contribution in the field of cybersecurity, as it could draw together multiple concepts from information security, safety culture, and management literature related to organizational culture, in order to address an issue relevant to top management teams and those in governance positions. It would also be of interest to national policymakers as it highlights some of the inconsistencies with the current practices regarding cybersecurity in critical infrastructure.

Introduction

Culture has been defined as a “system of shared values (defining what is important) and norms (defining appropriate attitudes and behaviors)” (Chatman & Cha, 2003: 21). Adapted from anthropology for organization management research, the concept of organizational culture has been identified as a way to encourage desired behaviours within an organization (Chang & Lin, 2007). The most well-known and simplest definition of organizational culture is “the way things are done here” (Lundy & Cowling, 1996: 168); however, the concept of organizational culture is more complex and potentially more powerful when viewed in terms of its impact on behaviour within an organization. The trend of prioritizing cybersecurity is no exception: there have been recent calls to create a culture of cybersecurity (Muegge & Craigen, 2015). It is believed that by generating a culture of cybersecurity, “groups and individuals could practice safe computing and would expect others to do so. IT systems would be promptly patched, and secure best practices would be the norm” (Muegge & Craigen, 2015: 13). To determine whether or not this is realistic, it is worth examining areas where similar calls to action regarding developing a “culture of” have occurred, including safety and information security.

Information Security Culture (ISC) has been a topic of research since the turn of this century, whereby researchers have put forward a litany of definitions of ISC, theories on the roots of ISC, and frameworks for cultivating ISC. However, Karlsson et al. (2015) identify that there has been minimal research into the fruits of ISC. By contrast, safety culture has been studied since the early 1950s (Keenan et al., 1951); the concept is comparatively mature and extensively researched. Although empirical research on the topic has progressed, theory has remained relatively stagnant for the past 20 years (Guldenmund, 2000). Despite this stagnation, a number of Safety Culture Maturity Models have emerged and have been applied to safety culture development within a number of “high hazard” industries such as the aviation, rail, and petrochemical industries (Foster & Hoult, 2013). While Cybersecurity Capability Maturity Models have been successful in improving the “maturity and efficacy of controls employed to secure critical infrastructures” (Miron & Muita, 2014: 34), an equivalent maturity model for cybersecurity culture has not been developed – and it is uncertain whether the development of such a solution would be beneficial.

In this article, we investigate whether a cybersecurity culture maturity model would provide managers with a tool to assess the cybersecurity culture within their organization. If successful, this would represent a contribution in the field of cybersecurity, as it could draw together multiple concepts from information security, safety culture, and management literature related to organizational culture, in order to address an issue relevant to top management teams and those in governance positions. It would also be of interest to national policymakers as it highlights some of the inconsistencies with the current practices regarding cybersecurity in critical infrastructure.

This document is organized as follows. In the first section, we provide a literature review outlining recent relevant literature associated with culture, organizational culture, information security culture, safety culture, and maturity models. In the second section, we provide our preferred construct for the term cybersecurity and present a new definition for cybersecurity culture. In the third section, we relate our definition to the original definition of culture provided by Schein (1985) (espoused values, shared tacit assumptions, and artifacts) and the extension provided by Van Niekerk and Von Solms (2010) (knowledge). In the fourth section, we propose a five-step cybersecurity maturity model, discussing the strengths and weaknesses of this approach. In the final sections, we present our conclusions as well as areas for further research.

Literature Review

Organizational Culture

Culture is traditionally defined as consisting of “the abstract values, beliefs, and perceptions of the world that lie behind people’s behavior and that are reflected by their behaviour” (Haviland et al., 2009: 33). The concept of culture has subsequently been adopted by management literature under the label of organizational culture (or “corporate culture”), wherein culture is believed to influence the behaviour of employees and contribute to the effectiveness of an organization (Thomson & von Solms, 2006). Acceptable and unacceptable conduct within the context of corporate culture is determined through policies and procedures (Whitman & Mattord, 2003:194). Schein’s model (1985) has considerable influence in this domain, whereby organizational culture is defined as: “a pattern of shared tacit assumptions that was learned by a group as it solved its problems of external adaptation and internal integration, that has worked well enough to be considered valid and, therefore, to be taught to new members as the correct way to perceive, think, and feel in relation to those problems” (1985: 27).  Schein (1985) describes culture as existing in three levels: artifacts, espoused values and shared tacit assumptions. In later papers he defined these as:

the level of deep tacit assumptions that are the essence of the culture, the level of espoused values that often reflect what a group wishes ideally to be and the way it wants to present itself publicly, and the day-to-day behavior that represents a complex compromise among the espoused values, the deeper assumptions, and the immediate requirements of the situation (Schein, 1996: 9).

It is widely believed that organizational culture consists of various subcultures that influence the overall culture, such as those of professional groups (Ramachandran, Rao, Goles & Dhillon, 2013) as well as national cultures (Helmreich & Merritt, 2001).

Safety Culture

Safety culture and safety climate constructs have been extensively discussed in academic literature; the earliest located paper on safety climate is Keenan et al. (1951). More recently, Guldenmund (2000) conducted a comprehensive review and identified 17 distinct definitions of safety culture (and the similar concept of safety climate), the majority of which are “global and therefore implicit” (2000: 229). In 1991, the report on safety culture by the International Nuclear Safety Advisory Group indicated that safety culture consists of two elements: (1) a necessary framework within an organisation, and (2) the attitude of staff at all levels in responding to and benefitting from the framework (INSAG, 1991). The 1993 Advisory Committee on the Safety of Nuclear Installations (ACSNI) definition (subsequently adopted by Lee in 1996) is the most explicit, defining safety culture as:

the product of individual and group values, attitudes, perceptions, competencies, and patterns of behaviour that determine the commitment to, and the style and proficiency of, an organisation’s health and safety management.

Multiple authors call for an integrative framework that takes into account the organizational culture as a whole (Gordon et al., 2007; Guldenmund, 2000).

Information Security Culture

Information Security Culture (ISC) has been a topic of research since the turn of this century, whereby researchers have put forward various definitions of ISC, theories on the roots of ISC, and frameworks for cultivating ISC (Karlsson et al., 2015). Van Niekerk & Von Solms (2010) adapted Schein’s 1985 three-level model of culture and added a fourth level – information security knowledge – which supports the other three levels. ISC is defined as:

the attitudes, assumptions, beliefs, values and knowledge that employees/stakeholders use to interact with the organisation’s systems and procedures at any point in time. The interaction results in acceptable or unacceptable behaviour (i.e. incidents) evident in artifacts and creations that become part of the way things are done in the organisation to protect its information assets. This information security culture changes over time. (Da Veiga & Eloff, 2010: 198)

Karlsson et al. (2015) provide an excellent summary of the state-of-the-art in relation to information security culture research and point out that minimal research has demonstrated the fruits of Information Security Culture. The frameworks that exist for measuring ISC, of which Da Veiga and Eloff (2010) is the most comprehensive and extensively cited, are often unnecessarily complex and have not been widely adopted by industry. 

Summary

Organizational culture is complex, with multiple variables influencing the overall culture of an organization (profession, nationality, etc.). The concept of organizational culture has been adopted to help develop desirable traits, such as an information security culture or a safety culture. In more recent literature, there has been a call for a culture of cybersecurity (Muegge & Craigen, 2015). There is minimal literature in relation to a culture of cybersecurity; the nearest parallels can instead be drawn from information security culture and safety culture. In the following section, we seek to generate a definition for a culture of cybersecurity and explore whether this can be converted into an employable management tool.

Cybersecurity Culture – Background & Definition

Defining Cybersecurity Culture

The terms information security and cybersecurity are often used interchangeably; however, Von Solms & Van Niekerk (2013) suggest they are not analogous: some cybersecurity threats do not form part of the formally defined scope of information security. Whitman & Mattord (2009) define information security as “the protection of information and its critical elements, including the systems and hardware that use, store, and transmit that information” (Whitman & Mattord, 2009: 8). The international standard, ISO/IEC 27002 (2005), defines information security as “the preservation of the confidentiality, integrity and availability of information”, whereby information can take many forms: printed or written on paper, stored electronically, transmitted by post or electronic means, shown on films, conveyed in conversation, and so forth (2005: 1). In settings where there are historical information archives that have not yet been digitized, this includes archival information. By contrast, cybersecurity has several “highly variable, often subjective, and at times, uninformative” definitions currently in circulation (Craigen et al., 2014: 13). Our preferred definition of cybersecurity, the construct upon which a cybersecurity culture can be developed and defined, is that of the International Telecommunication Union (ITU):

Cybersecurity is the collection of tools, policies, security concepts, security safeguards, guidelines, risk management approaches, actions, training, best practices, assurance and technologies that can be used to protect the cyber environment and organization and user’s assets. Organization and user’s assets include connected computing devices, personnel, infrastructure, applications, services, telecommunications systems, and the totality of transmitted and/or stored information in the cyber environment. Cybersecurity strives to ensure the attainment and maintenance of the security properties of the organization and user’s assets against relevant security risks in the cyber environment. The general security objectives comprise the following: Availability, Integrity, which may include authenticity and non-repudiation, and Confidentiality (ITU, 2008). 

Next, it is important to understand pre-existing definitions regarding organizational culture. The original adaptation of Schein’s (1985) concept of organizational (corporate) culture by Van Niekerk & Von Solms (2010) is as follows:


Figure 1: Levels of information security culture (Van Niekerk & Von Solms, 2010, adapted from Schein, 1999).

For this paper, we believe that an adaptation of the definition of safety culture that aligns with both Schein (1985) and Van Niekerk & Von Solms (2010) is most appropriate. These source definitions can be summarized as follows:

  • Schein (1996), organizational culture: “A pattern of shared tacit assumptions that was learned by a group as it solved its problems of external adaptation and internal integration, which has worked well enough to be considered valid and, therefore, to be taught to new members as the correct way to perceive, think, and feel in relation to those problems.”
  • ACSNI (1993); Lee (1996), safety culture: “The product of individual and group values, attitudes, perceptions, competencies, and patterns of behaviour that determine the commitment to, and the style and proficiency of, an organisation’s health and safety management.”
  • Da Veiga & Eloff (2010), information security culture: “The attitudes, assumptions, beliefs, values and knowledge that employees/stakeholders use to interact with the organisation’s systems and procedures at any point in time. The interaction results in acceptable or unacceptable behaviour (i.e. incidents) evident in artifacts and creations that become part of the way things are done in the organisation to protect its information assets. This information security culture changes over time.”


We define cybersecurity culture as follows:

The attitudes, assumptions, beliefs, values and knowledge of individuals and groups that determine the commitment to, and proficiency of, an organization’s cybersecurity management at any point in time.

This definition includes the three-part definition of culture provided by Schein (1985), is sufficiently broad to encapsulate the additional fourth part (knowledge) provided by Van Niekerk & Von Solms (2010), and draws on the Lee (1996) definition of safety culture.

Okere et al. (2012) summarize and explain the levels of culture (artifacts, espoused values, tacit assumptions, and knowledge) in relation to information security; they can be adapted to cybersecurity as follows:

  • Artifacts: While this level is visible, it is difficult to interpret without additional insight from “insiders” within the organization. Examples include: cybersecurity handbooks, awareness courses, policies, choice of technology, behavior patterns, and language usage.
  • Espoused values: This layer is partially visible and unspoken, representing the purpose given by insiders of an organization regarding the observed artifacts. Examples include: documents that describe the values, principles and vision of the organization.
  • Shared tacit assumptions: If the organization continues to be successful, the beliefs and values become shared and taken for granted and form the core of the organization’s culture. No examples are provided, as by definition tacit assumptions are not outwardly expressed.
  • Knowledge: Contributing to the above levels, knowledge is believed to help in ensuring cybersecurity compliance. Examples of knowledge include: competence, knowledge sharing practices, access to internal and external resources regarding cybersecurity, learnings from previous cybersecurity events.
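As a rough illustration, the three observable levels above can be treated as a checklist of indicators for an assessment exercise. The sketch below is ours, not Okere et al.'s, and the indicator strings are hypothetical examples:

```python
# Hypothetical checklist sketch: the indicator strings are illustrative
# examples only, not items taken from Okere et al. (2012).
CULTURE_LEVELS = {
    "artifacts": ["cybersecurity handbook exists", "awareness course offered"],
    "espoused values": ["security named in vision/values documents"],
    "shared tacit assumptions": [],  # by definition, not outwardly expressed
    "knowledge": ["learnings recorded from previous cybersecurity events"],
}

def observable_indicators(levels):
    """Flatten the observable (non-tacit) indicators into a single checklist."""
    return [item for indicators in levels.values() for item in indicators]

for item in observable_indicators(CULTURE_LEVELS):
    print("[ ]", item)
```

Note that the shared tacit assumptions level contributes no checklist items: it can only be inferred from the others, which is precisely why culture assessment is harder than control auditing.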

It is against this definition that we endeavored to determine whether the development of a cybersecurity culture maturity model for use within and across critical infrastructure sectors is possible.

Developing a Cybersecurity Culture Maturity Model

The objective of this paper is to explore whether a cybersecurity culture maturity model appropriate for critical infrastructure is possible. In this section, we explore the concept of cybersecurity capability maturity models, safety culture maturity models, and the potential for an amalgamation of the two in order to encapsulate culture within existing capability models for cybersecurity. In particular, we focus on safety culture literature because the “increasing pervasiveness of cyber-based systems and infrastructure, and society’s growing reliance on them, shifts the perspective concerning their proper operation from security to safety” (Douba et al., 2014: 41). Many researchers contend that security properties are included in safety properties (Leveson, 2013; Young & Leveson, 2013). Safety is often associated with minimizing unintended disruption, and security with minimizing intended disruption; both concepts affect the proper operation of cyber-based systems and infrastructure (Douba et al., 2014: 41). From this research, we intend to integrate the key elements of the selected safety culture maturity model(s) with cybersecurity requirements.

Cybersecurity Capability Maturity Models

The application of the Capability Maturity Model in relation to software was first published in 1993 as a means by which organizations could measure the differences between immature and mature organizations (Paulk, Curtis, Chrissis, & Weber, 1993); this concept has since been adopted by a multiplicity of domains that seek to have a logical progression monitored and measured. In cybersecurity research, Cybersecurity Capability Maturity Models (CCMMs) are plentiful, providing well-defined criteria against which managers can measure the maturity of their preparedness against cyber-threats (Miron, 2015; Miron & Muita, 2014); however, navigating the many standards is complicated, and has both time and cost implications (Miron & Muita, 2014).

While well-defined criteria exist against which managers can measure the maturity of their cybersecurity capabilities, frameworks to help evaluate information security and/or cybersecurity culture are few and difficult to navigate. No maturity models for measuring the maturity of an organization’s cybersecurity culture were available in academic, peer-reviewed literature at the time of writing (2016), perhaps because of the normative assumptions underlying a maturity model geared towards assessing culture.

Safety Culture Maturity Models

A number of Safety Culture Maturity Models (referred to herein as SCMMs) have emerged in recent decades within the critical infrastructure space (Foster & Hoult, 2013). SCMMs originated from a mixture of organizational development literature and the capability maturity models used in the software industry, in efforts to help organizations assess their existing safety cultures. Theoretically, SCMMs identify which actions must be taken by an organization in order to improve its culture (Filho et al., 2010). In this context, it is important to note that “maturity” refers specifically to the human elements and behaviours within the culture of an organization, not the maturity of the safety management systems themselves (Foster & Hoult, 2013). There are four well-cited Safety Culture Maturity Models found in the literature: Westrum (1993; 2004), Reason (1993), Fleming (2000), and Hudson (2001) / Parker, Lawrie, Hudson (2006). Each of these models demonstrates the importance of a “strong relationship between the culture of an organization and the development of a systems approach” (Foster & Hoult, 2013: 63). However, “no available data indicates that all organisations follow a sequential maturation and also that the use of averages to determine the level of maturity is appropriate” (Filho et al., 2010: 616). There are concerns that a maturity model for evaluating culture may be normative and may distract from the overall purpose of developing an effective organizational culture (which, by extension, would ensure safety and cybersecurity). As such, the development of a “culture of cybersecurity” and a “cybersecurity capability maturity model” should be viewed as secondary to the overall development of an effective organization that prioritizes both safety and cybersecurity as part of its risk management strategy.

Cybersecurity Culture Maturity Model

If a cybersecurity culture maturity model appropriate for use by top management teams were developed, it would be sensible to begin by reviewing pre-existing safety culture maturity models and to proceed by adding insights generated through a literature review and interviews with industry experts. As we attempted this process, we began by building on the model proposed by Reason (1997) in order to show the progressive nature of building a cybersecurity culture: Pathological, Reactive, Bureaucratic, Proactive, and Generative. These five levels were renamed to increase “the accessibility of the framework to industry employees by including terms they would be familiar with” (Parker et al., 2006: 555) as follows: Vulnerable, Reactive, Compliant, Proactive, and Resilient. The five levels align with the Minerals Industry Risk Management (MIRM) Maturity Chart, a maturity model published by a team from the University of Queensland (2008) based on Hudson (2001). We adopted and modified the characterizations provided by Parker et al. (2006) to describe each level in terms of cybersecurity:

Figure 2: Cybersecurity culture maturity model levels (Lance and Bacic, 2016, based on Reason, 1997; Parker et al., 2006)

It is important to note that the progression assumes there is a desired culture against which progress can be measured (Resilient). Culture is significantly more complex; however, a starting point from which managers can begin to take proactive steps may be of use. Ultimately, the risk tolerance of each organization is different; management teams will need to determine what level of cybersecurity maturity is appropriate in their domain and balance the needs of their organization against the larger implications of a cybersecurity incident in their critical infrastructure domain.
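The five-level progression can be sketched programmatically. The following is our illustration only: the one-line glosses on each level and the dimension names are hypothetical assumptions, not characterizations from the model itself. Echoing the caution in the safety culture literature that averaging maturity scores may be inappropriate, the sketch reports the weakest assessed dimension rather than a mean:

```python
from enum import IntEnum

class CultureLevel(IntEnum):
    """The five maturity levels, ordered from least to most mature.
    The one-line glosses below are our illustrative paraphrases."""
    VULNERABLE = 1  # cybersecurity seen as imposed; little local ownership
    REACTIVE = 2    # action taken only after an incident occurs
    COMPLIANT = 3   # controls exist chiefly to satisfy policy and regulation
    PROACTIVE = 4   # problems are anticipated before they occur
    RESILIENT = 5   # secure practices are the self-sustaining norm

def overall_level(dimension_scores):
    """Report the weakest assessed dimension rather than an average,
    since averaging maturity scores can mask serious gaps."""
    return min(dimension_scores.values())

# Hypothetical assessment across three illustrative dimensions
scores = {
    "patching discipline": CultureLevel.PROACTIVE,
    "incident reporting": CultureLevel.COMPLIANT,
    "management commitment": CultureLevel.REACTIVE,
}
print(overall_level(scores).name)  # REACTIVE
```

Using the minimum rather than the mean makes the model conservative: an organization with excellent patching but reactive management commitment is reported as Reactive, which matches the intuition that culture is only as strong as its weakest shared assumption.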

Contribution

Reid and Van Niekerk (2014) previously investigated the concept of cybersecurity culture and were deliberate in avoiding a prescriptive definition, instead recommending that cybersecurity culture “should be defined to suit particular contexts.” To build on Muegge and Craigen’s (2015) call to action regarding the development of a culture of cybersecurity, we investigated the construct of organizational culture in more depth and reviewed current applications of Schein’s (1985) model (information security culture and safety culture). As a result of our research, we propose a new definition of a culture of cybersecurity and put forward a draft cybersecurity culture maturity model for verification in critical infrastructure environments. This draft model is by no means comprehensive; it requires empirical validation. As such, we encourage readers to review the proposed framework and test it within their environments to determine whether it assesses their cybersecurity culture and assists them in determining the next steps required to improve their cybersecurity. The use of such a tool may indirectly provide insight into whether or not an organization is effective in meeting its requirements regarding safety and cybersecurity as a whole.

Conclusion

This article proposes a potential means for increasing cybersecurity in critical infrastructure environments where all potential cybersecurity vulnerabilities cannot be predicted: the development of a cybersecurity culture. Through our research, we provide two contributions to theory. First, we provide an updated definition of cybersecurity culture that is appropriate for use across sectors in critical infrastructure. Second, we have created a draft Cybersecurity Culture Maturity Model for industry review. We have reviewed aspects of culture that are relevant to critical infrastructure, including organizational culture, information security culture, safety culture, and the associated frameworks and capability maturity models. After reviewing the key elements and benefits of each, we selected and updated the most explicit information security culture definition (Da Veiga & Eloff, 2010) and safety culture definition (Lee, 1996) for use as a baseline definition of cybersecurity culture. Further, after reviewing the associated frameworks, we generated a draft modified maturity model suitable for industry professionals to begin testing their cybersecurity culture. This model represents a possible starting point for top management teams to evaluate their culture and associated best practices.

There are limitations to this paper; namely, the cybersecurity culture maturity model is a draft requiring verification. Furthermore, it would be worthwhile expanding the model to include an audit instrument, conducting audits using the instrument in critical infrastructure environments, and comparing the results from multiple audits in order to identify trends that transect critical infrastructure fields. We encourage readers to verify the model in their environments and provide feedback, such that a useful tool for cybersecurity may be generated.

Biography

This working draft paper was co-written by Ariana Bacic.

References

Advisory Committee on the Safety of Nuclear Installations (ACSNI). 1993. Study Group on Human Factors, Third Report: Organising for Safety. HSMO, London.

Chang, S. E., & Lin, C. 2007. Exploring organizational culture for information security management. Industrial Management & Data Systems, 107(3): 438–458

Chatman, J. A., & Cha, S. E. 2003. Leading by leveraging culture. California Management Review, 45(4): 20–34.

Craigen, D. 2014. Assessing Scientific Contributions: A Proposed Framework and Its Application to Cybersecurity. Technology Innovation Management Review, 4(11): 5–13.

Da Veiga, A., & Eloff, J. H. P. 2010. A framework and assessment instrument for information security culture. Computers & Security, 29(2): 196–207.

Da Veiga A, Martins N, Eloff JHP. 2007. Information security culture – validation of an assessment instrument. Southern African Business Review, 11(1):146–66.

Douba, N., Rütten, B., Scheidl, D., Soble, P., & Walsh, D. 2014. Safety in the Online World of the Future. Technology Innovation Management Review, 4(11): 41–48. http://timreview.ca/article/849

Filho, A. P. G., Andrade, J. C. S., & Marinho, M. M. de O. 2010. A safety culture maturity model for petrochemical companies in Brazil. Safety Science, 48(5): 615–624.

Fleming, M. 2007. Developing safety culture measurement tools and techniques based on site audits rather than questionnaires. Saint Mary’s University. http://pr-ac.ca/files/files/133_FinalReport_12May07.pdf.

Foster, P & Hoult, S. 2013. The Safety Journey: Using a Safety Maturity Model for Safety Planning and Assurance in the UK Coal Mining Industry. Minerals, 3: 59-72.

Gordon, R., Kirwan, B., Perrin, E., 2007. Measuring safety culture in a research and development centre: a comparison of two methods in the Air Traffic Management domain. Safety Science, 45: 669–695.

Guldenmund, F. W. 2000. The nature of safety culture: a review of theory and research. Safety Science, 34(1–3): 215–257.

Haviland, W., Fedorak, S. A., Lee, R. B. 2009. Cultural Anthropology, Third Canadian Edition. Ontario, Canada: Nelson Education.

Helmreich, R. L., & Merritt, A. C. 2001. Culture at Work in Aviation and Medicine: National, Organizational and Professional Influences. Aldershot: Ashgate.

Hudson, P. 2001. Aviation safety culture. Keynote address at Safeskies Conference: Canberra, November.

ISO27002. 2013. Information technology — Security techniques — Code of practice for information security controls. https://www.iso.org/obp/ui/#iso:std:iso-iec:27002:ed-2:v1:en, October 1.

International Nuclear Safety Advisory Group (INSAG). 1991. Safety Culture, Safety Series No 75- INSAG-4, Vienna:  International Atomic Energy Authority.

International Telecommunication Union (ITU). 2008. Overview of Cybersecurity. Recommendation ITU-T X.1205. Geneva: International Telecommunication Union.

Karlsson, F., Åström, J., & Karlsson, M. 2015. Information security culture – state-of-the-art review between 2000 and 2013. Information and Computer Security, 23(3): 246–285

Keenan, V., Kerr, W., & Sherman, W. 1951. Psychological climate and accidents in an automotive plant. Journal of Applied Psychology, 35(2): 108-111.

Lee, T. R. 1996. Perceptions, attitudes and behaviour: the vital elements of a safety culture. Health and Safety, 10: 1–15.

Leveson, N. 2013. Engineering a Safer World. Cambridge, MA: MIT Press.

Lundy, O., & Cowling, A. 1996. Strategic Human Resource Management. London: Routledge.

Martins, A., Eloff, J.H.P. 2002. Assessing Information Security Culture. ISSA Proceedings, Muldersdrift: South Africa.

Miron, W., & Muita, K. 2014. Cybersecurity Capability Maturity Models for Providers of Critical Infrastructure. Technology Innovation Management Review, 4(10): 33-39.

Miron, W. 2015. Adoption of Cybersecurity Capability Maturity Models in Municipal Governments. Unpublished Masters Dissertation. Carleton University, Ottawa.

Muegge, S. & Craigen, D. 2015. A Design Science Approach to Constructing Critical Infrastructure and Communicating Cybersecurity Risks. Technology Innovation Management Review, 5(6): 6-16. 

O’Donovan, G. 2006. The corporate culture handbook: how to plan, implement and measure a successful culture change. Ashbrook House, Ireland: The Liffey press.

Okere, I., Niekerk, J. van, & Carroll, M. 2012. Assessing Information Security Culture: A Critical Analysis of Current Approaches. Information Security for South Africa: 1–8.

Parker, D., Lawrie, M., & Hudson, P. 2006. A framework for understanding the development of organisational safety culture. Safety Science, 44: 551–562.

Paulk, M. C., Curtis, B., Chrissis, M. B., & Weber, C. V. 1993. Capability maturity model, version 1.1. IEEE Software, 10(4): 18–27.

Ramachandran, S., Rao, C., Goles, T., & Dhillon, G. 2012. Variations in Information Security Cultures across Professions: A Qualitative Study. Communications of the Association for Information Systems, 33(11): 163-204.

Reason, J.T., 1997. Managing the Risks of Organizational Accidents. Ashgate, Aldershot.

Reid, R., & Van Niekerk, J. 2014. Towards an Education Campaign for Fostering a Societal, Cyber Security Culture. Proceedings of the Eighth International Symposium on Human Aspects of International Security & Assurance (HAISA).

Schein, E. H.  1985. Organizational culture and leadership. San Francisco, CA: Jossey-Bass.

Schein, E. H. 1996. Three cultures of management: The key to organizational learning. MIT Sloan Management Review, 38(1): 9.

Thomson, K. & Von Solms, R. 2006. Towards an Information Security Competence Maturity Model. Computer Fraud & Security, 5: 11-15.

The University of Queensland. 2008. Minerals Industry Risk Management Maturity Chart. Brisbane, Australia: University of Queensland Minerals Industry Health & Safety Centre.

Van Niekerk, J. F., & Von Solms, R. 2010. Information security culture: A management perspective. Computers & Security, 29(4): 476–486.

Von Solms, R., & Van Niekerk, J. 2013. From information security to cyber security. Computers & Security, 38: 97-102.

Westrum, R., 1993. Cultures with requisite imagination. In: Wise, J.A., Hopkin, V.D., Stager, P. (Eds.), Verification and Validation of Complex Systems: Human Factors Issues: 401-416. New York: Springer-Verlag.

Whitman, M.E., & Mattord, H.J. 2009. Principles of Information Security, 3rd ed. Kennesaw State University: Thomson Course Technology.

Whitman, M.E., & Mattord, H.J. 2003. Principles of Information Security. Kennesaw State University: Thomson Course Technology.

Young, W., & Leveson, N. 2013. System Thinking for Safety and Security. Proceedings of the 2013 Annual Computer Security Applications Conference (ACSAC 2013).

About the author: elizabeth
