Spear Phishing in Critical Infrastructure: Potential Applications of Mixed Methods Action Research

“Amateurs hack systems, professionals hack people.”  – Bruce Schneier, American cryptographer and writer.

Abstract

Spear phishing is a growing attack vector in critical infrastructure sectors. Most cybersecurity literature focuses on presenting technical solutions to technical problems; because spear phishing is a social engineering attack that encompasses both technical and non-technical aspects, it has been largely overlooked in that literature. Existing literature has also shown training exercises alone to be ineffective. When applied using a rigorous methodology, Action Research, a means by which action and research can be achieved concurrently, can produce relevant results for the researcher-practitioner addressing a real-world problem. When combined with a mixed methods (or multi-method) approach, both qualitative and quantitative data can be generated, providing insight into the course of action required to minimize the number of successful attacks. Given the evolving complexity of spear phishing and the constraints of each organization, mixed methods action research (MMAR) is proposed as a viable approach for organizations to improve their cybersecurity by generating insights that extend “beyond the numbers.” This paper contributes to the literature by identifying one area that currently defies traditional research approaches (spear phishing) and demonstrating that MMAR has the potential to help organizations address this real-world problem while generating new research insights.

Introduction

A real-world problem that currently defies traditional research is spear phishing. Phishing is a type of social engineering attack whose purpose is to entice people into sharing sensitive information, such as their login credentials to personal accounts, or into providing access to their computer systems. It is most often carried out through emails requesting that the recipient visit a fraudulent web site, open an infected attachment, or install (malicious) software on their computer. Spear phishing is defined as a focused, targeted attack via email on a specific person or organization, with the goal of penetrating their defenses (Sjouwerman, 2015). A still more targeted attack is Business Email Compromise (BEC), also referred to as whaling, human hacking, or CEO fraud, in which a carefully crafted message – a “spear” – is sent to a specific employee at an opportune moment (Yu, 2016).

As more data breaches occur and critical information falls into the hands of potential spear phishers, it becomes increasingly easy to target specific entities and individuals. A data breach announced in 2015, targeting the records of “all personnel data for every federal employee, every federal retiree, and up to one million former federal employees” (up to 18 million people) held by the United States Office of Personnel Management (OPM), has resulted in the compromise of information that ideally positions spear phishers to identify and target government personnel (Perez & Prokupecz, 2015). The information lost in the breach includes security clearance interview information, which for higher clearance levels (secret, top secret, and beyond) requires that the interviewees (the employee and their references) disclose sensitive information that could be used for blackmail at a later date. This presents a real risk to any organization, as information that only a trusted source should have access to can be used in a spear phishing campaign to create increasingly focused, and thereby more effective, attacks – whether the objective is convincing a CFO to wire money to a foreign country (Hubbard, 2015), stealing sensitive research and development information (Gallagher, 2011), or other nefarious activities.

The Problem: Spear Phishing

Spear phishing attacks are often successful because they use customized, seemingly credible emails that appear to come from a trusted source. To increase the likelihood of success, these attacks are generally preceded by careful research on the target (Parmar, 2012). Purkait (2012) conducted a comprehensive literature review, analyzing 16 doctoral theses and 358 papers related to phishing based on their research focus, empirical basis, and proposed countermeasures. The findings reveal:

[…] different approaches proposed in past are all preventive by nature. Phishers continually target the weakest link in the security chain, namely consumers, in their attacks. Various usability studies have demonstrated that neither server-side security indicators nor client-side toolbars and warnings are successful in preventing vulnerable users from being deceived (Purkait, 2012: 407).

This is due to phishers being able to convincingly imitate the appearance of legitimate web sites, users tending to ignore security indicators or warnings, and users not necessarily interpreting security cues appropriately (Purkait, 2012).

Spear phishing is a significant issue in critical infrastructure environments. In fiscal year 2014, ICS-CERT received 245 incident reports, of which spear phishing accounted for 17% of the known access vectors; that figure increased to 37% in fiscal year 2015 (Department of Homeland Security, 2014; 2015). Despite these reports identifying spear phishing as the single largest attributable access vector in two consecutive years, the two guidance reports provided by the Department of Homeland Security focus exclusively on technical protection issues and solutions. The paper “Seven Steps to Effectively Defend Industrial Control Systems”, published in 2015, presents seven strategies that can be implemented to counter common exploitable weaknesses in “as-built” control systems, none of which address reducing spear phishing attacks. A passing reference to security training is buried within the paper “Recommended Practice: Improving Industrial Control Systems Cybersecurity with Defense-In-Depth Strategies” (2009), advising that CI operators should leverage “common security awareness programs, such as those listed in NIST SP800-50, Building an Information Technology Security Awareness and Training Program,” while acknowledging that “Formal training can often be cost prohibitive, but good information can be gleaned from books, papers, and websites on cyber and industrial control systems security” (Department of Homeland Security, 2009: 27).

Recent research has indicated that most security training is insufficient: it occurs infrequently (typically annually), and its effect is often lost in as little as 28 days (Kumaraguru et al., 2009). In 2010, the Institute for Information Infrastructure Protection (I3P) initiated research that addressed ways to accomplish two key industry goals: reduce vulnerability to spear phishing by training employees to recognize it, and improve an organization’s “security culture” by encouraging employees to report spear phishing incidents (Caputo et al., 2014). The research consisted of deploying a test spear phishing email and providing immediate embedded training if the recipient failed the test (clicked on a link in the email). While the majority of interviewees believed that the kind of embedded security awareness and training employed in the study is more effective than the once-yearly mandatory training they received, the study indicated that immediate feedback and tailored framing were insufficient to reduce click rates or increase reporting (Caputo et al., 2014).
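To make the mechanics of such an exercise concrete, the embedded-training test flow described above can be sketched in a few lines of Python. The sketch below is illustrative only: the data structure, the function names, and the tracking mechanism are all hypothetical, and a real deployment would rely on purpose-built phishing-simulation tooling rather than a hand-rolled script.

```python
# Minimal sketch of the I3P-style embedded-training test flow described above:
# a test email is deployed, and recipients who click receive immediate training.
# All names here are hypothetical placeholders, not a real toolkit API.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Recipient:
    email: str
    clicked: bool = False
    trained: bool = False

def run_embedded_training_test(recipients: List[Recipient],
                               send_test_email: Callable[[str], bool],
                               deliver_training: Callable[[str], None]) -> float:
    """Send a simulated spear phishing email; train anyone who clicks.
    Returns the overall click rate for the quantitative strand of the study."""
    for person in recipients:
        person.clicked = send_test_email(person.email)  # True if the link was clicked
        if person.clicked:
            deliver_training(person.email)              # immediate, embedded training
            person.trained = True
    return sum(p.clicked for p in recipients) / len(recipients)
```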

A Potential Solution: Mixed Methods Action Research (MMAR)

Action research (AR) is a generic term that covers many forms of action-oriented research; it is referred to by multiple names, including participatory research, collaborative inquiry, emancipatory research, action learning, and contextual action research. Its outcomes are both action and research, unlike traditional positivist science, which aims only at creating knowledge. Described as “an interventionist approach to the acquisition of scientific knowledge that has foundations in the post-positivist tradition” (Baskerville & Wood-Harper, 1996), AR aims to “solve current practical problems while expanding scientific knowledge” (Myers, 2009: 59).

The concept of “action research” is surprisingly old; Kurt Lewin, a German-American social psychologist and educator, is generally credited with coining the term to describe his work in the mid-1940s. In the mid-1950s, action research was attacked as “unscientific, little more than common sense, and the work of amateurs” (McFarland & Stansell, 1993: 15). Action research suffered a further decline in favour during the 1960s because of its association with radical political activism (Stringer, 2007: 9). As experiments with research designs and quantitative data collection became the norm, general interest in action research waned. It re-emerged in education and social sciences research during the 1970s, as practitioners questioned the applicability of scientific research designs and methodologies as a means of solving education issues; this shift in perspective was the result of many federally funded projects that were seen as overly theoretical and not grounded in practice. AR has since moved into other areas where researcher-practitioners wish to solve real-world problems and generate new knowledge concurrently; action research in organizational science began to appear around the turn of this century (Coughlan & Coghlan, 2002).

Coughlan & Coghlan (2002) reviewed the work of multiple authors in order to create a list of four broad characteristics of action research: (1) it focuses on research in action, rather than research about action; (2) it is participative; (3) it is research concurrent with action; and (4) it is both a sequence of events and an approach to problem solving (Coughlan & Coghlan, 2002: 202). Westbrook (1995) presented AR as an approach that could overcome three deficiencies associated with traditional research topics and methods: (1) it has broad relevance to practitioners; (2) it is applicable to unstructured or integrative issues; and (3) it can contribute to theory. In summary, the desired outcomes of the AR approach are “not just solutions to the immediate problems but important learning from outcomes both intended and unintended, and a contribution to scientific knowledge and theory” (Coughlan & Coghlan, 2002: 203).

Action research is often associated with a pragmatic worldview; namely, using whatever approach best aids the researcher. Pragmatism, in turn, is often associated with mixed methods (Creswell, 2009); thus, the use of a mixed methods approach in action research is often appropriate. Action research studies that employ both quantitative and qualitative research methods within a mixed methods approach are referred to as Mixed Methods Action Research (MMAR) studies (Ivankova, 2014). Mixed methods research is research “in which the investigator collects and analyzes data, integrates the findings, and draws inferences using both qualitative and quantitative approaches or methods in a single study or program of inquiry” (Tashakkori & Creswell, 2007: 4), and differs drastically from the mono-method approach of employing either quantitative or qualitative methods. Some researchers argue that it is inappropriate to integrate quantitative and qualitative methods due to fundamental differences in the philosophical paradigms underlying those methods; this belief that the two methods are incompatible is commonly referred to as the incompatibility thesis (Teddlie & Tashakkori, 2009). The fundamental principle of mixed methods research implies, contrary to this belief, that “methods should be mixed in a way that has complementary strengths and non-overlapping weaknesses” (Johnson & Turner, 2003: 299). Many of the concerns raised by incompatibility theorists can be addressed by adjusting the relative importance (priority or weighting) of the quantitative and qualitative methods used to answer the study’s questions (Creswell & Plano Clark, 2011; Teddlie & Tashakkori, 2009).

The linking of the terms action and research highlights the essential features of action research (Kemmis & McTaggart, 1988). Typically, action research is characterized by spiralling cycles of problem identification, systematic data collection, reflection, analysis, data-driven action, and, finally, problem redefinition. Lewin (1946) described the process as cyclical, involving a “non-linear pattern of planning, acting, observing, and reflecting on the changes in the social situations” (Noffke & Stevenson, 1995: 2). His approach involves a spiral of steps, “each of which is composed of a circle of planning, action and fact-finding about the result of the action” (Lewin, 1946: 206). The basic cycle is depicted in Figure 1.

Figure 1 – Lewin Action Research Spiral, circa 1946

 

This process has been refined by multiple researchers, resulting in many action research spirals now being available to researchers (Smith et al., 2007). They are generally canonical in that the cycles are repeated until minimal additional insight can be generated and the researcher deems the project complete (Ivankova, 2014). The processes range from the simple look-think-act AR spiral put forward by Stringer (1999) (Figure 2) to rigorous and complex AR spirals, such as the model put forward by Coughlan & Coghlan (2002) (Figure 3).

 

 

Figure 2 – Look-Think-Act Action Research Cycle (Stringer, 1999)

 

 

Figure 3 – Canonical Action Research Cycle (Adapted from Coughlan & Coghlan, 2002)

 

The diversity in theory and practice among action researchers provides a wide choice for potential action researchers as to what might be appropriate for their specific research question (Reason & Bradbury, 2001). Although the number of stages can vary, a five-stage model of (1) problem diagnosis, (2) action planning, (3) action taking, (4) evaluation, and (5) learning is common in AR cycles (Baskerville & Wood-Harper, 1996). The most recent and comprehensive spiral is that of Ivankova (2014) (Figure 4). It follows a similar format, showing the canonical approach to the research, which ideally continues until no further information can be generated and the research objectives have been met.

 

Figure 4 – Mixed Methods Action Research (MMAR) Cycle (Ivankova, 2014)

 
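Viewed procedurally, these spirals amount to an iterative loop with a stopping criterion. The following Python sketch illustrates that structure using the five-stage model above; every callable is a hypothetical placeholder for an activity performed by the practitioner-researcher, not an executable research protocol.

```python
# Illustrative sketch of a canonical (MM)AR spiral as an iterative loop.
# Stage names follow the five-stage model (diagnosing, planning, acting,
# evaluating, learning); all callables are hypothetical placeholders.

from dataclasses import dataclass
from typing import Any, Callable, List

@dataclass
class CycleRecord:
    """Insights captured during one pass through the spiral."""
    cycle: int
    diagnosis: Any
    plan: Any
    evidence: Any   # mixed quantitative and qualitative findings
    learning: Any

def run_mmar_spiral(diagnose: Callable, plan: Callable, act: Callable,
                    evaluate: Callable, learn: Callable,
                    objectives_met: Callable, max_cycles: int = 10) -> List[CycleRecord]:
    """Repeat the five stages until the research objectives are met
    (minimal additional insight remains) or a practical cycle limit is reached."""
    history: List[CycleRecord] = []
    for cycle in range(1, max_cycles + 1):
        diagnosis = diagnose(history)      # (re)define the problem using prior learning
        action_plan = plan(diagnosis)      # plan informed by mixed methods inferences
        act(action_plan)                   # implement the intervention with participants
        evidence = evaluate(action_plan)   # gather quantitative + qualitative evidence
        learning = learn(evidence)         # specify learning for practice and theory
        history.append(CycleRecord(cycle, diagnosis, action_plan, evidence, learning))
        if objectives_met(history):
            break
    return history
```

The explicit history argument reflects the canonical requirement that each cycle's diagnosis builds on the learning generated by previous cycles.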

Problem + Solution: MMAR and Spear Phishing

As noted above, spear phishing – particularly attacks against critical infrastructure – is a real-world problem that currently cannot be solved using a traditional, positivist research approach. An MMAR approach would be ideal, as qualitative and quantitative methods combined will likely generate the insights required to adapt internal training and thereby help reduce click rates on spear phishing emails. Repeating the I3P approach within an action research cycle would allow researchers the opportunity to test different variables based on the insights generated through each cycle. More specifically, researchers could conduct a test spear phishing campaign, analyze the quantitative results (click rates), and seek out qualitative input (interviews with individuals who clicked). This would be research in action, rather than research about action; it would be participative (research performed with the participants, not on the participants); the research would be concurrent with action; and it would be both a sequence of events and an approach to a problem – thus meeting the characteristics identified by Coughlan & Coghlan (2002). Using the stages proposed by Ivankova (2014), a possible research approach would be as follows:

Phase 1 – Diagnosing

Description: Practitioner-researchers conceptualize the problem that requires a solution in the workplace or other community setting, and identify the rationale for investigating it using both quantitative and qualitative methods (Ivankova, 2014).

Action items:

• Carefully craft a question (or multiple questions) that permits the investigation of the issue in a format that permits both action and research.

• Determine which mixed methods designs are most appropriate for resolving the issue of spear phishing for the organization.

• Provide a rationale for investigating the issue using both qualitative and quantitative methods.

Outputs:

• Qualitative interviews with employees at all levels, to obtain insight into perceived problems.

• A quantitative survey, to obtain insight into actual understanding.

• An argument that the two methods – whether concurrent or consecutive – will provide more insight than a mono-method approach.

Phase 2 – Reconnaissance (fact finding)

Description: A preliminary assessment of the identified problem or issue is conducted using mixed methods in order to develop a plan of action/intervention (Ivankova, 2014).

Action items:

• Conduct a literature review.

• Outline the current understanding of issues related to spear phishing within the organization.

Outputs:

• A preliminary assessment using qualitative or quantitative measures, and a review of the facts identified, in order to develop a plan of action/intervention.

Phase 3 – Planning

Description: An action/intervention plan is developed based on mixed methods inferences from the reconnaissance phase (Ivankova, 2014).

Action item:

• Develop the action/intervention plan.

Outputs:

• An action/intervention plan that models the I3P approach, whereby a test is issued and the results are followed up via qualitative interviews.

• Supplemental training, to build on pre-existing knowledge regarding the poor retention of spear phishing training.

Phase 4 – Acting

Description: The action/intervention plan is implemented (Ivankova, 2014).

Action items (if the I3P model is followed):

• Deploy a test to generate quantitative results.

• Split testing of a group receiving supplemental training vs. a group receiving no training.

Outputs:

• Data regarding the efficacy of Group A (training) vs. Group B (no advanced training).

Phase 5 – Evaluation

Description: The action/intervention is evaluated using mixed methods to learn whether it produces the desired outcomes (Ivankova, 2014).

Action items:

• Follow up with interviews to generate qualitative insights regarding the efficacy of the spear phishing campaigns.

• Review the insights generated through the campaign (both qualitative and quantitative data).

Outputs:

• An educated opinion on the efficacy of the current action/intervention plan and whether it needs adjustment.

Phase 6 – Monitoring

Description: Practitioner-researchers decide whether revision or further testing of the action/intervention plan is needed, based on mixed methods inferences from the evaluation phase (Ivankova, 2014).

Action items:

• Determine whether revision or further testing is needed (i.e., has the click rate of spear phishing campaigns been sufficiently reduced, or is additional and/or different training/intervention required?).

Outputs:

• If additional research is required, inferences from the evaluation phase are used to determine the next course of action.

 
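As a concrete illustration of the quantitative strand running through the Acting and Evaluation phases, the click rates of the trained group (Group A) and untrained group (Group B) can be compared with a standard two-proportion z-test. The sketch below uses only the Python standard library; the counts are invented for illustration and carry no empirical weight.

```python
import math

def two_proportion_z_test(clicks_a, n_a, clicks_b, n_b):
    """Two-sided z-test for a difference in click rates between two groups."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)              # pooled proportion
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal CDF, Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical counts for illustration only: 18 of 120 trained employees
# clicked the test email, versus 34 of 118 untrained employees.
p_a, p_b, z, p = two_proportion_z_test(18, 120, 34, 118)
print(f"Group A click rate: {p_a:.1%}, Group B click rate: {p_b:.1%}")
print(f"z = {z:.2f}, two-sided p = {p:.4f}")
```

A statistically significant difference would feed the Monitoring phase decision, while the follow-up interviews supply the qualitative context that the test statistic alone cannot.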

Conclusion

Mixed Methods Action Research (MMAR) is an approach that, to our knowledge, has not yet been applied to further our understanding of cybersecurity or cybersecurity training. MMAR could be a good fit for the issue of spear phishing: a cyclical approach that combines both qualitative and quantitative methods would allow a practitioner-researcher to develop a solution suitable for their environment while generating valuable knowledge for broader application to cybersecurity problems.

This article provides one more tool for researchers to consider when determining which approach is best suited to their environment. MMAR has the potential to be a good fit for complex problems that have not been solved using a positivist approach, such as spear phishing. If applied in the above-mentioned scenario, it would likely meet the overarching objective of solving a current practical problem while expanding scientific knowledge (Baskerville & Myers, 2004).

There are limitations to this discussion; specifically, it is theoretical in nature. While this article may provide the foundation for a potential MMAR study in cybersecurity, it does not itself provide these insights or a solution. Cybersecurity managers and trainers are encouraged to consider this approach when tackling real-world problems that cannot be addressed using traditional, positivist research.

References

Baskerville, R., & Myers, M. D. 2004. Special Issue on Action Research in Information Systems: Making IS Research Relevant to Practice – Foreword. MIS Quarterly, 28(3): 329–335.

Baskerville, R. L., & Wood-Harper, A. T. 1996. A critical perspective on action research as a method for information systems research. Journal of Information Technology, 11(3): 235–246.

Caputo, D. D., Pfleeger, S. L., Freeman, J. D., & Johnson, M. E. 2014. Going Spear Phishing: Exploring Embedded Training and Awareness. IEEE Security Privacy, 12(1): 28–38.

Coughlan, P., & Coghlan, D. 2002. Action research for operations management. International Journal of Operations & Production Management, 22(2): 220–240.

Creswell, J. W., & Plano Clark, V. L. 2011. Designing and Conducting Mixed Methods Research. SAGE Publications Inc.

Creswell, J. W. 2009. Research Design: Qualitative, Quantitative, and Mixed Methods. Los Angelos: SAGE Publications.

Perez, E., & Prokupecz, S. 2015, June 22. U.S. government hack could actually affect 18 million. CNN. http://www.cnn.com/2015/06/22/politics/opm-hack-18-milliion/index.html.

Gallagher, S. 2011, November 1. “Nitro” spear-phishers attacked chemical and defense company R&D. Ars Technica. http://arstechnica.com/business/2011/11/nitro-spear-phishers-attacked-chemical-and-defense-company-rd/.

Hubbard, R. 2015, February 5. Impostors bilk Omaha’s Scoular Co. out of $17.2 million. Omaha.com. http://www.omaha.com/money/impostors-bilk-omaha-s-scoular-co-out-of-million/article_25af3da5-d475-5f9d-92db-52493258d23d.html.

Ivankova, N. V. 2014. Mixed methods applications in action research: From methods to community action. SAGE Publications.

Johnson, B., & Turner, L. A. 2003. Data collection strategies in mixed methods research. Handbook of Mixed Methods in Social and Behavioral Research, 297–319.

Kumaraguru, P., Cranshaw, J., Acquisti, A., Cranor, L., Hong, J., et al. 2009. School of Phish: A Real-world Evaluation of Anti-phishing Training. Proceedings of the 5th Symposium on Usable Privacy and Security, 3:1–3:12. New York, NY: ACM.

Lewin, K. 1946. Action Research and Minority Problems. The Journal of Social Issues, 2(4): 34–46.

McFarland, K. P., & Stansell, J. C. 1993. Historical perspectives. In L. Patterson, C. M. Santa, C. G. Short, & K. Smith (Eds.), Teachers Are Researchers: Reflection and Action. Newark, DE: International Reading Association.

Kemmis, S., & McTaggart, R. 1988. The Action Research Planner. Geelong, Australia: Deakin University Press.

Myers, M. D. 2009. Qualitative Research in Business and Management. London: SAGE Publications.

Noffke, S. E., & Stevenson, R. B. 1995. Educational action research: Becoming practically critical. Teachers College Press.

Parmar, B. 2012. Protecting against spear-phishing. Computer Fraud & Security, 2012(1): 8–11.

Purkait, S. 2012. Phishing counter measures and their effectiveness – literature review. Information Management & Computer Security, 20(5): 382–420.

Reason, P., & Bradbury, H. 2001. Handbook of action research: Participative inquiry and practice. SAGE Publications.

Sjouwerman, S. 2015. Confronting “Spear Phishing” Etc. Privacy Journal, 41(7): 3–4.

Stringer, E. T. 1999. Action Research (2nd ed.). Thousand Oaks, CA: SAGE Publications.

Stringer, E. T. 2007. Action Research (3rd ed.). Thousand Oaks, CA: SAGE Publications.

Tashakkori, A., & Creswell, J. W. 2007. Editorial: The New Era of Mixed Methods. Journal of Mixed Methods Research, 1(1): 3–7.

Teddlie, C., & Tashakkori, A. 2009. Foundations of mixed methods research: Integrating quantitative and qualitative approaches in the social and behavioral sciences. SAGE Publications Inc.

Westbrook, R. 1995. Action research: a new paradigm for research in production and operations management. International Journal of Operations & Production Management, 15(12): 6–20.

Yu, R. 2016, April 6. Cyber criminals go spear phishing, harpoon executives. Third Certainty. http://thirdcertainty.com/infographics/cyber-criminals-go-spear-phishing-harpoon-executives/.

Yue, C., & Wang, H. 2010. BogusBiter: A Transparent Protection Against Phishing Attacks. ACM Transactions on Internet Technology, 10(2): 1-31.

Department of Homeland Security. 2015. ICS-CERT Monitor: November–December 2015. https://ics-cert.us-cert.gov/sites/default/files/Monitors/ICS-CERT%20Monitor_Nov-Dec2015_S508C.pdf.

Department of Homeland Security. 2009. Recommended Practice: Improving Industrial Control Systems Cybersecurity with Defense-In-Depth Strategies. https://ics-cert.us-cert.gov/sites/default/files/recommended_practices/Defense_in_Depth_Oct09.pdf.

Department of Homeland Security. 2015. Seven Steps to Effectively Defend Industrial Control Systems. https://ics-cert.us-cert.gov/sites/default/files/documents/Seven%20Steps%20to%20Effectively%20Defend%20Industrial%20Control%20Systems_S508C.pdf.

Cybersecurity Culture: A Definition and Maturity Model For Critical Infrastructure

“If you get the culture right, most of the other stuff will just take care of itself.” – Tony Hsieh, CEO of Zappos

Abstract

As the world becomes more interconnected, cybersecurity continues to be a growing concern in critical infrastructure environments. Neither policies and procedures nor technical solutions can consider all possible scenarios; however, an environment where cybersecurity is considered the norm and all users are engaged in protecting systems may be able to mitigate these limitations. Organizational culture has been proposed as a means by which desired behavioural outcomes can be achieved. We investigate current applications of Schein’s (1985) three-part model to information security culture and safety culture, and propose a new definition of a culture of cybersecurity. Leveraging these insights and drawing on the concept of maturity models, we investigate whether it is possible to draft a cybersecurity culture maturity model. If successful, this would represent a contribution to the field of cybersecurity, as it would draw together multiple concepts from the information security, safety culture, and organizational culture literatures in order to address an issue relevant to top management teams and those in governance positions. It would also be of interest to national policymakers, as it highlights some of the inconsistencies in current practices regarding cybersecurity in critical infrastructure.

Introduction

Culture has been defined as a “system of shared values (defining what is important) and norms (defining appropriate attitudes and behaviors)” (Chatman & Cha, 2003: 21). Adapted from anthropology for organization management research, the concept of organizational culture has been identified as a way to encourage desired behaviours within an organization (Chang & Lin, 2007). The best-known and simplest definition of organizational culture is “the way things are done here” (Lundy & Cowling, 1996: 168); however, the concept is more complex and potentially more powerful when viewed in terms of its impact on behaviour within an organization. Cybersecurity is no exception: there have been recent calls to create a culture of cybersecurity (Muegge & Craigen, 2015). It is believed that, within a culture of cybersecurity, “groups and individuals could practice safe computing and would expect others to do so. IT systems would be promptly patched, and secure best practices would be the norm” (Muegge & Craigen, 2015: 13). In order to determine whether or not this is realistic, it is worth examining areas where similar calls to action to develop a “culture of” something have occurred, including safety and information security.

Information Security Culture (ISC) has been a topic of research since the turn of this century, with researchers putting forward a litany of definitions of ISC, theories on the roots of ISC, and frameworks for cultivating ISC. However, Karlsson et al. (2015) identify that there has been minimal research into the fruits of ISC. By contrast, the safety culture literature dates back as far as the early 1950s (Keenan et al., 1951); this concept is comparatively mature and extensively researched. Although the empirical research on the topic has progressed, theory has remained relatively stagnant for the past 20 years (Guldenmund, 2000). Despite this stagnation, a number of Safety Culture Maturity Models have emerged and have been applied to safety culture development within a number of “high hazard” industries, such as the aviation, rail, and petrochemical industries (Foster & Hoult, 2013). While Cybersecurity Capability Maturity Models have been successful in improving the “maturity and efficacy of controls employed to secure critical infrastructures” (Miron & Muita, 2014: 34), an equivalent maturity model for cybersecurity culture has not been developed – and it is uncertain whether the development of such a model would be beneficial.

In this article, we investigate whether a cybersecurity culture maturity model would provide managers with a tool to assess the cybersecurity culture within their organization. If successful, this would represent a contribution in the field of cybersecurity, as it could draw together multiple concepts from information security, safety culture, and management literature related to organizational culture, in order to address an issue relevant to top management teams and those in governance positions. It would also be of interest to national policymakers as it highlights some of the inconsistencies with the current practices regarding cybersecurity in critical infrastructure.

This document is organized as follows. In the first section, we provide a literature review outlining recent relevant literature associated with culture, organizational culture, information security culture, safety culture, and maturity models. In the second section, we provide our preferred construct for the term cybersecurity and present a new definition of cybersecurity culture. In the third section, we relate our definition to the original three-part definition of culture provided by Schein (1985) (espoused values, shared tacit assumptions, and artifacts) and the extension provided by Van Niekerk and Von Solms (2010) (knowledge). In the fourth section, we propose a five-level cybersecurity culture maturity model, discussing the strengths and weaknesses of this approach. In the final sections, we present our conclusions as well as areas for further research.

Literature Review

Organizational Culture

Culture is traditionally defined as consisting of “the abstract values, beliefs, and perceptions of the world that lie behind people’s behavior and that are reflected by their behaviour” (Haviland et al., 2009: 33). The concept of culture has subsequently been adopted by management literature under the label of organizational culture (or “corporate culture”), wherein culture is believed to influence the behaviour of employees and contribute to the effectiveness of an organization (Thomson & von Solms, 2006). Acceptable and unacceptable conduct within the context of corporate culture is determined through policies and procedures (Whitman & Mattord, 2003: 194). Schein’s (1985) model has had considerable influence in this domain; it defines organizational culture as “a pattern of shared tacit assumptions that was learned by a group as it solved its problems of external adaptation and internal integration, that has worked well enough to be considered valid and, therefore, to be taught to new members as the correct way to perceive, think, and feel in relation to those problems” (1985: 27). Schein (1985) describes culture as existing at three levels: artifacts, espoused values, and shared tacit assumptions. In a later paper, he described these levels as:

the level of deep tacit assumptions that are the essence of the culture, the level of espoused values that often reflect what a group wishes ideally to be and the way it wants to present itself publicly, and the day-to-day behavior that represents a complex compromise among the espoused values, the deeper assumptions, and the immediate requirements of the situation (Schein, 1996: 9).

It is widely believed that organizational culture consists of various subcultures that influence the overall culture, such as those of professional groups (Ramachandran, Rao, Goles & Dhillon, 2013) and national cultures (Helmreich & Merritt, 2001).

Safety Culture

Safety culture and safety climate constructs have been extensively discussed in the academic literature; the earliest paper we located on safety climate is Keenan et al. (1951). More recently, Guldenmund (2000) conducted a comprehensive review and identified 17 distinct definitions of safety culture (and the similar concept of safety climate), of which the majority are “global and therefore implicit” (2000: 229). In 1991, the report on safety culture by the International Nuclear Safety Advisory Group indicated that safety culture consists of two elements: (1) a necessary framework within an organisation, and (2) the attitude of staff at all levels in responding to and benefitting from the framework (INSAG, 1991). The 1993 Advisory Committee on the Safety of Nuclear Installations (ACSNI) definition (subsequently adopted by Lee in 1996) is the most explicit, defining safety culture as:

the product of individual and group values, attitudes, perceptions, competencies, and patterns of behaviour that determine the commitment to, and the style and proficiency of, an organisation’s health and safety management.

Multiple authors call for an integrative framework that takes into account the organizational culture as a whole (Gordon et al., 2007; Guldenmund, 2000).

Information Security Culture

Information Security Culture (ISC) has been a topic of research since the turn of this century, whereby researchers have put forward various definitions of ISC, theories on the roots of ISC, and frameworks for cultivating ISC (Karlsson et al., 2015). Van Niekerk & Von Solms (2010) adapted Schein’s (1985) three-level model of culture and added a fourth level – information security knowledge – which supports the other three levels. ISC has been defined as:

the attitudes, assumptions, beliefs, values and knowledge that employees/stakeholders use to interact with the organisation’s systems and procedures at any point in time. The interaction results in acceptable or unacceptable behaviour (i.e. incidents) evident in artifacts and creations that become part of the way things are done in the organisation to protect its information assets. This information security culture changes over time. (Da Veiga & Eloff, 2010: 198)

Karlsson et al. (2015) provide an excellent summary of the state-of-the-art in relation to information security culture research and point out that minimal research has demonstrated the fruits of Information Security Culture. The frameworks that exist for measuring ISC, of which Da Veiga and Eloff (2010) is the most comprehensive and extensively cited, are often unnecessarily complex and have not been widely adopted by industry. 

Summary

Organizational culture is complex, with multiple variables influencing the overall culture of an organization (profession, nationality, etc.). The concept of organizational culture has been adopted to help develop desirable traits, such as an information security culture or a safety culture. In more recent literature, there has been a call for a culture of cybersecurity (Muegge & Craigen, 2015). There is minimal literature relating to a culture of cybersecurity; the nearest parallels must instead be drawn from information security culture and safety culture. In the following section, we generate a definition for a culture of cybersecurity and explore whether it can be converted into an employable management tool.

Cybersecurity Culture – Background & Definition

Defining Cybersecurity Culture

The terms information security and cybersecurity are often used interchangeably; however, Von Solms & Van Niekerk (2013) suggest they are not analogous – namely, some cybersecurity threats do not form part of the formally defined scope of information security. Whitman & Mattord (2009) define information security as “the protection of information and its critical elements, including the systems and hardware that use, store, and transmit that information” (2009: 8). The international standard ISO/IEC 27002 (2005) defines information security as “the preservation of the confidentiality, integrity and availability of information”, whereby information can take on many forms: printed or written on paper, stored electronically, transmitted by post or electronic means, shown on films, conveyed in conversation, and so forth (2005: 1). In settings where historical information archives have not yet been digitized, this scope includes archival information. By contrast, cybersecurity has several “highly variable, often subjective, and at times, uninformative” definitions currently in circulation (Craigen et al., 2014: 13). The preferred definition of cybersecurity – the construct upon which a cybersecurity culture can be developed and defined – is that of the International Telecommunication Union (ITU):

Cybersecurity is the collection of tools, policies, security concepts, security safeguards, guidelines, risk management approaches, actions, training, best practices, assurance and technologies that can be used to protect the cyber environment and organization and user’s assets. Organization and user’s assets include connected computing devices, personnel, infrastructure, applications, services, telecommunications systems, and the totality of transmitted and/or stored information in the cyber environment. Cybersecurity strives to ensure the attainment and maintenance of the security properties of the organization and user’s assets against relevant security risks in the cyber environment. The general security objectives comprise the following: Availability, Integrity, which may include authenticity and non-repudiation, and Confidentiality (ITU, 2008). 

Next, it is important to understand pre-existing definitions of organizational culture. The original adaptation of Schein’s (1985) concept of organizational (corporate) culture by Van Niekerk & Von Solms (2010) is as follows:

 

Figure 1 – Levels of Information Security Culture (Van Niekerk & Von Solms, 2010, adapted from Schein, 1999)

For this paper, we believe that an adaptation of the definition of safety culture that aligns with both Schein (1985) and Van Niekerk & Von Solms (2010) is most appropriate. These source definitions can be summarized as follows:

Schein (1996) – Organizational culture: “A pattern of shared tacit assumptions that was learned by a group as it solved its problems of external adaptation and internal integration, which has worked well enough to be considered valid and, therefore, to be taught to new members as the correct way to perceive, think, and feel in relation to those problems.”

ACSNI (1993), Lee (1996) – Safety culture: “The product of individual and group values, attitudes, perceptions, competencies, and patterns of behaviour that determine the commitment to, and the style and proficiency of, an organisation’s health and safety management.”

Da Veiga & Eloff (2010) – Information security culture: “The attitudes, assumptions, beliefs, values and knowledge that employees/stakeholders use to interact with the organisation’s systems and procedures at any point in time. The interaction results in acceptable or unacceptable behaviour (i.e. incidents) evident in artifacts and creations that become part of the way things are done in the organisation to protect its information assets. This information security culture changes over time.”

 

We define cybersecurity culture as follows:

The attitudes, assumptions, beliefs, values and knowledge of individuals and groups that determine the commitment to, and proficiency of, an organization’s cybersecurity management at any point in time.

This definition incorporates the three-part definition of culture provided by Schein (1985), is sufficiently broad to encapsulate the fourth level (knowledge) added by Van Niekerk & Von Solms (2010), and draws on the Lee (1996) definition of safety culture.

Okere et al. (2012) summarize and explain the concepts of artifacts, espoused values, tacit assumptions, and knowledge in relation to information security; these can be adapted to cybersecurity as follows:

  • Artifacts: While this level is visible, it is difficult to interpret without additional insight from “insiders” within the organization. Examples include: cybersecurity handbooks, awareness courses, policies, choice of technology, behavior patterns, and language usage.
  • Espoused values: This layer is partially visible and unspoken, representing the purpose given by insiders of an organization regarding the observed artifacts. Examples include: documents that describe the values, principles and vision of the organization.
  • Shared tacit assumptions: If the organization continues to be successful, the beliefs and values become shared and taken for granted and form the core of the organization’s culture. No examples are provided, as by definition tacit assumptions are not outwardly expressed.
  • Knowledge: Contributing to the above levels, knowledge is believed to help in ensuring cybersecurity compliance. Examples of knowledge include: competence, knowledge sharing practices, access to internal and external resources regarding cybersecurity, learnings from previous cybersecurity events.

It is against this definition that we endeavored to determine whether the development of a cybersecurity culture maturity model for use within and across critical infrastructure sectors is possible.

Developing a Cybersecurity Culture Maturity Model

The objective of this paper is to explore whether a cybersecurity culture maturity model appropriate for critical infrastructure is possible. In this section, we explore the concept of cybersecurity capability maturity models, safety culture maturity models, and the potential for an amalgamation of the two in order to incorporate culture into existing capability models for cybersecurity. In particular, we focus on the safety culture literature because the “increasing pervasiveness of cyber-based systems and infrastructure, and society’s growing reliance on them, shifts the perspective concerning their proper operation from security to safety” (Douba et al., 2014: 41). Many researchers contend that security properties are included in safety properties (Leveson, 2013; Young & Leveson, 2013). Safety is often associated with minimizing unintended disruption, and security with minimizing intended disruption; both concepts affect the proper operation of cyber-based systems and infrastructure (Douba et al., 2014: 41). From this research, we intend to integrate the key elements of the selected safety culture maturity model(s) with cybersecurity requirements.

Cybersecurity Capability Maturity Models

The application of the Capability Maturity Model to software was first published in 1993 as a means by which organizations could measure the differences between immature and mature organizations (Paulk, Curtis, Chrissis, & Weber, 1993); the concept has since been adopted in a multiplicity of domains that seek to monitor and measure a logical progression. In cybersecurity research, Cybersecurity Capability Maturity Models (CCMMs) are plentiful, providing well-defined criteria against which managers can measure the maturity of their preparedness against cyber-threats (Miron, 2015; Miron & Muita, 2014); however, navigating the many standards is complicated and has both time and cost implications (Miron & Muita, 2014).

While well-defined criteria against which managers can measure the maturity of their cybersecurity capabilities exist, frameworks to help evaluate information security and/or cybersecurity culture are minimal and difficult to navigate. At the time of writing (2016), no maturity models for measuring the maturity of an organization’s cybersecurity culture were available in the academic, peer-reviewed literature – perhaps because of the normative assumptions underlying a maturity model geared towards assessing culture.

Safety Culture Maturity Models

A number of Safety Culture Maturity Models (referred to herein as SCMMs) have emerged in recent decades within the critical infrastructure space (Foster & Hoult, 2013). SCMMs originated from a mixture of organizational development models and the capability maturity models used in the software industry, in an effort to help organizations assess their existing safety cultures. Theoretically, SCMMs identify which actions must be taken by the organization in order to improve its culture (Filho et al., 2010). In this context, it is important to note that “maturity” refers specifically to the human elements and behaviours within the culture of an organization, not the maturity of the safety management systems themselves (Foster & Hoult, 2013). There are four well-cited SCMMs in the literature: Westrum (1993; 2004), Reason (1997), Fleming (2000), and Hudson (2001) / Parker, Lawrie, & Hudson (2006). Each of these models demonstrates the importance of a “strong relationship between the culture of an organization and the development of a systems approach” (Foster & Hoult, 2013: 63). However, “no available data indicates that all organisations follow a sequential maturation and also that the use of averages to determine the level of maturity is appropriate” (Filho et al., 2010: 616). There are concerns that a maturity model for evaluating culture may be normative and may distract from the overall purpose of developing an effective organizational culture (which, by extension, would ensure safety and cybersecurity). As such, the development of a “culture of cybersecurity” and a “cybersecurity culture maturity model” should be viewed as secondary to the overall development of an effective organization that prioritizes both safety and cybersecurity as part of its risk management strategy.

Cybersecurity Culture Maturity Model

If a cybersecurity culture maturity model appropriate for use by top management teams were developed, it would be sensible to begin by reviewing pre-existing safety culture maturity models and proceed by adding insights generated through a literature review and interviews with industry experts. In attempting this process, we began with the model proposed by Reason (1997), in order to show the progressive nature of building a cybersecurity culture across five levels: Pathological, Reactive, Bureaucratic, Proactive, and Generative. These five levels were renamed to increase “the accessibility of the framework to industry employees by including terms they would be familiar with” (Parker et al., 2006: 555), as follows: Vulnerable, Reactive, Compliant, Proactive, and Resilient. The five levels align with the Minerals Industry Risk Management (MIRM) Maturity Chart, a maturity model published by a team from the University of Queensland (2008) based on Hudson (2001). We adopted and modified the characterizations provided by Parker et al. (2006) in order to characterize each level in cybersecurity terms, as shown in Figure 2.

Figure 2 – Cybersecurity culture maturity model levels (Lance & Bacic, 2016; based on Reason, 1997; Parker et al., 2006)
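To suggest how such a model might eventually be operationalized for self-assessment, the sketch below encodes the five levels as an ordered scale in Python. The level descriptions paraphrase the renamed progression above; the assessment dimensions and the scoring rule – classifying by the weakest dimension rather than an average, in deference to the caution of Filho et al. (2010) about averaging – are our own illustrative assumptions, not part of any published model.

```python
from enum import IntEnum
from typing import Dict

class CultureLevel(IntEnum):
    """Ordered maturity levels, renamed from Reason (1997) via Parker et al. (2006)."""
    VULNERABLE = 1   # cybersecurity seen as a nuisance imposed from outside
    REACTIVE = 2     # action is taken only after incidents occur
    COMPLIANT = 3    # systems and policies are in place to manage known risks
    PROACTIVE = 4    # emerging threats are anticipated; the workforce is engaged
    RESILIENT = 5    # secure behaviour is simply "the way things are done here"

def assess_culture(dimension_scores: Dict[str, CultureLevel]) -> CultureLevel:
    """Classify the overall culture by its weakest dimension rather than an
    average, since averaging can mask immaturity (cf. Filho et al., 2010).
    This weakest-link rule is an assumption of this sketch."""
    return min(dimension_scores.values())

# Hypothetical self-assessment across illustrative dimensions:
scores = {
    "leadership commitment": CultureLevel.PROACTIVE,
    "incident reporting": CultureLevel.REACTIVE,
    "training and knowledge": CultureLevel.COMPLIANT,
}
print(f"Overall cybersecurity culture level: {assess_culture(scores).name}")
```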

It is important to note that this progression assumes there is a desired culture (Resilient) against which progression can be measured. Culture is significantly more complex; however, a starting point against which managers can begin to take proactive steps may be of use. Ultimately, the risk tolerance of each organization is different; management teams will need to determine what level of cybersecurity culture is appropriate in their domain and balance the needs of their organization against the larger implications of a cybersecurity incident in their critical infrastructure domain.

Contribution

Reid and Van Niekerk (2014) previously investigated the concept of cybersecurity culture and were deliberate in avoiding a prescriptive definition, instead recommending that the definition of cybersecurity culture “should be defined to suit particular contexts.” To build on the call to action regarding the development of a culture of cybersecurity by Muegge and Craigen (2015), we investigated the construct of organizational culture in more depth and reviewed current applications of Schein’s (1985) model (information security culture and safety culture). As a result of our research, we propose a new definition of a culture of cybersecurity and put forward a draft cybersecurity culture maturity model for verification in critical infrastructure environments. This draft model is by no means comprehensive; it requires empirical validation. As such, we encourage readers to review the proposed framework and test it within their environments to determine whether it assesses their cybersecurity culture and assists them in determining the next steps required to improve their cybersecurity. The use of such a tool may indirectly provide insight into whether or not an organization is effective in meeting its requirements regarding safety and cybersecurity as a whole.

Conclusion

This article proposes a potential means for increasing cybersecurity in critical infrastructure environments, where all potential cybersecurity vulnerabilities cannot be predicted: the development of a cybersecurity culture. Through our research, we provide two contributions to theory. First, we provide an updated definition of cybersecurity culture that is appropriate for use across critical infrastructure sectors. Second, we have created a draft cybersecurity culture maturity model for industry review. We reviewed aspects of culture that are relevant to critical infrastructure, including organizational culture, information security culture, safety culture, and the associated frameworks and capability maturity models. After reviewing the key elements and benefits of each, we selected and updated the most explicit information security culture (Da Veiga & Eloff, 2010) and safety culture (Lee, 1996) definitions for use as a baseline definition of cybersecurity culture. Further, after reviewing the associated frameworks, we generated a draft modified maturity model suitable for industry professionals to begin testing their cybersecurity culture. This model represents a possible starting point for top management teams to evaluate their culture and associated best practices.

There are limitations to this paper; namely, the cybersecurity culture maturity model is a draft requiring verification. Furthermore, it would be worthwhile to expand the model to include an audit instrument, conduct audits using the instrument in critical infrastructure environments, and compare the results from multiple audits in order to identify trends that transect critical infrastructure fields. We encourage readers to verify the model in their environments and provide feedback, such that a useful tool for cybersecurity may be generated.

Biography

This working draft paper was co-written by Ariana Bacic.

References

Advisory Committee on the Safety of Nuclear Installations (ACSNI). 1993. Study Group on Human Factors, Third Report: Organising for Safety. HSMO, London.

Chang, S. E., & Lin, C. 2007. Exploring organizational culture for information security management. Industrial Management & Data Systems, 107(3): 438–458.

Chatman, J. A., & Cha, S. E. 2003. Leading by leveraging culture. California Management Review, 45(4): 20–34.

Craigen, D., Diakun-Thibault, N., & Purse, R. 2014. Defining Cybersecurity. Technology Innovation Management Review, 4(10): 13–21.

Da Veiga, A., & Eloff, J. H. P. 2010. A framework and assessment instrument for information security culture. Computers & Security, 29(2): 196–207.

Da Veiga, A., Martins, N., & Eloff, J. H. P. 2007. Information security culture – validation of an assessment instrument. Southern African Business Review, 11(1): 146–166.

Douba, N., Rütten, B., Scheidl, D., Soble, P., & Walsh, D’A. 2014. Safety in the Online World of the Future. Technology Innovation Management Review, 4(11): 41–48. http://timreview.ca/article/849.

Filho, A. P. G., Andrade, J. C. S., & Marinho, M. M. de O. 2010. A safety culture maturity model for petrochemical companies in Brazil. Safety Science, 48(5): 615–624.

Fleming, M. 2007. Developing safety culture measurement tools and techniques based on site audits rather than questionnaires. Saint Mary’s University. http://pr-ac.ca/files/files/133_FinalReport_12May07.pdf.

Foster, P., & Hoult, S. 2013. The Safety Journey: Using a Safety Maturity Model for Safety Planning and Assurance in the UK Coal Mining Industry. Minerals, 3: 59–72.

Gordon, R., Kirwan, B., & Perrin, E. 2007. Measuring safety culture in a research and development centre: A comparison of two methods in the Air Traffic Management domain. Safety Science, 45: 669–695.

Guldenmund, F. W. 2000. The nature of safety culture: a review of theory and research. Safety Science, 34(1–3): 215–257.

Haviland, W., Fedorak, S. A., Lee, R. B. 2009. Cultural Anthropology, Third Canadian Edition. Ontario, Canada: Nelson Education.

Helmreich, R. L., & Merritt, A. C. 2001. Culture at Work in Aviation and Medicine: National, Organizational and Professional Influences. Aldershot, UK: Ashgate.

Hudson, P. 2001. Aviation safety culture. Keynote address at Safeskies Conference: Canberra, November.

ISO/IEC 27002. 2013. Information technology — Security techniques — Code of practice for information security controls. https://www.iso.org/obp/ui/#iso:std:iso-iec:27002:ed-2:v1:en.

International Nuclear Safety Advisory Group (INSAG). 1991. Safety Culture, Safety Series No 75- INSAG-4, Vienna:  International Atomic Energy Authority.

International Telecommunication Union (ITU). 2008. Overview of Cybersecurity: Recommendation ITU-T X.1205. Geneva: International Telecommunication Union.

Karlsson, F., Åström, J., & Karlsson, M. 2015. Information security culture – state-of-the-art review between 2000 and 2013. Information and Computer Security, 23(3): 246–285

Keenan, V., Kerr, W., & Sherman, W. 1951. Psychological climate and accidents in an automotive plant. Journal of Applied Psychology, 35(2): 108-111.

Lee, T. R. 1996. Perceptions, attitudes and behaviour: The vital elements of a safety culture. Health and Safety, October: 1–15.

Leveson, N. 2013. Engineering a Safer World. Cambridge, MA: MIT Press.

Lundy, O., & Cowling, A. 1996. Strategic Human Resource Management. London: Routledge.

Martins, A., & Eloff, J. H. P. 2002. Assessing Information Security Culture. Proceedings of ISSA 2002, Muldersdrift, South Africa.

Miron, W., & Muita, K. 2014. Cybersecurity Capability Maturity Models for Providers of Critical Infrastructure. Technology Innovation Management Review, 4(10): 33-39.

Miron, W. 2015. Adoption of Cybersecurity Capability Maturity Models in Municipal Governments. Unpublished Masters Dissertation. Carleton University, Ottawa.

Muegge, S. & Craigen, D. 2015. A Design Science Approach to Constructing Critical Infrastructure and Communicating Cybersecurity Risks. Technology Innovation Management Review, 5(6): 6-16. 

O’Donovan, G. 2006. The Corporate Culture Handbook: How to Plan, Implement and Measure a Successful Culture Change. Dublin, Ireland: The Liffey Press.

Okere, I., Van Niekerk, J., & Carroll, M. 2012. Assessing Information Security Culture: A Critical Analysis of Current Approaches. Proceedings of Information Security for South Africa (ISSA) 2012: 1–8.

Parker, D., Lawrie, M., & Hudson, P. 2006. A framework for understanding the development of organisational safety culture. Safety Science, 44: 551–562.

Paulk, M. C., Curtis, B., Chrissis, M. B., & Weber, C. V. 1993. Capability maturity model, version 1.1. IEEE Software, 10(4): 18–27.

Ramachandran, S., Rao, C., Goles, T., & Dhillon, G. 2013. Variations in Information Security Cultures across Professions: A Qualitative Study. Communications of the Association for Information Systems, 33(11): 163–204.

Reason, J. T. 1997. Managing the Risks of Organizational Accidents. Aldershot, UK: Ashgate.

Reid, R., & Van Niekerk, J. 2014. Towards an Education Campaign for Fostering a Societal, Cyber Security Culture. Proceedings of the Eighth International Symposium on Human Aspects of International Security & Assurance (HAISA).

Schein, E. H.  1985. Organizational culture and leadership. San Francisco, CA: Jossey-Bass.

Schein, E. H. 1996. Three cultures of management: The key to organizational learning. MIT Sloan Management Review, 38(1): 9.

Thomson, K. & Von Solms, R. 2006. Towards an Information Security Competence Maturity Model. Computer Fraud & Security, 5: 11-15.

The University of Queensland. 2008. Minerals Industry Risk Management Maturity Chart. Brisbane, Australia: University of Queensland Minerals Industry Health & Safety Centre.

Van Niekerk, J. F., & Von Solms, R. 2010. Information security culture: A management perspective. Computers & Security, 29(4): 476–486.

Von Solms, R., & Van Niekerk, J. 2013. From information security to cyber security. Computers & Security, 38: 97-102.

Westrum, R. 1993. Cultures with requisite imagination. In J. A. Wise, V. D. Hopkin, & P. Stager (Eds.), Verification and Validation of Complex Systems: Human Factors Issues: 401–416. New York: Springer-Verlag.

Westrum, R. 2004. A typology of organisational cultures. Quality and Safety in Health Care, 13(Suppl II): ii22–ii27.

Whitman, M. E., & Mattord, H. J. 2009. Principles of Information Security (3rd ed.). Kennesaw State University: Thomson Course Technology.

Whitman, M.E., & Mattord, H.J. 2003. Principles of Information Security. Kennesaw State University: Thomson Course Technology.

Young, W., & Leveson, N. 2013. System Thinking for Safety and Security. Proceedings of the 2013 Annual Computer Security Applications Conference (ACSAC 2013).