Understanding The Threats To Information Security


ARTICLE IN PRESS
International Journal of Information Management 24 (2004) 43–57

In defense of the realm: understanding the threats to information security

Michael E. Whitman*
Computer Science and Information Systems Department, Kennesaw State University, 1000 Chastain Road MS 1101, Kennesaw, GA 30144, USA

Abstract

The popular press is replete with information about attacks on information systems. Viruses, worms, hackers, and employee abuse and misuse have created a dramatic need for understanding and implementing quality information security. To accomplish this, an organization must begin with the identification and prioritization of the threats it faces, as well as the vulnerabilities inherent in its systems and methods. This study seeks to identify and rank current threats to information security, and to present current perceptions of the level of severity these threats present. It also seeks to provide information on the frequency of attacks from these threats and the expenditure priorities organizations are setting in order to protect against them. The study then compares these findings with those of previous surveys. © 2004 Elsevier Ltd. All rights reserved.

1. Introduction

Routinely the press publishes dramatic reports of millions of dollars lost to computer theft, fraud, and abuse. Recent attacks on America's infrastructure have highlighted a dramatic need for security (Anonymous, 2001). The 2003 Computer Security Institute/Federal Bureau of Investigation (CSI/FBI) Computer Crime and Security Survey found that 90% of respondents (primarily large corporations and government agencies) detected computer security breaches within the last 12 months. The report documented that, among the 75 percent of respondents acknowledging financial losses due to computer breaches, total financial losses dropped from approximately $455.8 million in 2002 to approximately $201.8 million in 2003. Respondents citing their Internet connections as a frequent point of attack rose from 74% in 2002 to 78% in 2003 (Richardson, 2003).

*Tel.: +1-770-423-6005. E-mail address: mwhitman@kennesaw.edu (M.E. Whitman).
0268-4012/$ - see front matter © 2004 Elsevier Ltd. All rights reserved. doi:10.1016/j.ijinfomgt.2003.12.003
"Information security continues to be ignored by top managers, middle managers, and employees alike. The result of this neglect is that organizational systems are far less secure than they might otherwise be and that security breaches are far more frequent and damaging than is necessary" (Straub & Welke, 1998). Security is a careful balance between information safeguards and user access (McFadden, 1997). Unfortunately, advances in computer security have been unable to keep pace with advances in computing in general.

To represent a threat to information security, an individual must possess the skills, knowledge, resources, authority, and motive to commit computer crime (Parker, 1998, p. 16). Without these characteristics, an individual would normally not be a serious threat to information. However, in the modern threat environment, the prominence of automated attack tools has given rise to a generation of "script kiddies": individuals without extensive personal skills or knowledge who use these automated tools to attack systems. These tools are pervasive, found in the possession of legitimate "tiger teams", employees, and hackers alike (Boulanger, 1998).

To strengthen the security postures of organizational systems, systems administrators and managers must begin with an understanding of the threats facing those systems, and then must examine the vulnerabilities inherent in the systems subject to those threats. This study addresses the first part of this strategy by attempting to identify the dominant threats facing organizational information security, and by ranking those threats to allow organizations to direct priorities accordingly.

2. Previous studies

Most of the studies examined in related fields were categorized as studies of computer abuse or computer ethics.

2.1. Computer ethics

Several studies have examined national and international perspectives on computer ethics (Cheng, Sims, & Teegen, 1997; Harrington, 1995; Paradice & Dejoie, 1991; Whitman, Townsend, & Hendrickson, 1999; Whitman, Townsend, & Aalberts, 2000, among others). Most found that ethics were not as clearly defined as previously believed. The international studies reinforced the preconception that individuals from differing origins had somewhat different perspectives with regard to computer use ethics (e.g. Whitman et al., 1999; Whitman, Townsend, Hendrickson, & Fields, 1998). Because ethics are a product of one's environmental upbringing, organizations conducting business internationally should use caution in placing expectations on the ethical performance of expatriates and foreign nationals without first trying to understand their perspectives, and should educate them as to the organization's perspectives. Similarly, organizations operating outside the boundaries of the US and its territories must use caution in dealing with local customs and codes of ethics.

2.2. Computer abuse

Some studies have documented actual and potential system losses (Hoffer & Straub, 1989; Loch, Carr, & Warkentin, 1992; Parker, 1998; Straub & Welke, 1998, among others). Institutional sponsors of high-profile studies include the US Government (Colton, Tien, Tvedt, Dunn, & Barnett, 1982; Ernst & Whinney, 1990; Kusserow, 1983), Ernst & Young (Davis, 1997), PriceWaterhouseCoopers (Hulme, 2001), and the annual CSI/FBI Computer Crime and Security Survey (Richardson, 2003). Baskerville (1993) also notes studies of computer abuse (Bloombecker, 1990; Whiteside, 1978), computer viruses (Fites, Johnston, & Kartz, 1989; Hruska, 1990), and illegitimate computer hacking or cracking (Hafner & Markoff, 1991; Landreth, 1989; Stoll, 1989).

In 1990, Straub (1990) reported "prior studies, which include surveys of the victims of computer abuse (Ernst & Whinney, 1990; Kusserow, 1983; Wong, 1985), and conceptual studies on the effectiveness of security countermeasures (Madnick, 1978; Martin, 1973)". Straub (1990) also examined such questions as "has IS security been effective in lowering computer abuse through deterrents?" and "can rival explanations, including use of preventative security software, explain lower incidence of computer abuse?", finding that deterrence does in fact result in lower levels of computer abuse. In a similar study, Straub and Nance (1990) asked "How is computer abuse discovered in organizations?", "How are offenders identified?" and "How are identified computer abusers disciplined?" Their study found that computer abuse and offenders were discovered by accidental discovery, normal systems controls, and purposeful investigations (or detection). The study also found that only nine percent of all abuses were reported to external agencies. This rate of reporting was consistent with other studies of the time (August, 1983; Leinfuss, 1986). However, the 2002 CSI/FBI study finds a general increase in the reporting of computer intrusions to law enforcement over the last 7 years. In 1996, 16 percent were reported; 17 percent in 1997 and 1998; 32 percent in 1999; 25 percent in 2000; 36 percent in 2001; and only 34 percent in 2002 (responses to the study covered actions over the previous 12 months) (Richardson, 2003). As to why respondents did not report these incidents in the 2002 study, 90 percent feared negative publicity, 75 percent felt that competitors would use the information to their advantage, 54 percent were unaware that they could report the incidents, and 64 percent felt that civil remedies seemed best (Richardson, 2003).

Until very recently, IT executives identified security as an important issue in Information Systems (Brancheau & Wetherbe, 1987; Niederman, Brancheau, & Wetherbe, 1991, among others), but only once have they identified security as a top-10 issue (Straub & Welke, 1998). Similarly, IS executives reportedly dropped security as a top-20 issue in 1995, suggesting either that they felt sufficient controls were implemented or that they no longer considered it as significant as other issues (Zviran & Haga, 1999). The American Institute of Certified Public Accountants (AICPA), however, clearly identified Information Security and Controls as the number one Top Ten Technology issue for 2001 (Cryton & Tie, 2001).

3. Research methodology and objectives

In their study of threats to information security, Loch et al. (1992) identified and examined categories of threats to information security. This study seeks to reiterate the general questions first posed by Loch et al. (1992) and to extend their examination of threat categories. The primary research questions posed are: (1) What are the threats to information security?
Table 1
Threats to information security

  Act of human error or failure (accidents, employee mistakes)
  Compromises to intellectual property (piracy, copyright infringement)
  Deliberate acts of espionage or trespass (unauthorized access and/or data collection)
  Deliberate acts of information extortion (blackmail of information disclosure)
  Deliberate acts of sabotage or vandalism (destruction of systems or information)
  Deliberate acts of theft (illegal confiscation of equipment or information)
  Deliberate software attacks (viruses, worms, macros, denial of service)
  Forces of nature (fire, flood, earthquake, lightning)
  Quality of service deviations from service providers (power and WAN quality of service issues)
  Technical hardware failures or errors (equipment failure)
  Technical software failures or errors (bugs, code problems, unknown loopholes)
  Technological obsolescence (antiquated or outdated technologies)

(2) Which of these threats is the most serious? Reflecting the emphasis on quantifying the frequency and impact of threats evidenced in the popular CSI/FBI study (Richardson, 2003), the final two research questions of interest are: (3) How frequently (per month) are these threats observed? (4) Which threats require the highest expenditures?

To identify the threats to be assessed, the study began by reviewing previous works on information security. At the same time, three Chief Information Security Officers (CISOs) were interviewed, and the resulting interviews were scrutinized for examples of threats. This data-gathering process resulted in a list of 215 candidate threats. Three researchers knowledgeable in information security independently reviewed the resulting list of threats and consolidated the list into collections of threat categories.
A comparison of the researchers' lists and subsequent revision produced a consolidated list of threat categories, which was then reviewed for duplication, omissions, and superfluous entries. The resulting comments were incorporated into the survey instrument (see Table 1). Questions on security protection mechanisms, Internet usage, organizational changes, and basic organizational demographics were added. The survey was published as an online Web document, and a letter of invitation to participate in the study was mailed to 1000 top computing executives randomly selected from the Directory of Top Computing Executives, a method used in previous studies (Loch et al., 1992; Segars & Grover, 1999). The study produced 192 total usable responses, representing a response rate of 21.5%.

3.1. Method biases

A preliminary analysis of the results looked for indications of response, non-response, and common method biases. The response rate, while low, was expected considering the relatively sensitive nature of the subject. Organizations have long been reluctant to indicate that they may have vulnerabilities. An examination of non-response bias indicates that there are an inordinate number of responses from the educational sector and the government/military sector compared to the original sample. While this demands caution in the interpretation of the results and the generalizability of the findings, the study still provides valuable information on the types of threats found, the perceived seriousness of these threats, the frequency of their attacks, and the expenditures needed to address them. An examination of the responses on the basis of job titles, organizational size, and the firms' primary business revealed no significant differences.

Common method bias is a bias associated with single self-reported measures. Use of a single source of information could have introduced spurious relationships among the variables. The study variables were collected with the same method: a self-report scale. A Harmon's one-factor test (Podsakoff & Organ, 1986) was performed on the threats listings and the attacks listings, independently and combined. Should the factor analyses indicate a single factor, one could project that common method bias may be influencing the results (Cote & Buckley, 1987; Igbaria, Zinatelli, Cragg, & Cavaye, 1997). In each case, three or more factors were identified with eigenvalues greater than 1.0. As a result, common method bias was not suspected.

4. Findings

Respondents were predominantly IS/IT directors, managers, or supervisors (60 percent); executive IS managers (CIOs, CTOs, or Executive VPs) (23 percent); or Technology VPs (Corporate Management) (eight percent), as was the intent of the mailing. The remaining respondents were IS/IT staff (six percent) or other IS/IT management (three percent). Respondents represented a wide range of organizational sizes, with 21 percent representing organizations with more than 5000 employees; eight percent between 2501 and 5000; 17 percent between 1001 and 2500; 20 percent between 501 and 1000; 28 percent between 101 and 500; and six percent under 100 employees. Respondents by primary business are presented in Table 2.
Table 2
Responses by primary business

Firm's primary business                  Percentage (%)
Business/professional services                7.3
Education                                    28.1
Finance/banking/insurance                     4.2
Real estate/investing                         0.0
Government/military                          12.5
Health care/hospital/medical                  4.2
Hospitality/entertainment                     0.0
Legal                                         2.1
Manufacturing—computer                        0.0
Manufacturing—not computer                   17.7
Media/marketing/advertisement                 4.2
Processing: mining/oil/construction           4.2
Retail/wholesale                              2.1
Service/communications provider               1.0
Transportation/aerospace                      3.1
Utility                                       1.0
Other                                         7.3
No response                                   1.0
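The Harmon one-factor test mentioned in Section 3.1 is commonly run as a factor analysis over all self-report items, counting how many factors have eigenvalues above 1.0 (the Kaiser criterion). A minimal sketch of that check, using randomly generated stand-in ratings rather than the survey's actual data, might look like:

```python
import numpy as np

def harman_one_factor_count(ratings):
    """Count factors with eigenvalues > 1.0 in the correlation matrix of
    the self-report items (one column per item, one row per respondent).
    A single dominant factor would hint at common method bias."""
    corr = np.corrcoef(ratings, rowvar=False)   # item-by-item correlations
    eigenvalues = np.linalg.eigvalsh(corr)      # eigenvalues, ascending
    return int(np.sum(eigenvalues > 1.0))

# Stand-in data: 100 hypothetical respondents rating 12 threat items on 1-5
rng = np.random.default_rng(42)
ratings = rng.integers(1, 6, size=(100, 12)).astype(float)
n_factors = harman_one_factor_count(ratings)
```

Under this criterion, finding three or more such factors, as the study reports, argues against a single method factor driving the responses.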
Table 3
Threat protection mechanisms employed in respondents' organizations (multiple responses possible)

Protection mechanisms                    Yes (%)   No (%)
Use of passwords                          100        0
Media backup                               97.9      2.1
Employee education                         89.6     10.4
Consistent security policy                 62.5     37.5
Use internally developed software only      4.2     95.8
Virus protection software                  97.9      2.1
Audit procedures                           65.6     34.4
Encourage violations reporting             51.0     49.0
No internal Internet connections            6.3     93.8
Use shrink-wrap software only               9.4     90.6
No outside network connections              4.2     95.8
No outside dialup connections              10.4     89.6
No outside web connections                  2.1     97.9
Firewall                                   61.5     38.5
Host intrusion detection                   31.3     68.8
Network intrusion detection                33.3     66.7
Auto account logoff                        50.0     50.0
Publish formal standards                   43.8     56.3
Monitor computer usage                     45.8     54.2
Control of workstations                    40.6     59.4
Ethics training                            30.2     69.8

With most organizations providing some form of information via the Internet, there is potential exposure of information to crime, abuse, or misuse. Many of these organizations use the Internet to support internal operations, increasing the risk of substantial loss through unauthorized disclosure, modification, or destruction of information. An important aspect of information security in organizations is the employment of protection mechanisms as controls against unauthorized access and against accidental or intentional crime, misuse, and abuse. The most common protection mechanism employed is the use of username/password access controls, with 100 percent of respondents indicating usage. What this study does not indicate is the "strength" of these passwords, nor the diligence of the organizations in requiring password changes. The results are presented in Table 3.

4.1. Threats to information security

The central data of interest was the ranking of threats to information security.
This ranking was calculated in two stages. First, respondents were asked to rate each individual threat on a scale of 1–5, with 5 indicating an extremely significant threat and 1 indicating a non-significant threat. These means and standard deviations are presented in columns B and C of Table 4, respectively. Next, respondents were asked to identify and rank the top five threats. This is consistent with prior studies involving rating and ranking surveys in information security (Loch et al., 1992; Segars & Grover, 1999). These rankings were used to calculate a weighted ranking of the values, presented in column D of Table 4. Weights were calculated by assigning 5 points for a "top rank" vote, 4 points for a "second rank" vote, and so on, down to 1 point for a "fifth rank" vote. The results of the weighted rank are presented in Table 4.

Table 4
Weighted ranks of threats to information security

A. Threat                                                B. Mean  C. Std. dev  D. Weight  E. Weighted rank
1. Deliberate software attacks                            3.99     1.03         546        2178.3
2. Technical software failures or errors                  3.16     1.13         358        1129.9
3. Act of human error or failure                          3.15     1.11         350        1101.0
4. Deliberate acts of espionage or trespass               3.22     1.37         324        1043.6
5. Deliberate acts of sabotage or vandalism               3.15     1.37         306         962.6
6. Technical hardware failures or errors                  3.00     1.18         314         942.0
7. Deliberate acts of theft                               3.07     1.30         226         694.5
8. Forces of nature                                       2.80     1.09         218         610.9
9. Compromises to intellectual property                   2.72     1.21         182         494.8
10. Quality of service deviations from service providers  2.65     1.06         164         433.9
11. Technological obsolescence                            2.71     1.11         158         427.9
12. Deliberate acts of information extortion              2.45     1.42          92         225.2

When compared to the popular CSI/FBI study, the findings are similar. The top threat for over 8 years, virus attacks, and the fourth-ranked threat, denial of service, are synonymous with the deliberate software attacks category of this study. The second most prevalent threat in the 2003 CSI study, laptops, has no corresponding value in this study, except as deliberate acts of theft or as acts of human error or failure (failure to follow policy in systems use). The third most prevalent CSI threat in 2003, insider abuse of net access, is closely related to this study's third-ranked act of human error or failure, which encompasses both mistakes and intentional abuses of systems.
The fifth and sixth CSI threats combined are comparable to this study's deliberate acts of espionage or trespass (trespass within the confines of a system). The comparisons are similar throughout the studies.

When compared to the findings of the Loch et al. (1992) study presented in Table 5, respondents have reduced the overall recognition of threats emanating from natural disasters (forces of nature) to much lower in the listings. These types of threats are not even listed in the CSI/FBI study. The accidental acts of internal employees (acts of human error or failure) maintain a high priority as a valid threat. Problems with implemented systems controls as preventive measures have dropped in priority, perhaps due to a real or false sense of security as organizations begin to focus more resources on securing their systems. Threats associated with viruses and other deliberate software attacks have increased, whether due to the real danger these attacks represent or possibly due to increased media and vendor communications about the presence of such threats.

Table 5
Loch et al. threat rankings, weighted vote method (Loch et al., 1992)

Threats                                       Ranking   Weighted votes
Natural disasters                                1          324
Accidental entry of bad data by employees        2          270
Accidental destruction of data by employees      3          252
Weak/ineffective controls                        4          149
Entry of computer viruses                        5          128
Access to system by hackers                      6          123
Inadequate control over media                    7           96
Unauthorized access by employees                 8           93
Poor control of I/O                              9           67
Intentional destruction of data by employees    10           58
Intentional entry of bad data by employees      11           36
Access to system by competitors                 12           31
Other threats                                   13           10
Total                                                      1637

4.2. Attacks

Table 6 presents the numbers of attacks per month as observed by respondents.

Table 6
Numbers of attacks per month as reported by respondents

Threat                                                   None (%)  <10 (%)  10–50 (%)  51–100 (%)  >100 (%)  No answer (%)
1. Act of human error or failure                           24.0     41.7     14.6        2.1        5.2       12.5
2. Compromises to intellectual property                    61.5     25.0      3.1        2.1        1.0        7.3
3. Deliberate acts of espionage or trespass                68.8     20.8      3.1        3.1        4.2         —
4. Deliberate acts of information extortion                90.6      8.3      1.0         —          —          —
5. Deliberate acts of sabotage or vandalism                64.6     31.3      3.1        1.0         —          —
6. Deliberate acts of theft                                54.2     38.5      7.3         —          —          —
7. Deliberate software attacks                             16.7     47.9     14.6        9.4       11.5         —
8. Forces of nature                                        62.5     34.4      2.1        1.0         —          —
9. Quality of service deviations from service providers    46.9     43.8      8.3        1.0         —          —
10. Technical hardware failures or errors                  34.4     51.0     11.5        3.1         —          —
11. Technical software failures or errors                  30.2     45.8     18.8        5.2         —          —
12. Technological obsolescence                             60.4     21.9     15.6        1.0        1.0         —
Average responses                                          50.9     35.4      8.3        2.1        1.8        1.5

Of particular interest is the presence of deliberate acts of information extortion, representing the first reported incidents of this unique threat. Although the threat itself was ranked only 12th, the acknowledgment of its occurrence indicates that it is a threat to be taken seriously. Of particular note is the large number of "attacks" by human error or failure.
As is evident, employee mistakes and failures to follow policy represent a dominant threat, requiring the implementation of controls to reduce the frequency and severity of such attacks. Unfortunately, missing, inadequate, or incomplete controls and organizational policy or planning both hamper these efforts. Equally noteworthy is the frequency of attacks associated with technical failures or errors of hardware and software, as well as the problems represented by quality of service deviations from service providers. Service providers of both network communications and power can have a dramatic effect on the availability and security of information.

The final set of questions examined the respondents' expenditure priorities. Each respondent was asked to rank the threats presented earlier as to the top five threat-driven expenditures. Again a weighted ranking was created, using 5 points for a first-place vote, and so forth. The results are presented in Table 7. With the exception of the top expense, addressing deliberate software attacks, the prominent expenditures attempted to address unintentional threats. Respondents indicated that human mistakes, technical problems, and service-provider problems drove the dominant expenditures.

Table 7
Weighted ranking of top threat-driven expenditures

Top threat-driven expenses                              Weight
1. Deliberate software attacks                            560
2. Act of human error or failure                          334
3. Technical software failures or errors                  306
4. Technical hardware failures or errors                  264
5. Quality of service deviations from service providers   216
6. Deliberate acts of espionage or trespass               208
7. Deliberate acts of theft                               182
8. Deliberate acts of sabotage or vandalism               176
9. Technological obsolescence                             144
10. Forces of nature                                      134
11. Compromises to intellectual property                   96
12. Deliberate acts of information extortion               44

5. Conclusions

As the evidence supports, the threats to information security are real. The results of this study, as supported by the CSI/FBI annual Computer Crime and Security Survey (Richardson, 2003), clearly illustrate the need for increased levels of awareness, education, and policy in information security.
These findings also support those of Loch et al. (1992), who state "respondents seemed well aware of the threats … but viewed their neighbors to be at more risk than they". The findings also paralleled theirs with regard to the rankings of threats, with minor variations, indicating that the same threats that faced senior IS managers over 10 years ago are still prevalent today. It is of the utmost importance that IS administration address the internal and external, intentional and unintentional acts that constitute these threats. Administration must address these acts through (a) policy, (b) security mechanisms (controls), and (c) education, training, and awareness programs.
5.1. The need for policy

Security advocates emphasize that any security profile must begin with a valid security policy (Bergeron & Berube, 1990; Clyde, 1999; Whitman et al., 1999; Wood, 1999, among others). "If you ask a computer security professional what the single most important thing you can do to protect your network is, they will unhesitatingly say that it is to write a good security policy" (Wadlow, 2000). This policy is then translated into action through an effective security plan focusing on the prevention, detection, and correction of threats (Wilson, Turban, & Zviran, 1992). According to the Cutter Consortium (Cutter Consortium, 1999), in 1999 over 20% of responding businesses had no security policy, and almost 14% had no plans to develop one in the near future. A recent study identified implementation of security policy as the number one recommended action an organization can and should take to safeguard its systems (Sword & Shield, 2001). The security policy is also perhaps the least expensive piece of the security strategy in terms of real costs, as it simply requires time and effort to draft, implement, and maintain.

In general, a good security policy should outline individual responsibilities, define authorized and unauthorized uses of the systems, provide venues for employee reporting of identified or suspected threats to the system, define penalties for violations, and provide a mechanism for updating the policy. Good policies are specific to an organization and its systems, but contain many commonalities. The base security policy for an organization serves as an umbrella for subordinate policies, such as fair and responsible use policies for the Internet and organizational computing equipment, password policies, and other guidelines and standards used to enforce certain actions and activities among employees.
"Policies are management instructions indicating a course of action, a guiding principle, or an appropriate procedure which is expedient, prudent or advantageous. Policies are high-level statements that provide guidance to workers …. Policies are mandatory and can be thought of as the equivalent of an organization-specific law" (Wood, 1999). As such, policies are viewed as the starting point for information security, as they define the tone and posture management wishes to set for the overall security profile of an organization.

5.2. The need for awareness

Once the policy is developed, it must be disseminated, understood, and agreed to (Whitman et al., 2000). There is an obvious need for increased awareness of the threats to information security, not only among security and systems administrators but also among the users of information in organizations. Some awareness is promoted through professional organizations such as the ACM (www.acm.org) through formal codes of ethics that members are expected to follow (Harrington, 1996). Unfortunately, end users are unlikely to be involved in these organizations, requiring IS administration to provide its own distributed awareness program to increase users' cognizance of the threats and penalties associated with information use. The rapid rise of threats from viruses, worms, and the like has illustrated the need for increased awareness by users. Many organizations ravaged by these e-mail attachments indicated that e-mail usage policies were present; however, employees either failed to read and follow these policies or the related anti-virus policies (Hulme, 2001). Policies are obviously ineffective if not implemented and enforced.
5.3. The need for information security education

Another recognized need is to combat the threats to information security through education (Chin, Irvine, & Frinke, 1997). Education is needed both to prepare future employees to work in a secure and ethical computing environment and to prepare technology students to recognize the threats and vulnerabilities present in existing systems. An educational system that cultivates an appropriate knowledge of computer security will increase the likelihood that the next generation of IT workers will have the background needed to design and develop systems that are engineered to be reliable and secure (Irvine et al., 1998). The need is so great that the National Security Agency established the Centers of Academic Excellence in Information Assurance (IA), an outreach program designed and operated by the National Security Agency in the spirit of Presidential Decision Directive 63, the Policy on Critical Infrastructure Protection, May 1998 (NSA, 2003). The program's goal is to reduce vulnerabilities in the National Information Infrastructure by promoting higher education in IA and producing a growing number of professionals with IA expertise in various disciplines.

A number of universities claim to provide coursework in information security. However, a study of public academic institutions found only 63 universities (Rubin, 2001) offering approximately 104 courses in computer or information security. Of these courses, almost 62% were actually courses in cryptography, cryptology, and/or mathematics. The need for security coursework is evident. Existing published curriculum models are frequently designed for graduate courses (Irvine, 1996; Irvine et al., 1998), for computer science and engineering-specific programs (Chin et al., 1997; Vaughn & Bogess, 1999), or as pure training programs (ASIS, 2001; NIST, 1989, 1998).
Part of the security education needed falls in the arena of education and training of programmers in the development of secure systems, beginning with the analysis phase. It is this requirement to integrate security into the systems development life cycle that offers the first opportunity to introduce security into systems (Baskerville, 1993; Hayam & Oz, 1993).

5.4. Deterrence of computer crime and abuse

Experts report that deterrence constraints, controls put in place to provide disincentives, have traditionally been based on the "certainty of a sanction and the severity of those sanctions. Certainty of sanction, moreover, is comprised of two risk components: probability of detection (apprehension) and probability of punishment given detection" (Blumstein, 1978; Straub, Carlson, & Jones, 1993). With regard to the threats to information security, deterrence as a control best applies to intentional acts designed to disrupt computer operations or to access and acquire information without authorization. It does little to deter accidental acts, which are better addressed by increased employee training and awareness. It also serves to deter only those acts by weakly motivated individuals who contemplate illegal or unethical behavior (Leming, 1978; Straub et al., 1993; Straub & Widom, 1984), or who are ignorant of the ethical standards (Hoffer & Straub, 1989). The result is that deterrence is most effective when the subject has (1) a realistic expectation of apprehension, (2) a fear of the corresponding sanctions, and (3) a reasonable expectation of the implementation of those sanctions. Of those who represent active threats to information security (excluding acts of nature and interruptions from service providers), those external to the organization may have no expectation of apprehension, while those internal to the organization may have neither a fear of the sanctions nor a reasonable expectation of the implementation of those sanctions.

Straub et al. (1993) also advocate pairing the implementation of strong deterrence measures with the rewarding of positive behavior. In any case, when deterrence measures fail, the only remaining option is the hardening of systems against these threats, through countermeasures known as "preventatives, consisting of measures such as physical security as well as software security" (Gopal & Sanders, 1997; Straub, 1990). As indicated earlier, this study found that deterrents do in fact help reduce levels of abuse, and that the deterrents and preventatives that were most effective included using multiple methods to disseminate information about penalties and acceptable systems usage, statements of penalties for violations, and use of security software. Caution is advised, however, as at least one study has found that the addition of preventive controls can impede business activity, as software with excessive preventive measures can make applications burdensome and unwieldy to use (Gopal & Sanders, 1997).

5.5. Limitations of the study

Of obvious concern is the low response rate associated with this study. This was actually projected, based on the sensitive nature of the subject matter. Organizational technology executives equate acknowledgment of security weaknesses with acknowledgment of failure of professional responsibility. However, given the incredible pace at which faults are identified in modern operating systems and hardware components, and the increased prevalence of direct attacks on organizational systems, admission of vulnerability is more an acceptance of a growing problem than an admission of failure. A related shortcoming of this study is the dominant percentage of public sector respondents relative to private sector respondents.
While the mailing was designed to be random, aimed at potential respondents across both public and private sectors, proportionally fewer respondents from industries other than education and government/military-related organizations responded. While unexpected, this is certainly understandable. Employees and executives in the public sector are inherently more open about their operations, problems, and functions. This could be a result of freedom of information acts, state and local open-records requirements, or a number of other mandates. Agencies that operate on externally funded budgets, rather than internal profits, are less concerned with shaking public trust in their systems and more concerned with improving those systems through open dialog and the interchange of ideas. While purely supposition, this seems to fit the profile of public-versus-private priorities.

5.6. Summation

The threat is real, the stakes are high, and the systems protecting the target information are difficult to defend. As Loch et al. reiterate, "results suggest that management needs to (1) become more informed of the potential for security breaches … (2) increase their awareness in key areas … and (3) recognize that their overall level of concern for security may underestimate the potential risk inherent in the highly connected environment in which they operate" (Loch et al., 1992). The best efforts of well-meaning systems and security professionals could be in vain if they fail to heed the prophetic words of the famous Chinese general Sun Tzu: "Know the enemy and know yourself; in a hundred battles you will never be in peril. When you are ignorant of the enemy
but know yourself, your chances of winning or losing are equal. If ignorant both of your enemy and yourself, you are certain in every battle to be in peril" (Tzu, 1988).

References

Anonymous (2001). Chinese hackers call truce in China-US cyberwar. USA Today. WWW document, accessed 1/10/2004, http://www.usatoday.com/tech/news/2001-05-09-chinese-hackers-truce.htm.
ASIS (2001). Professional development. American Society for Industrial Security. WWW document, accessed 09/12/2001, http://www.asisonline.org/education/index/xml.
August, R. S. (1983). Turning the computer into a criminal. Barrister, 13.
Baskerville, R. (1993). Information systems security design methods: Implications for information systems development. ACM Computing Surveys, 25(4), 375–414.
Bergeron, F., & Berube, C. (1990). End users talk computer policy. Journal of Systems Management, 41(12), 14–32.
Bloombecker, B. (1990). Spectacular computer crimes: What they are and how they cost American business half a billion dollars a year. Homewood, IL: Dow Jones-Irwin.
Blumstein, A. (1978). Introduction. In A. Blumstein, J. Cohen, & D. Nagin (Eds.), Deterrence and incapacitation: Estimating the effects of criminal sanctions on crime rates. Report of the Panel on Deterrence and Incapacitation. Washington, DC: National Academy of Sciences.
Boulanger, A. (1998). Catapults and grappling hooks: The tools and techniques of information warfare. IBM Systems Journal, 37(1), 106–114.
Brancheau, J., & Wetherbe, J. C. (1987). Key issues in information systems: 1986. MIS Quarterly, 11(1), 23–45.
Cheng, H. K., Sims, R., & Teegen, H. (1997). To purchase or to pirate software: An empirical study. Journal of Management Information Systems, 13(4), 49–60.
Chin, S.-K., Irvine, C., & Frincke, D. (1997). An information security education initiative for engineering and computer science.
Naval Postgraduate School Technical Report NPSCS-97-003. Monterey, CA: Naval Postgraduate School.
Clyde, R. A. (1999). Enterprise security—built on sound policies. Enterprise Systems Journal, December, 69–72.
Colton, K. W., Tien, J. M., Tvedt, S., Dunn, B., & Barnett, A. R. (1982). Electronic fund transfer systems and crime. Washington, DC: US Department of Justice, Bureau of Justice Statistics.
Cote, J., & Buckley, M. R. (1987). Estimating trait, method, and error variance: Generalizing across 70 construct validation studies. Journal of Marketing Research, 24, 315–318.
Cryton, S. H., & Tie, R. (2001). A CPA's guide to the top issues in technology. Journal of Accountancy, 191(5), 71–77.
Cutter Consortium (1999). Nearly 20% of companies have no formal IT security policy or standard. Cutter Consortium. WWW document, accessed 06/12/2003, http://www.cutter.com/press/990907.html.
Davis, B. (1997). Security survey: Is it safe? Information Week.
Ernst & Whinney (1990). Ernst & Whinney 1989 computer security survey report. Pamphlet. Ernst & Young, 1400 Pillsbury Center, Minneapolis, MN 55402.
Fites, P., Johnston, P., & Kratz, M. (1989). The computer virus crisis. New York, NY: Van Nostrand Reinhold.
Gopal, R. D., & Sanders, G. L. (1997). Preventive and deterrent controls for software piracy. Journal of Management Information Systems, 13(4), 29–47.
Hafner, K., & Markoff, J. (1991). Cyberpunk: Outlaws and hackers on the computer frontier. New York, NY: Simon and Schuster.
Harrington, S. J. (1995). Computer crime and abuse by IS employees: Something to worry about? Journal of Systems Management, 46(2), 6–11.
Harrington, S. J. (1996). The effects of codes of ethics and personal denial of responsibility on computer abuse judgments and intentions. MIS Quarterly, 20(3), 257–278.
Hayam, A., & Oz, E. (1993). Integrating data security into the systems development life cycle. Journal of Systems Management, 44(8), 16–20.
Hoffer, J. A., & Straub Jr., D. W. (1989).
The 9 to 5 underground: Are you policing computer crimes? Sloan Management Review, 30(4), 35–42.
Hruska, J. (1990). Computer viruses and anti-virus warfare. New York, NY: Ellis Horwood.
Hulme, G. V. (2001). Security policies: How much is enough? Information Week.
Igbaria, M., Zinatelli, N., Cragg, P., & Cavaye, A. (1997). Personal computing acceptance factors in small firms: A structural equation model. MIS Quarterly, 21(2), 279–305.
Irvine, C. E. (1996). Goals for computer security education. Proceedings of the IEEE Symposium on Security and Privacy, May, 24–25.
Irvine, C. E., Chin, S.-K., & Frincke, D. (1998). Integrating security into the curriculum. IEEE Computer, December, 25–30.
Kusserow, R. P. (1983). Computer-related fraud and abuse in government agencies. Washington, DC: US Department of Health and Human Services.
Landreth, B. (1989). Out of the inner circle: The true story of a computer intruder capable of cracking the nation's most secure computer systems. Redmond, WA: Tempus.
Leinfuss, E. (1986). Computer crime: How deep does it go? MIS Week, 41(7).
Leming, J. S. (1978). Cheating behavior, situational influence, and moral development. Journal of Educational Research, 71, 214–217.
Loch, K. D., Carr, H. H., & Warkentin, M. E. (1992). Threats to information systems: Today's reality, yesterday's understanding. MIS Quarterly, 16(2), 173–186.
Madnick, S. (1978). Management policies and procedures needed for effective computer security. Sloan Management Review, Fall, 61–74.
Martin, J. (1973). Security, accuracy, and privacy in computer systems. Englewood Cliffs, NJ: Prentice Hall.
McFadden, P. J. (1997). Guarding computer data. Journal of Accountancy, 184(1), 77–79.
Niederman, F., Brancheau, J. C., & Wetherbe, J. C. (1991). Information systems management issues for the 1990s. MIS Quarterly, 15(4), 475–495.
NIST (1989). Computer security training guidelines (Special Publication 500-172). National Institute of Standards and Technology Computer Security Resource Center.
NIST (1998).
Information technology security training requirements: A role- and performance-based model (Special Publication 800-16). National Institute of Standards and Technology Computer Security Resource Center.
NSA (2003). Centers of Academic Excellence in Information Assurance Education. WWW document, accessed 04/10/2003, http://www.nsa.gov/isso/programs/coeiae/index.htm.
Paradice, D. B., & Dejoie, R. M. (1991). The ethical decision-making processes of information systems workers. Journal of Business Ethics, 10(1), 1–22.
Parker, D. B. (1998). Fighting computer crime. New York, NY: Wiley.
Podsakoff, P. M., & Organ, D. W. (1986). Self-reports in organizational research: Problems and prospects. Journal of Management, 12(4), 531–544.
Richardson, R. (2003). 2003 CSI/FBI computer crime and security survey. WWW document, accessed 10/15/2003, http://www.gocsi.com.
Rubin, A. (2001). List of crypto and security courses by state. WWW document, viewed 2/2/2002, http://avirubin.com/courses.html.
Segars, A. H., & Grover, V. (1999). Profiles of strategic information systems planning. Information Systems Research, 10(3), 199–232.
Stoll, C. (1989). The cuckoo's egg: Tracking a spy through the maze of computer espionage. New York, NY: Doubleday.
Straub Jr., D. W. (1990). Effective IS security: An empirical study. Information Systems Research, 1(3), 255–276.
Straub Jr., D. W., Carlson, P. J., & Jones, E. H. (1993). Deterring cheating by student programmers: A field experiment in computer security. Journal of Management Systems, 5(1), 33–48.
Straub Jr., D. W., & Nance, W. D. (1990). Discovering and disciplining computer abuse in organizations. MIS Quarterly, 14(1), 45–60.
Straub Jr., D. W., & Welke, R. J. (1998). Coping with systems risk: Security planning models for management decision making. MIS Quarterly, 22(4), 441–469.
Straub Jr., D. W., & Widom, C. S. (1984). Deviancy by bits and bytes: Computer abusers and control measures. In J. H. Finch & E. G. Dougall (Eds.), Computer security: A global challenge (pp. 91–102). Amsterdam: Elsevier Science Publishers B.V. (North-Holland).
Sword & Shield (2001). Top 10 list of proactive security measures. Sword & Shield Security Consultants. WWW document, accessed 09/15/2001, http://www.sses.net/top10.html.
Tzu, S. (1988). The art of war (S. B. Griffith, Trans.). Oxford: Oxford University Press.
Vaughn, R. B., & Boggess III, J. E. (1999). Integration of computer security into software engineering and computer science programs. The Journal of Systems and Software, 49(2–3), 149–153.
Wadlow, T. A. (2000). The process of network security. Reading, MA: Addison-Wesley.
Whiteside, T. (1978). Computer capers: Tales of electronic thievery, embezzlement, and fraud. Toronto: Fitzhenry and Whiteside.
Whitman, M. E., Townsend, A. M., & Aalberts, R. J. (2000). Information systems security and the need for policy. In G. Dhillon (Ed.), Information security management: Global challenges in the next millennium (pp. 9–18). Hershey, PA: Idea Group Publishing.
Whitman, M. E., Townsend, A. M., & Hendrickson, A. R. (1999). Cross-national differences in computer-use ethics: A nine-country study. The Journal of International Business Studies, 30(4), 673–687.
Whitman, M. E., Townsend, A. M., Hendrickson, A. R., & Fields, D. A. (1998). An examination of cross-national differences in computer-related ethical decision making. ACM SIGCAS Computers and Society, 28(4), 22–27.
Wilson, J. L., Turban, E., & Zviran, M. (1992). Information systems security: A managerial perspective. International Journal of Information Management, 12(2), 105–119.
Wong, K. (1985). Computer crime: Risk management and computer security. Computers and Security, 4, 287–295.
Wood, C. C. (1999). Information security policies made easy. Sausalito, CA: Baseline Software Inc.
Zviran, M., & Haga, W. J. (1999). Password security: An empirical study. Journal of Management Information Systems, 15(4), 161–185.

Michael E.
Whitman, CISSP, is an Associate Professor of Information Systems in the Computer Science and Information Systems Department at Kennesaw State University, Georgia. He earned his doctorate in 1991 from Auburn University and his Certified Information Systems Security Professional certification in 2002. He is currently the Director of the Master of Science in Information Systems program and the founding Director of the KSU Center for Information Security Education and Awareness. He is an active teacher and researcher in the areas of information security, policy development, and the ethical use of computing, with over 60 publications in journals such as Information Systems Research, Communications of the ACM, Information & Management, and the Journal of Computer Information Systems. He has authored two texts and a laboratory manual in information security, available through Course Technology/Thomson Publishing.
