
Increasing Sophistication - The Cyberpsychology of Online Fraud and Phishing



The cybersecurity environment is becoming increasingly aggressive, with Cybercrime as a Service blurring the distinction between Advanced Persistent Threats and minor criminality. Financial institutions need to understand the human factors of the online environments in which both they and their consumers operate. Cybercriminals are growing in sophistication and intelligence, so in order to protect the public, we must understand and appreciate the psychology of the victims of fraud and phishing.

Published in: Economy & Finance


  1. 1. the cyberpsychology of online fraud Dr Ciarán Mc Mahon Central Bank, AMLD Away Day November 6th, 2015
  2. 2. Introduction • Emerging trends in cybercrime • Architecture of compromise • Victims of online fraud • Psychology of cyberspace • Cybercrime targeting financial institutions
  3. 3. Emerging trends in cybercrime • Advanced persistent threats – you are already hacked • Cybercrime as a Service – everyone can be a hacker now • Low-hanging fruit – easier to steal a lamb than a sheep • Blackmail – information is the new money
  4. 4. Europol iOCTA Report 2015 • Cybercrime – remains a growth industry – becoming more aggressive and confrontational – an extremely diverse range of criminality – blurring of the lines between Advanced Persistent Threat (APT) groups and profit-driven cybercriminals
  5. 5. Europol iOCTA Report 2015 • CaaS – Cybercrime as a Service – grants easy access to criminal products and services, and enables a broad base of unskilled, entry-level cybercriminals to launch attacks of a scale and scope disproportionate to their technical capability and asymmetric in terms of risks, costs and profits.
  6. 6. The Hidden Data Economy • McAfee – ‘The Hidden Data Economy’ report • “Software-generated” is a valid combination of a primary account number (PAN), an expiration date, and a CVV2 number that has been generated by software. Sellers refer to a valid number combination as a “Random.” Valid credit card number generators can be purchased or found for free online. • “Fullzinfo” means the seller supplies all of the details about the card and its owner, such as full name, billing address, payment card number, expiration date, PIN, social security number, mother’s maiden name, date of birth, and CVV2.
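“Software-generated” card numbers pass basic validity checks because a well-formed PAN satisfies the Luhn checksum, which is exactly what free number generators produce. A minimal sketch of that check (the function name is mine; 4111111111111111 is the well-known Visa test PAN, not a live card):

```python
def luhn_valid(pan: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    digits = [int(ch) for ch in pan if ch.isdigit()]
    if not digits or len(digits) != len(pan):
        return False  # reject empty strings and non-digit characters
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:       # double every second digit from the right
            d *= 2
            if d > 9:        # a doubled digit above 9 contributes its digit sum
                d -= 9
        total += d
    return total % 10 == 0   # valid PANs sum to a multiple of 10
```

Note that Luhn only validates the PAN itself; the “valid combinations” sold underground pair such a number with a plausible expiry and CVV2, which is why issuer-side checks matter.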
  7. 7. Recent EU breaches • Data is a key commodity in the digital underground and almost any type of data is of value to someone, whether it can be used for the furtherance of fraud or for immediate financial gain. (Europol iOCTA)
  8. 8. Europol iOCTA Report 2015 “While it is possible for organisations to invest in technological means to protect themselves, the human element will always remain as an unpredictable variable and a potential vulnerability. As such social engineering is a common and effective tool used for anything from complex multi-stage attacks to fraud. “
  9. 9. Sources • PwC, The Global State of Information Security Survey 2015 • Information Age • Grand Forks Herald • Databarracks Data Health Check • Clearswift • CIO
  10. 10. Cyberpsychology is an emerging discipline which involves the study of the human mind and behaviour in the context of information communication technology. It represents an incredibly valuable source of insight into information security behaviour. Photo from Project Apollo Archive
  11. 11. • Presence • The internet is designed to make communication effortless, so we should feel totally immersed in it. • A major goal for all ICT engineers is to ensure that users of their technology are totally unaware of all of the computations and calculations that are going on behind the scenes (Lombard & Ditton, 1997). • Users act as if ICT is invisible – “for mediated exchange to work as interpersonal communication, there must be tacit agreement that the participants will proceed as though they are communicating face to face” (Cathcart & Gumpert, 1986, p. 116) • Cathcart, R., & Gumpert, G. (1986). The person-computer interaction: A unique source. In B. D. Ruben (Ed.), Information and behavior (Vol. 1) (pp. 113–124). New Brunswick, NJ: Transaction Publishers. • Lombard, M., & Ditton, T. (1997). At the heart of it all: The concept of presence. Journal of Computer-Mediated Communication, 3(2), 1–23.
  12. 12. • Lurking • Anywhere up to 90% of the visitors to any online forum will read everything, will be invisible, and will not participate to any meaningful or noticeable degree (Nonnecke, East, & Preece, 2001). • Consequently, when an employee is online, they may well assume that the only people present are the ones they can see talking to them. This is where insider threats slip up – they don’t think anyone can see them. • Nonnecke, B., East, K. S., & Preece, J. (2001). Why lurkers lurk. In Americas Conference on Information Systems (pp. 1–10).
  13. 13. • Self-disclosure • When online, people are more likely to reveal personal information. • People tend to reveal most personal information online when they are in certain conditions (Joinson, 2001), namely heightened private self-awareness and reduced public self-awareness. • In other words, when someone is focussing on themselves, their person and body, and feels anonymous and unseen, they are likely to reveal information about themselves that they would not in a face-to-face context. • Self-disclosure of this kind is likely a critical factor in cyberbullying – it’s also a pretty useful tool in honeypot operations. • Joinson, A. N. (2001). Self-disclosure in computer-mediated communication: The role of self-awareness and visual anonymity. European Journal of Social Psychology, 31, 177–192.
  14. 14. • Online disinhibition • When online, people loosen up, feel less restrained, and express themselves more openly • “Everyday users on the Internet—as well as clinicians and researchers—have noted how people say and do things in cyberspace that they wouldn’t ordinarily say and do in the face-to-face world. They loosen up, feel less restrained, and express themselves more openly. So pervasive is the phenomenon that a term has surfaced for it: the online disinhibition effect.” (Suler, 2004, p. 321) • Suler, J. (2004). The online disinhibition effect. CyberPsychology & Behavior, 7(3), 321–326.
  15. 15. • Minimisation of status and authority • In the traditional philosophy of the internet there is no centralised control, everyone is equal, and its only purpose is sharing ideas • “While online a person’s status in the face-to-face world may not be known to others and may not have as much impact. Authority figures express their status and power in their dress, body language, and in the trappings of their environmental settings. The absence of those cues in the text environments of cyberspace reduces the impact of their authority.” (Suler, 2004, p. 324) • Suler, J. (2004). The online disinhibition effect. CyberPsychology & Behavior, 7(3), 321–326.
  16. 16. • Traditionally, society is built on a close relationship between authoritative texts and authority figures • “Knowledge linked to power not only assumes the authority of ‘the truth’ but has the power to make itself true. All knowledge, once applied in the real world, has effects, and in that sense at least, ‘becomes true.’ Knowledge, once used to regulate the conduct of others, entails constraint, regulation and the disciplining of practice.” (Foucault, 1977, p. 27) • Foucault, M. (1977). Discipline and punish. London: Tavistock.
  17. 17. • Web 2.0 has the power to radically change these knowledge and power relationships – “Wikipedia provokes divisive debates precisely because academics realise that Web 2.0 has the potential to radically transform pedagogic and research practices in higher education – and hence irrevocably change traditional academic power and authority arrangements.” Eijkman (2010, p. 182) • Eijkman, H. (2010). Academics and Wikipedia: Reframing Web 2.0 as a disruptor of traditional academic power-knowledge arrangements. Campus-Wide Information Systems. • Photo from the Opte Project
  18. 18. • How do leaderless networks work? Quote from a book on direct action, about the Occupy Wall Street movement: – “Before long, people were organizing them everywhere. Someone came up with the theory that the result was a kind of global brain: the interconnections of communication are such that you can imagine people not just communicating but acting, and acting damn effectively, without leadership, a secretariat, without even formal information channels. It’s a little like ants meeting in an ant-heap, all waving their antennae at each other, and information just gets around – even though there’s no chain of command or even hierarchical information structure. Of course it would be impossible without the Internet.” (Graeber, 2009) • Graeber, D. (2009). Direct Action: An Ethnography. Oakland, CA: AK Press
  19. 19. • OPM hack: What criminal hackers can do with your personal data (video) • OPM is offering potentially affected individuals: • credit report access, • credit monitoring, and • identity theft insurance and recovery services • As of yesterday, fewer than a quarter of the 21m affected had been notified
  20. 20. Offensive Counterintelligence
  21. 21. Value of a hacked email account • Phishing: 23% of recipients will open the message, and a further 11% will open the attachment (Verizon, 2015 Data Breach Investigations Report)
  22. 22. Attack lifecycle • From Mandiant’s APT1 report
  23. 23. Would you click?
  25. 25. Victims of Phishing • Jagatic, Johnson, Jakobsson, & Menczer (2007)
  26. 26. Victims of Phishing • Rocha Flores, Holm, Nohlberg & Ekstedt (2015) – Resistance to phishing: • Intention to resist social engineering • general information security awareness • formal IS training • computer experience
  27. 27. Victims of Phishing • Alsharnouby, Alaca, & Chiasson (2015) – eye-tracking study of how users assess the legitimacy of websites • users successfully detected only 53% of phishing websites – even when primed to identify them • generally spend very little time gazing at security indicators • general technical proficiency does not correlate with improved detection scores.
  28. 28. Victims of fraud • van Wilsem (2011) – large-scale victimization survey data among the Dutch general population (N = 6,201) – those with low self-control run substantially higher victimization risk – as well as active online shoppers and people participating in online forums.
  29. 29. Victims of fraud • Button, Nicholls, Kerr, & Owen (2014) • in-depth interviews & focus groups with online fraud victims; reasons for falling victim include: – the diversity of frauds – small amounts of money sought – authority and legitimacy displayed by scammers – visceral appeals – embarrassing frauds – pressure and coercion – grooming – fraud at a distance and multiple techniques
  30. 30. Victims of fraud • Cross (2013) – discourse surrounding online fraud is heavily premised on idea that victims are both greedy and gullible – need to examine discourse on ‘victim blaming’ in online fraud – current discourse does not take into account the level of deception and the targeting of vulnerability that is employed by the offender in perpetrating this type of crime
  31. 31. Victims of fraud • Cross (2015) – interviews with 85 seniors across Queensland, Australia, who received fraudulent emails, – victim-blaming discourse as an overwhelmingly powerful and controlling discourse about online fraud victimization. – humour reinforces this discourse by isolating victims and impacting on their ability to disclose to those around them. – Identifying and challenging this victim-blaming discourse, as well as the role of humour and its social acceptance, is a first step in the facilitation of victim recovery and future well-being.
  32. 32. Victims of fraud • Cross (2015) How to tackle cyber crime before people even know they’re a victim (The Conversation) – Project Sunbird
  33. 33. Project Sunbird • Identification – police identify people who are sending money to five known high-risk countries • Intervention – the Department of Commerce sends a letter to each person, notifying them that they may be victims of fraud • Interruption – stoppage of payments and funds • Intelligence – gathered from letter recipients by both agencies • Investigation – police pursue local and overseas offenders
  34. 34. Cybercrime targeting banks • CEO fraud – ‘business e-mail compromise’ or “Fake President” • Bank malware • Ransomware & extortion • DDoS
  35. 35. CEO fraud • FBI alert, August 2015 – covering October 2013 to August 2015 – Combined victims (US & non-US): 8,179 – Combined exposed dollar loss: $798,897,959.25 – transfers reported to 72 countries – majority of transfers to Asian banks located within China and Hong Kong
  36. 36. CEO fraud - Scenario 1. Establish contact – impersonate a group executive (e.g. the president, CEO, CFO) or a trusted partner (e.g. lawyers, notaries, auditors, accountants etc.) – contact a specific employee: a manager or an accounts payable clerk 2. Urgent and exceptional request – request an urgent bank transfer of a large amount to a foreign bank account. 3. Persuasive dialogue – Use of authority: “It is an order to do this” – Secrecy: “This project is still secret and its success depends on this transaction” – Valorization: “I count on you for your efficiency and discretion” – Pressure: “The success of the project rests on your shoulders” 4. Transfer order – The unsuspecting employee executes the transfer manually (by direct phone call or fax to the bank) – this does not follow the standard procedure, but may be used by companies in urgent cases or for flexibility as an alternative to the standard procedure.
  37. 37. CEO fraud - Prevention 1. Inform staff that this fraud is ongoing 2. Test staff knowledge of extraordinary transfer procedure 3. Include 2FA in extraordinary transfer procedure 4. Ensure that staff know who CISO is
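The prevention steps above amount to a dual-control policy plus out-of-band verification for extraordinary transfers. A hypothetical sketch of such a check (the threshold, names, and callback flag are illustrative assumptions, not any bank’s actual procedure):

```python
from typing import Optional

LARGE_TRANSFER_LIMIT = 10_000  # illustrative threshold in euro

def transfer_allowed(amount: int, initiator: str,
                     approver: Optional[str],
                     callback_verified: bool) -> bool:
    """Dual control for extraordinary transfers: above the threshold,
    a second, distinct approver must sign off, and the request must be
    confirmed by a callback to an independently sourced phone number
    (never a number supplied in the request itself)."""
    if amount < LARGE_TRANSFER_LIMIT:
        return True   # routine transfers follow the standard procedure
    if approver is None or approver == initiator:
        return False  # no self-approval: a second person must sign off
    return callback_verified
```

The key design point against “Fake President” pressure is that no single employee, however senior the apparent requester, can authorise the transfer alone.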
  38. 38. Banking malware • From Kaspersky Lab report, Q3 2015 – blocked 625,669 attempts to launch malware capable of stealing money via online banking, 17.2% lower than in Q2 2015 – Windows 7 x64 Edition accounted for 42.2% of all banking Trojan attacks – 2,516 mobile banker Trojans detected, a fourfold increase on the previous quarter
  39. 39. Ransomware and extortion • Increasingly directed at banks – DDoS • take down website (embarrassing) – Exfiltration • will release customer data if not paid
  40. 40. Ransomware and extortion • Hackers Release Swiss Bank Data Over $12K Unpaid Ransom (Bloomberg Business) – the small-scale demand ($12,000) illustrates the prevalence and ease of a rapidly growing extortion industry that deals in stolen or hijacked data • Hacker who demanded Bitcoin from banks jailed for blackmail and child pornography (TheJournal.ie) – used a phishing program to obtain customer banking details before threatening to release them if he was not paid
  41. 41. Blackmail • “The Ultimate Invasion of Privacy” (Slate) – “How a Chinese hacker used my private nickname, personal emails, and sensitive documents to try to blackmail me.” – US businessman working in China
  42. 42. Ransomware and extortion • Bitcoin cyberextortionists are blackmailing banks, corporations (Ars Technica) – intended to harass, extort and ultimately embarrass the victim publicly – demands of between 1 and 100 bitcoins (about £160 to £16,000), a deadline for compliance, and a warning of a “small, demonstrative attack”.
  43. 43. EU Directives • Network and Information Security • General Data Protection Regulation – Both in the pipeline for some time and still being negotiated • Both have certain issues around mandatory reporting of data breaches, whether to customers or regulators, depending on the size and nature of the organisation • Reputational risk is clearly a very significant factor here
  44. 44. COMBATING CYBERCRIME Strategies for improving internal organisational security
  45. 45. • Emphasis should be on delegation and empowerment of employees – “an autocratic stance inhibits effective information security … [CISOs] need to develop an identity within the organisation where they are seen to help employees discuss, and make decisions about, information security. The emphasis should be on delegation and empowerment of employees with an acceptance that, as a result, mistakes and errors may occur.” (Ashenden & Sasse, 2013)
  46. 46. • Select a champion – not necessarily a technical expert, but someone who can motivate and persuade – “The results of this study give credence to the role of a ‘champion’ within the organization, specifically alluding to the influence this person may have in motivating employees to engage in actions involving IT” (Johnston & Warkentin, 2010a)
  47. 47. • “...findings suggest that religiosity and values can play important roles in compliance in the domain of information security... Recognizing and appealing to these beliefs and values can help security managers encourage individuals to be more compliant with the policies set forth by their organization.” (Kelecha & Belanger, 2013)
  48. 48. • Appealing to fear does impact intention to comply with information security policy, but the impact is not uniform – “...suggest that fear appeals do impact end user behavioral intentions to comply with recommended individual acts of security, but the impact is not uniform across all end users. It is determined in part by perceptions of self-efficacy, response efficacy, threat severity, and social influence.” (Johnston & Warkentin, 2010b)
  49. 49. Conclusion • You are the weakest link • Your organisation is already compromised – the only question is to what degree • Mandatory reporting is on its way • Ongoing threats require ongoing security • Link information security with human resources
  50. 50. Thank you! www: e: twitter: @cjamcmahon linkedin: @cjamcmahon
  51. 51. Further reading • Alsharnouby, M., Alaca, F., & Chiasson, S. (2015). Why phishing still works: User strategies for combating phishing attacks. International Journal of Human-Computer Studies, 82, 69–82. • Ashenden, D., & Sasse, A. (2013). CISOs and organisational culture: Their own worst enemy? Computers and Security, 39(Part B), 396–405. doi:10.1016/j.cose.2013.09.004 • Button, M., Nicholls, C. M., Kerr, J., & Owen, R. (2014). Online frauds: Learning from victims why they fall for these scams. Australian & New Zealand Journal of Criminology, 47(3), 391–408. • Cross, C. (2013). “Nobody’s holding a gun to your head...”: Examining current discourses surrounding victims of online fraud. In K. Richards & J. Tauri (Eds.), Crime, Justice and Social Democracy: Proceedings of the 2nd International Conference (pp. 25–32). Brisbane, QLD: Crime and Justice Research Centre, Queensland University of Technology. • Cross, C. (2015). No laughing matter: Blaming the victim of online fraud. International Review of Victimology, 21(2), 187–204. • Jagatic, T. N., Johnson, N. A., Jakobsson, M., & Menczer, F. (2007). Social phishing. Communications of the ACM, 50(10), 94–100. • Johnston, A. C., & Warkentin, M. (2010a). The influence of perceived source credibility on end user attitudes and intentions to comply with recommended IT actions. Journal of Organizational and End User Computing, 22(3), 1–21. doi:10.4018/joeuc.2010070101 • Johnston, A. C., & Warkentin, M. (2010b). Fear appeals and information security behaviors: An empirical study. MIS Quarterly, 34(3), 549–A4. • Kelecha, B., & Belanger, F. (2013). Religiosity and information security policy compliance. AMCIS 2013 Proceedings. • Parrish, J. L., & San Nicolas-Rocca, T. (2012). Toward better decisions with respect to IS security: Integrating mindfulness into IS security training. In Pre-ICIS Workshop on Information Security and Privacy (SIGSEC) (pp. 12–15). • Rocha Flores, W., Holm, H., Nohlberg, M., & Ekstedt, M. (2015). Investigating personal determinants of phishing and the effect of national culture. Information and Computer Security, 23(2), 178–199. • van Wilsem, J. (2011). “Bought it, but never got it”: Assessing risk factors for online consumer fraud victimization. European Sociological Review, 29(2), 168–178.