Creating a Diverse CyberSecurity Program
Dr Tyrone W A Grandison
All opinions expressed herein are my own and do not
reflect the opinions of anyone that I work with (or
have worked with) or any organization that I am or
have been affiliated with.
A Little About Me
• BSc Hons Computer Studies, UWI-Mona.
• MSc Software Engineering, UWI-Mona
• PhD Computer Science, Imperial College London
• MBA Finance, IBM Academy
• 10 years leading Quest team at IBM
• 2 years working in startups
• 3 years running companies and consulting
• Now, working for the White House
• Fellow, British Computer Society (BCS)
• Fellow, Healthcare Information and
Management Systems Society (HIMSS)
• Pioneer of the Year (2009), National
Society of Black Engineers (NSBE)
• IEEE Technical Achievement Award
(2010) for “Pioneering Contributions to
Secure and Private Data Management".
• Modern Day Technology Leader (2009),
Minority in Science Trailblazer (2010),
Science Spectrum Trailblazer (2012,
2013). Black Engineer of the Year Award
• IBM Master Inventor
• Distinguished Engineer, Association for
Computing Machinery (ACM)
• Senior Member, Institute of Electrical and
Electronics Engineers (IEEE)
• Over 100 technical papers, over 47 patents
and 2 books.
Agenda
• Let’s Geek out on Diversity
• A Diverse CyberSecurity Program
• CyberSecurity Fundamentals
• The Current State Of Affairs
• Opportunities In The Space
• Diverse Team
THINK ABOUT THIS
“Because there is no silver bullet in
cybersecurity, no quick fix, we have to solve
problems holistically. We need to deal with
people, process and technology. That means we
need people from diverse backgrounds who
understand and relate to an array of people. And
I’m not just talking about gender and ethnicity.
We also really need right-brain thinkers, left-
brain thinkers, people who can come at these
problems from very different angles.”
- Summer Fowler, Deputy Director, Cybersecurity Solutions Directorate, Computer Emergency Response Team (CERT),
Carnegie Mellon University’s Software Engineering Institute. March 2014
“A diverse group almost always
outperforms the group of the best
by a substantial margin.”
– Scott E. Page (2010)
More On The Importance
• Expands the Qualified Employee Pool
• Improves the Bottom Line
• Enhances Innovation
• Promotes Equality
• Reflects the Customers
Source: NCWIT Scorecard: A Report on the Status of Women in Information Technology
Challenges to DIVERSITY
• Lack of knowledge about cybersecurity
• Lack of awareness of opportunities
• Stereotypical notions
• Unconscious bias
• Lack of exposure to role models and mentors
• Lack of social support
A DIVERSE CYBERSECURITY PROGRAM
–Attack versus Defend
–Compromise versus Detection
–Background, Expertise, Problem-Solving
Perspectives on CyberSecurity
Scope of CyberSecurity
My Definition of CyberSecurity
• Very wide-ranging term
• Everyone has a different perspective
• No standard definition
• A socio-technical systems problem
Scope of CyberSecurity
• Threat and Attack analysis and
• Protection and recovery technologies,
processes and procedures for
individuals, business and government
• Policies, laws and regulation relevant to
the use of computers and the Internet
The field that synthesizes multiple
disciplines, both technical and non-
technical, to create, maintain, and
improve a safe environment.
• The environment normally allows for other more technical or tactical
security activities to happen, particularly at an industry or national level
• Traditionally done in the context of government laws, policies,
mandates, and regulations.
SIGNIFICANCE of CyberSecurity
Bureau of Labor Statistics
Difficulties in Defending
Menu of Attack Tools
Corporate US Landscape
The Current State of Affairs
Statistics from the results of an SVB survey about cybersecurity completed by 216 C-level
executives from US-based technology and life science companies in July 2013
• 47% of companies know they have suffered a
cyber attack in the past year
• 70% say they are most vulnerable through their employees’ devices
• 52% rate at “average-to-non-existent” their ability
to detect suspicious activity on these devices
2013 Cyber Security Study - What is the Impact of Today’s Advanced Cyber Attacks? - Bit9 and iSMG
• First-Generation Security Solutions
Cannot Protect Against Today’s
Sophisticated Attackers
• There is No Silver Bullet in Security
• There is an Endpoint and Server
Blind Spot
2013 Cyber Security Study - What is the Impact of Today’s Advanced Cyber Attacks? - Bit9 and iSMG
What are the Hard Research Problems?
Where are companies spending their money?
Where Are The Opportunities?
(TEN Years Ago)
1. Global-Scale Identity Management
2. Insider Threat
3. Availability of Time-Critical Systems
4. Building Scalable Secure Systems
5. Situational Understanding and Attack Attribution
6. Information Provenance
7. Security with Privacy
8. Enterprise-Level Security Metrics
INFOSEC Research Council (2005)
(SIX Years Ago)
1. Global-scale Identity Management
2. Combatting Insider Threats
3. Survivability of Time-critical Systems
4. Scalable Trustworthy Systems
5. Situational Understanding and Attack Attribution
6. Information Provenance
7. Privacy-aware security
8. Enterprise-level metrics
9. System Evaluation Life Cycle
10. Combatting Malware and Botnets
11. Usable Security
INFOSEC Research Council (2009)
2013 Cyber Security Study - What is the Impact of Today’s Advanced Cyber Attacks? - Bit9 and iSMG
– Income, Sexual Orientation, Religion, Region,
Body type, Dress, Pregnancy, Disability, Education
level, Introverted or Extroverted, Language,
Vocabulary, Hair color, Body art, Political party,
Diet, Club memberships, Body odors ….
- 2012 Information Technology Workforce Assessment for Cybersecurity (ITWAC)
Summary Report. National Institute for CyberSecurity Education. March 4, 2013.
• Women
– 50% of US workforce
– 25% of IT workforce
– 8-13% of cybersecurity workforce
• Hispanics
– 6.4% of IT workforce
– 5% of cybersecurity workforce
• African Americans
– 8.3% of IT workforce
– 7% of cybersecurity workforce
- NIST Panel on Diversity in CyberSecurity, 2013
Women in Cybersecurity
• 13 percent of US CyberSecurity professionals are women —
which is higher than in Europe and Asia (2006 IDC Survey)
"Women are historically very underrepresented in
computer science and in computer security. When I
started in computer security 25 years ago, the field was
20% to 30% women. Now it's between 5% and 10%.
That's obviously going in the wrong direction."
- Jeremy Epstein, board member of Applied Computer Security Associates
UMUC Cyber Team
Army Research Lab Cyber
UTSA Cyber Team
UMBC Center for
Areas of Focus Today
How many times did I
change jobs in my career?
Skills in Demand Today
Skills in demand
in next 2 years
• Define Program Outcomes
• Define Program Curriculum
– Risk Management
– Critical and Inventive Thinking
– Research and Writing
– Attack & Defense Tool Construction and Use
• Create a Network
– External Collaborators
• Identify Ways for Increased Visibility
– Of the program, lecturers and students
• Reduce Unconscious Bias
– Start by testing yourself – Project Implicit at Harvard
– Teach Tolerance
– Focus on Hiring a Balanced & Diverse Workforce
– Top-Down, Bottom-Up
– Building Recruiting And Inclusion for Diversity (BRAID)
• Support Network
• Set realistic expectations.
• Provide appropriate time for the training.
• Provide the training in person.
• Be careful in selecting the right facilitator.
• Incorporate unconscious bias assessments.
• Focus the training on specific, real
situations, such as reviewing resumes,
conducting interviews, responding to
• Address the topic of in-group favoritism and how it
operates in the organization.
• Identify those situations in which our implicit biases
run contrary to our organizations’ explicit values.
• Use proven successful simulations, role-plays, and
other interactive exercises.
• Have groups discuss the words, phrases, symbols,
jokes, and other symbolic representations of their
group that they find offensive and why.
• Provide de-biasing, counter-stereotyping activities
– Such as making associations that go counter to existing
stereotypes (male nurses, female scientists, elderly athletes).
• CyberSecurity is an important field
– Workforce needs
– Growing market
– Job Security
– Significant potential harm
• Diversity in creating a CyberSecurity program is
critical to its success.
– Varied thinkers
– Differing groups and populations
– Balance of strategies and focus area
• The path starts today.
– Look to the University of Technology – Jamaica as a model.
Expands the Qualified Employee Pool
We’re not taking advantage of our diverse population. The industry is failing to attract this talent. Indeed, those women already employed in the technology industry are leaving at staggering rates, so we’re not retaining either.
Improves the Bottom Line
Technology companies with the highest representation of women in their senior management teams showed a higher return on equity than did those with fewer or no women in senior management.
A large study spanning 21 different companies showed that teams with 50:50 gender membership were more experimental and more efficient. Extensive research has found that groups with greater diversity solve complex problems better and faster than do homogenous groups. Culturally diverse teams have been shown to generate a wider variety of possible strategies when setting a course of action.
With technology playing an increasingly crucial role in all of our lives, having more people from different backgrounds participate in its creation can help break down gender and racial economic inequalities.
Reflects the Customers
Most companies serve a variety of people, so it makes sense then to have a variety of intelligent, skilled people working on services and products.
Unconscious bias refers to the biases we have of which we are not in conscious control. These biases occur automatically, triggered by our brain making quick judgments and assessments of people and situations based on our background, cultural environment and our experiences.
Still not sure exactly what negative unconscious biases are? Or maybe you think you don't have any? I'm talking about the following types of unconscious beliefs that infiltrate our society - and, therefore, corporate America:
• Men are better leaders.
• White men are smarter.
• African American men are all good athletes.
• African American women are "angry."
• White women are great trophy wives.
• Women are all on the "mommy track."
• Latino men are lazy.
• Latino females are extremely emotional.
• Asian men are very good at technology.
• Asian females are quiet.
• Native American men are drunks.
• Native American females are submissive.
Normally includes all aspects of ensuring the protection of citizens, businesses and critical infrastructures from threats that arise from their use of computers and the Internet.
Everyone has a different perspective:
Control Systems Security
Information Risk Management
Even debating whether there’s a “space” between cyber and security
CyberSecurity is often confused with:
Security problems almost always stem from a mix of technical, human and organizational causes (BRUCE SCHNEIER, IAN SOMMERVILLE)
CyberSecurity involves Defense and Attack.
This is my personal definition and the one that will be assumed during the presentation.
Happens at multiple levels in the stack: Network, Application, Data.
Involves Monitoring, Detection, Response.
There are no systems connected to the Internet that are completely safe. Cyber-attacks are the norm. Everyone with a web presence is attacked multiple times each week. To further complicate this scenario, government entities have been found to be weakening Web security protocols and compromising business systems in the interest of national security, and hyper-competitive companies have been caught engaging in cyber-espionage. Detection of these attacks in real-time is difficult for a number of reasons, the primary ones being the dynamism and ingenuity of the attacker and the nature of contemporary real-time attack detection systems. In this talk, I will share insights on an alternative, i.e., quickly recognizing attacks shortly after the incident using audit analysis.
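The retrospective approach described above can be sketched as a simple scan over an audit log. This is a minimal illustration only: the log format, the failure threshold, and the time window below are my own assumptions, not details from the talk.

```python
# Sketch: after-the-fact detection of a brute-force pattern in an auth
# audit log. Log format, THRESHOLD, and WINDOW are illustrative assumptions.
from collections import defaultdict
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=10)
THRESHOLD = 5  # failed logins from one source within WINDOW

def flag_suspicious(log_lines):
    """Return source IPs with >= THRESHOLD failures inside any WINDOW."""
    failures = defaultdict(list)  # ip -> timestamps of failed attempts
    for line in log_lines:
        # assumed record format: "2014-03-01T12:00:00 FAIL 10.0.0.5 alice"
        ts_str, status, ip, _user = line.split()
        if status == "FAIL":
            failures[ip].append(datetime.fromisoformat(ts_str))
    flagged = set()
    for ip, times in failures.items():
        times.sort()
        # slide a window of THRESHOLD consecutive failures
        for i in range(len(times) - THRESHOLD + 1):
            if times[i + THRESHOLD - 1] - times[i] <= WINDOW:
                flagged.add(ip)
                break
    return flagged
```

Because the analysis runs over the recorded audit trail rather than live traffic, it trades immediacy for the ability to correlate events over longer spans than a real-time detector typically holds in memory.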
With the rise of the Internet of Things comes a lot of convenience, such as smart fridges that let you access the internet and call for service in the case of malfunction, or devices that can monitor your energy usage and send you Twitter updates.
It also comes with a new problem: many of these internet-connected devices don't have malware protection. And it's now been documented that someone is taking advantage. Security company Proofpoint has discovered a botnet attack -- that is, a cyber attack whereby the attacker hijacks devices remotely to send spam -- incorporating over 100,000 devices between 23 December and 6 January, including routers, multimedia centres, televisions and at least one refrigerator.
The attack sent out over 750,000 spam emails, in bursts of 100,000 emails at a time, three times a day, with no more than 10 emails sent from any one IP address, making them difficult to block. Over 25 per cent of the emails were sent from devices that weren't conventional computers or mobile devices. It is the first documented case of common appliances being used in a cyber attack -- but that doesn't necessarily mean it was the first time it occurred, and it certainly won't be the last.
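A toy model makes clear why the 10-emails-per-IP ceiling defeats naive per-source blocking, and why correlating message content across sources can still expose the campaign. The thresholds, message format, and exact-hash fingerprint here are my own illustrative assumptions, not Proofpoint's method.

```python
# Sketch: per-IP volume thresholds miss a distributed botnet that keeps each
# source under the limit, while a content fingerprint aggregated across all
# sources still reveals the campaign. All numbers are invented for illustration.
import hashlib
from collections import Counter

PER_IP_LIMIT = 10       # a blocker acting only above this per-source volume
CAMPAIGN_LIMIT = 1000   # near-identical messages that flag a campaign

def per_ip_blocked(messages):
    """IPs a naive per-source volume filter would block."""
    counts = Counter(ip for ip, _body in messages)
    return {ip for ip, n in counts.items() if n > PER_IP_LIMIT}

def campaign_flagged(messages):
    """Fingerprints of message bodies seen often enough, across all IPs."""
    fingerprints = Counter(hashlib.sha256(body.encode()).hexdigest()
                           for _ip, body in messages)
    return {fp for fp, n in fingerprints.items() if n > CAMPAIGN_LIMIT}

# 2,000 copies of one spam body, spread exactly 10-per-IP across 200 devices:
botnet = [(f"10.0.{i // 256}.{i % 256}", "Buy now!")
          for i in range(200) for _ in range(10)]
```

Running both filters over `botnet` shows the asymmetry: the per-IP filter blocks nothing (every source sits exactly at the limit), while the cross-source fingerprint count flags the campaign.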
In 2009, Raul Rojas, a computer science professor at the Free University of Berlin (and a robot soccer team coach), built one of Germany’s first “smart homes.” Everything in the house was connected to the Internet so that lights, music, television, heating and cooling could all be turned on and off from afar. Even the stove, oven, and microwave could be turned off with Rojas’s computer, which prevented some potential panic attacks about leaving an appliance on after exiting the house. One of the few things not connected in the house were the locks. Automated locks Rojas bought in 2009 are still sitting in a drawer waiting to be installed. “I was afraid of not being able to open the doors,” Rojas said in a phone interview.
Requires minimal to zero technical knowledge of the internal operations of attacks; attackers need only know how to use the tools.
Statistics from the results of an SVB survey about cybersecurity completed by 216 C-level executives from US-based technology and life science companies in July 2013
2013 Cyber Security Study - What is the Impact of Today’s Advanced Cyber Attacks? by Bit9 and iSMG
This survey was conducted online during the summer of 2013. Nearly 250 respondents participated in this international study.
Key characteristics of the respondent base:
»» 62 percent are from the U.S., with 10 percent from the UK and Europe;
»» Top responding industries are:
• Banking/financial services – 36 percent
• Technology – 12 percent
• Healthcare – 10 percent
»» 47 percent of respondent organizations employ 500 or fewer employees, while 22 percent employ more than 10,000.
»» 59 percent of respondents deploy only Windows-based endpoints in their organizations, while 1 percent are all-Mac shops. The remainder offer a mix of endpoint devices, with 31 percent saying more PCs than Macs.
First-Generation Security Solutions Cannot Protect Against Today’s Sophisticated Attackers.
It seems like each day there is a new attack reported in the news: advanced attacks such as Flame, Gauss and the Flashback Trojan that attacked 600,000 Macs. These “public” cyber attacks are, unfortunately, just the tip of the iceberg. The number and variety of attackers and their differing goals and motivations are overwhelming. The 2013 Cyber Security Survey shows proof that traditional, signature-based security defenses cannot keep up with today’s advanced threats and malware:
»» 66 percent of survey respondents say their organizations’ ability to protect endpoints and servers from emerging threats for which no signature is known is “average” to “nonexistent.”
»» 40 percent of respondents state that malware that landed on their endpoints and servers got there because it bypassed antivirus.
First-generation security solutions, such as signature-based antivirus, can’t keep up with the tidal wave of widely targeted malware (400+ million variants), let alone advanced attacks that target specific organizations.
There is No Silver Bullet in Security.
Organizations increasingly rely on new-generation network security solutions as a primary defense against cyberthreats. This is a step in the right direction, but not a silver bullet. According to the survey:
»» 27 percent of respondents say malware was able to land on their endpoints and servers because it bypassed network security.
»» 30 percent responded that they don’t know how it got there.
The digital assets that you need to protect reside on your endpoints and servers, or are at least accessible from your endpoints and servers, and it is inevitable that some malware is going to make it to this critical infrastructure. How does it happen? It could be that a user fell victim to social engineering, a laptop was disconnected from your network and network security, a user plugged in an infected USB device or mobile phone to his or her PC, or an advanced threat slipped past your AV.
To combat the APT, you need to fortify your endpoints and servers with security solutions that work together to give you a unified, holistic approach. A defense-in-depth strategy is necessary, where you are not counting on just one security control to stop an attack.
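A back-of-the-envelope calculation shows why defense-in-depth pays off: if each control independently misses some fraction of attacks, the chance of an attack slipping past all of them shrinks multiplicatively. The per-layer miss rates below reuse the survey's 40% (bypassed AV) and 27% (bypassed network security) figures purely for illustration, and the independence of layers is itself an assumption, not a claim from the study.

```python
# Toy defense-in-depth arithmetic: stacking independent controls shrinks the
# overall miss rate multiplicatively. Miss rates are illustrative assumptions.
def combined_miss_rate(miss_rates):
    """Probability an attack evades every layer, assuming independence."""
    p = 1.0
    for m in miss_rates:
        p *= m
    return p

# e.g. AV misses 40%, network security 27%, endpoint monitoring 50%:
layers = [0.40, 0.27, 0.50]
print(f"{combined_miss_rate(layers):.3f}")  # ~5% slip past all three
```

In practice layers are correlated (a well-crafted APT may be designed to evade several at once), so the true residual risk is higher; the point is only that no single control need carry the whole burden.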
There is an Endpoint and Server Blind Spot
The survey results indicate that there is also an “endpoint and server blind spot.”
»» 59 percent say that when it comes to real-time monitoring of files that attempt to execute on servers and endpoints, their organizations’ abilities rate from “average” to “non-existent.”
»» 61 percent say that once a file is determined to be malicious, the organization’s ability to determine how many endpoints and servers are infected rates from “average” to “nonexistent.”
Global-Scale Identity Management:
Global-scale identification, authentication, access control, authorization, and management of identities and identity information
Insider Threat:
Mitigation of insider threats in cyber space to an extent comparable to that of mitigation in physical space
Availability of Time-Critical Systems:
Guaranteed availability of information and information services, even in resource-limited, geospatially distributed, on demand (ad hoc) environments
Building Scalable Secure Systems:
Design, construction, verification, and validation of system components and systems ranging from crucial embedded devices to systems composing millions of lines of code
Situational Understanding and Attack Attribution:
Reliable understanding of the status of information systems, including information concerning possible attacks, who or what is responsible for the attack, the extent of the attack, and recommended responses
Information Provenance:
Ability to track the pedigree of information in very large systems that process petabytes of information
Security with Privacy:
Technical means for improving information security without sacrificing privacy
Enterprise-Level Security Metrics:
Ability to effectively measure the security of large systems with hundreds to millions of users
If you expect a budget increase, where do you believe your organization will prioritize your spending?
Some subsets of cognitive biases with examples (from YorkPsych)
Self Perception Biases
Self Perception biases are the tendency to allow one's dispositions to affect one's way of interpreting information. Self perception biases are distortions of one's own view of self.
1. Bias Blind Spot - the affectation or tendency to be ignorant of one's own biases. This is a case of the blind not knowing or ignoring that they are blind. (Pronin and Kugler, 2007)
2. Illusion of Control - the belief of being in at least some control over events and outcomes that you actually have no effect on. The devoted fan who gets out his lucky hat that "always brings the game back whenever the Giants are down" is a good example of this bias. (Kahneman and Tversky, 1972)
3. Restraint Bias - having overconfidence in one's own ability to deny temptations. This is a common bias because people like to believe they can handle whatever faces them and do not want to see themselves as having weak willpower. A Yorkie might fully believe they can become a vegetarian and even spend four days without eating any meat, but when they attend a Carnivores' Club meeting and smell the mouthwatering aroma of bacon, they give into the temptation that they were so confident they would overcome.
4. Self-Serving Bias - the tendency to be less prone to claim a failure than to claim a success. This is mostly due to people thinking their successes were due to their own brilliance, but their errors were caused by mistakes outside of their control (see Cognitive Dissonance). In Mr. Fink's titration lab, a student is less likely to claim personal responsibility for the error that ends up skewing some of the results than for his quick thinking that enabled his group to salvage some meaningful data from the experiment.
5. Overconfidence Effect - inappropriately high confidence in one's own answers, opinions or beliefs. These overestimations could be driven by a strong desire to succeed or could just be a consequence of the general optimism produced by cognitive bias. Examples of overconfidence bias include a famous 1983 study in which 93% of drivers reported that they believed they were among the upper 50% of driving skill. (Pohl, 2006)
6. Egocentric Bias - the tendency of people to claim more responsibility in a group project than actuality. Egocentric bias could be observed if, for instance, any one person claimed to run Fall Fair when in reality, anyone who has taken part in Fall Fair knows it is an enormous team effort. (Kruger, Dunning, 1999)
Perception biases are inaccurate views or conclusions drawn in various ways. They explain certain behavioral vicissitudes as well as how collective debates can result in so many various opinions.
1. Attentional Bias - the tendency for one's emotions to determine or affect one's focus. Emotional propaganda plays on this; for instance, certain charity commercials will show pictures of starving kids in Africa to draw attention away from the fact that only a fraction of the money donated actually goes to charitable causes.
2. Availability Heuristic - basing judgments or estimations on what most easily comes to memory. Because we remember cases or events that stand out as unusual or unexpected, this usually results in false assumptions or estimations. (Tversky and Kahneman, 1972) The availability heuristic is hypothesized to be to blame for the misconception that couples are more likely to conceive after they have adopted a child. People tend to remember all of the people who conceive after adoption and tend to forget about all of the cases in which the couples did not conceive after adopting. A more York oriented example is the common belief students seem to have that if their teacher doesn't show up to class within the first 15 minutes, then they have a free. This fits the availability heuristic because they most easily remember hearing of cases where other students did get away with this and enjoyed an unexpected free, rather than the more plentiful instances where the teacher showed up just in the nick of time and was angry at their attempt to desert class.
3. Hindsight Bias - "the I-knew-it-all-along bias", it is the tendency to believe you knew something when you truly did not. This also includes viewing completed events as more predictable than they actually were. (Pohl, 2006) Hindsight Bias can easily be observed outside the science building as Yorkies walking out of a math test will ask one another what they got on the Option A and frustratedly proclaim they knew that was what they were supposed to do, but for some reason didn't apply it at the time.
4. Observer Expectancy Effect / Selective Perception - known as the "observer effect", this is a fallacy that can very easily skew results in qualitative scientific experimentation. It is the tendency to manipulate or misinterpret data so that it will support (or disprove) a hypothesis. Essentially, it is the tendency to see what you want or expect to see.
5. Framing Effect - the tendency to interpret information differently based on changes in context or description. A Yorkie might exhibit this in the stress they put on studying for a chemistry quiz in comparison to a chemistry test. Even though Ms. Trachsel will explain that test and quiz scores are valued equally, and this quiz will be the same length as an average test, you might still hear one Yorkie telling another that "It's just a quiz," implying that being a quiz makes it somehow less imperative or important, regardless of how many points it's worth.
6. Choice Supportive Bias - the propensity to believe your choices were better or more righteously chosen than they actually were. This tends to happen when an individual remembers only the positive aspects of the chosen option of a decision, and only the negative aspects of the rejected options. For example, a second semester senior who hasn't taken any AP classes might justify his choice by concentrating on how much stress he would have now had he taken any AP classes, while not thinking about the benefits of passing the AP test and potentially getting college credit.
Logic and Decision Biases
Cognitive biases in logic and decisions are shown mostly through how people go about solving problems in different ways, make various choices, and judge different situations.
1. Base Rate Fallacy - the inclination for someone to base his judgments on specifics rather than the big picture. An example of this could be a York Senior who chooses a college for having a strong chemistry program and ignores other aspects such as its location in the middle of a desert.
2. Zero-Risk Bias - the tendency for someone to try to eliminate a small risk rather than lower the likeliness of a great risk. An example of this could be a Yorkie that decides against joining the cross country team because they run on trails adjacent to areas that could contain unexploded ordnance. Rather than always choosing public transportation over driving a car to greatly reduce the risk of death in a transportation accident, the Yorkie reduces a small chance of getting blown to bits. This bias stems from a desire to reduce risk based on proportion rather than by chance. In other words, this Yorkie values a 100% risk decrease from .1% to 0% rather than a 66% risk decrease from, say, 3% to 1%.
3. Anchoring - the inclination for someone to allow one piece of information to outweigh others when making a decision. An example might be a couple considering the fact that the girl they hired to babysit their children goes to Stanford to be more important than the side facts that that girl skips half her classes, rides a motorcycle and brings her boyfriend with her to babysitting jobs.
4. Belief Bias - the tendency for someone to ignore logical error in an argument based on how believable a conclusion may be. For instance, people often buy into weight loss commercials that promise you could lose 20 pounds despite the illogical claim that you don't have to diet and only have to use their method for 10 minutes every day for two weeks.
5. Semmelweis Reflex - the reflex-like tendency to ignore or reject any contradictory information against what one already believes. An example might be someone who continues to believe that high fructose corn syrup is alright for their children even after being told it was unhealthy, despite solid research and facts disputing that misconception.
A probability bias arises when someone misinterprets precedents or past information and acts on this inaccuracy.
1. Normalcy Bias - the bias best represented in the freshmen class as Yorkies who are used to flying by in classes believe that since they have never received a B before, it simply cannot or will not happen. This is a logical error based on previous experience that most usually will throw the freshmen into shock. (Hsee and Zhang, 2004)
2. Gambler's Fallacy - the propensity to believe that happenings of the past determine what will happen in the future. Just as its name predicts, this is most commonly exemplified by gamblers who mistakenly tend to think along the lines that since they lost their game the last 6 times, they have a much greater chance of winning this time, or the next time, or the time after that. (Hsee and Zhang, 2004)
Predictive biases are most usually related to someone holding the inaccurate belief that they prematurely know information about events or people based on large or general ideas rather than specifics.
1. Optimism Bias - the higher tendency to expect positive outcomes of planned actions, rather than negative. People known as optimists tend to be the reassuring, confidence boosting, Mrs. Sherry-type people who always encourage you to hope for the best.
2. Pessimism Bias - opposite of the Optimism Bias, this is the habit of anticipating negative outcomes rather than positive. Pessimists sometimes suffer from depression, and typically have less hope for success of planned actions.
3. Planning Fallacy - possibly due to deficiencies in the Prefrontal Cortex (Cerebral Cortex), this is the tendency to inaccurately predict the time necessary to complete a task. This can be observed in some York seniors taking AP Psych who underestimate how much time will be needed to complete their textbook-wiki assignment and therefore are up until 2am the night before an installment is due.
4. Stereotyping - a bias in judgement, stereotyping is setting expectations for or drawing conclusions about an individual, based on the group they are tied to. Racial, religious and political stereotyping are most common as one will assume that because someone looks a certain way, believes a certain way or votes a certain way, she is like the majority of all others who affiliate with them.
Conformity biases are the most socially based cognitive biases that are exemplified by people young and old in instances varying from politics to surfing.
1. Availability Cascade - the idea that if you believe something enough, it becomes the truth. This idea is subjective to each individual as, for instance, religious upbringing results in different people having concrete belief in opposing concepts.
2. Ingroup Bias - the tendency for someone to be more comfortable or friendly with people whom he perceives as like himself, or as in the same group as himself. This most basically explains the "cliques" of typical high school as people with common interests gravitate to each other. (Garcia, Song and Tesser, 2010)
3. Out-group Homogeneity Bias - also called homogeneity blindness, this is the tendency for people within a like group to see their group members as more varied and individualistic than the members of other groups.
4. System Justification - the "go with the flow" tendency for people to more frequently adhere to precedents, rather than establish something new or different. As exemplified with political parties vs York clubs, people tend to mold to existing political parties with a general fit to their beliefs/interest rather than establish new, more self-specific parties. Yorkies are less subject to System Justification than most people, as anyone having a unique interest, such as in surfing, but finding there is no pre-existing group to facilitate that interest, will easily start a surf club. (Edwards, 1968)
~68% of cybersecurity professionals are over 40
“In comparison to representative labor statistics—women in 2012 accounted for 46.9% of the United States total labor force and 51.5% of United States management, professional, and related positions—it is clearly evident that women, at just 11% of the Information-Security profession, are greatly under represented.” https://www.isc2cares.org/uploadedFiles/wwwisc2caresorg/Content/Women-in-the-Information-Security-Profession-GISWS-Subreport.pdf
The IT security profession in the United States is heavily white with a disproportionate number of Asians, as compared with the overall workforce, according to an Information Security Media Group analysis of Labor Department employment figures.
43.4% focus on Management/Leadership and Administration
17.8% focus on Engineering
There is job security
Top five focus areas include:
Incident Handling and response
Audit and compliance
Analysis and Intelligence
Cloud computing and mobile security are playing increasingly dominant roles.
Protect an organization's critical information and assets by ethically integrating cybersecurity risk management and business continuity best practices throughout an enterprise.
Implement continuous network monitoring and provide real-time security solutions.
Analyze advanced persistent threats, deploy countermeasures, and conduct risk and vulnerability assessments of planned and installed information systems.
Participate in forensic analysis of cyber incidents and assist in recovery of operations.
Formulate, update, and communicate short- and long-term organizational cybersecurity strategies and policies.
Set realistic expectations. Do not over promise and under deliver. Raising expectations that unconscious bias training will eliminate all bias would be disingenuous. The goal is to be conscious of our biases and not to pretend to be blind to differences that exist.
Provide appropriate time for the training. It has taken a lifetime to develop our biases; they cannot be overcome in a two-hour session. Several short sessions are ideal; one full day is a minimum.
Provide the training in person. This topic requires interaction, trust, and the opportunity for people to meet in a safe environment. E-learning or Webinars are not appropriate delivery methods for unconscious bias training, nor will they produce measurable change.
Be judicious in selecting the right facilitator. Do not select trainers only because they took a course on diversity, see this topic as “their passion,” or are from an underrepresented group. Trainers should be highly qualified and well versed in the social psychology of attitude formation, be excellent and empathetic facilitators, and have a non-threatening and inclusive style that avoids guilt trips.
Incorporate unconscious bias assessment tools such as those provided by Project Implicit. This tool, which helps uncover hidden biases across many criteria (including race, gender, disability, and age), has been taken more than a million times. Trainers also must know the pitfalls of this test and the ways people interpret the outputs from the Project Implicit website. Trainers must ensure that trainees are not misinterpreting their results and have support as required.
Focus the training on specific, real situations, such as reviewing resumes, conducting interviews, and responding to customers. An example of an outcome: asking how to correctly pronounce someone's name is a micro-affirmation, while not using someone's name because you are afraid of embarrassing yourself is a micro-inequity.
Address the topic of in-group favoritism and how it operates in the organization. Research shows that a lack of diversity creates groupthink, while diverse viewpoints result in more creativity and innovation.
Identify those situations in which our implicit biases run contrary to our organizations’ explicit values.
Use proven successful simulations, role-plays, and other interactive exercises that help people take the perspective of others. Many standard tools used in diversity training are inappropriate.
Have groups discuss the words, phrases, symbols, jokes, and other symbolic representations of their group that they find offensive and why.
Provide de-biasing, counter-stereotyping activities such as making associations that go counter to existing stereotypes (male nurses, female scientists, elderly athletes).