INFOSEC FRAMEWORKS FOR
MISINFORMATION
SARA “SJ” TERP AND PABLO BREUER
CANSECWEST 2019
TALK OBJECTIVES
• Describe the problem
• Establish a common language
• Introduce a framework
• Talk about what we can do with the framework
Describing the Problem
Misinformation
SOCIAL ENGINEERING AT SCALE
Facebook Group | Shares | Interactions
Blacktivists | 103,767,792 | 6,182,835
Txrebels | 102,950,151 | 3,453,143
MuslimAmerica | 71,355,895 | 2,128,875
Patriototus | 51,139,860 | 4,438,745
Secured.Borders | 5,600,136 | 1,592,771
Lgbtun | 5,187,494 | 1,262,386
INTENT TO DECEIVE
Force the adversary to make a decision or take an action based on information that I:
• Hide
• Give
• Change (or change the context on)
• Deny/degrade
• Destroy
Enable my decisions based upon knowing yours
“Operations to convey selected information and indicators to audiences to
influence their emotions, motives, and objective reasoning, and ultimately the
behavior of governments, organizations, groups, and individuals”
ARTEFACTS
Describing the problem
Why misinformation is different now
INSTRUMENTS OF NATIONAL POWER
Resources available in pursuit of national objectives… and how to influence other nation-states:
Diplomatic | Informational | Military | Economic
NATION-STATE MISINFORMATION
From → To
Brazil → Brazil
China → China, Taiwan, US
Iran → India, Pakistan
Russia → Armenia, France, Germany, Netherlands, Philippines, Serbia, UK, USA, Ukraine, World
Saudi Arabia → Qatar
Unknown → France, Germany, USA
MISINFORMATION STRATEGIES
Distort
Distract
Divide
Dismay
Dismiss
WHAT’S DIFFERENT NOW?
OTHER ACTORS AND THEIR MOTIVATIONS
• State and non-state actors
• Entrepreneurs
• Grassroots groups
• Private influencers
RESPONSE: NOT JUST ADMIRING THE PROBLEM
MISINFORMATION PYRAMID
MISINFOSEC:
MISINFORMATION +
INFOSEC
All cyberspace operations are
based on influence.
- Pablo Breuer
MISINFORMATION VIEWED AS…
• Information security (Gordon, Grugq, Rogers)
• Information operations / influence operations (Lin)
• A form of conflict (Singer, Gerasimov)
• [A social problem]
• [News source pollution]
ATTACK. DEFEND. NETWORKS. LOOKED FAMILIAR.
MAYBE THERE WERE THINGS WE COULD USE
ADDING MISINFORMATION TO INFOSEC
“Prevention of damage to, protection of, and restoration of computers,
electronic communications systems, electronic communications services, wire
communication, and electronic communication, including information contained
therein, to ensure its availability, integrity, authentication, confidentiality, and
nonrepudiation” - NSPD-54
INFOSEC ALREADY INCLUDES COGNITIVE
PSYOPS AND INFOSEC AREN’T JOINED UP
(Diagram: Information Operations, PSYOPS, Computer Network Operations)
INFOSEC SUPPORT TO MISINFORMATION TRACKING
THERE’S NO COMMON LANGUAGE
“We use misinformation attack (and misinformation campaign) to refer to the
deliberate promotion of false, misleading or mis-attributed information. Whilst
these attacks occur in many venues (print, radio, etc), we focus on the creation,
propagation and consumption of misinformation online. We are especially
interested in misinformation designed to change beliefs in a large number of
people.”
MISINFOSEC COMMUNITIES
● Industry
● Academia
● Media
● Community
● Government
FIRST OUTPUT: MISINFOSEC FRAMEWORK STANDARDS
FRAMEWORKS
Underpinning
misinformation
STAGE-BASED MODELS ARE USEFUL
Kill chain: RECON → WEAPONIZE → DELIVER → EXPLOIT → CONTROL → EXECUTE → MAINTAIN
ATT&CK tactics: Persistence, Privilege Escalation, Defense Evasion, Credential Access, Discovery, Lateral Movement, Execution, Collection, Exfiltration, Command and Control
WE CHOSE THE ATT&CK FRAMEWORK
AND STARTED MAPPING MISINFORMATION ONTO IT
Stages: Initial Access → Create Artefacts → Insert Theme → Amplify Message → Command and Control
Example techniques mapped across these stages: account takeover, steal existing artefacts, create fake emergency, repeat messaging with bots, create fake real-life events, create fake group, deepfake, create fake argument, parody account, buy friends, deep cover
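As a rough sketch of what a machine-readable version of this mapping could look like, the draft stages and techniques above can be captured as a simple data structure. The stage assignments below are our illustrative reading of the slide, not a finalized matrix.

```python
# Illustrative sketch only: the draft misinformation "kill chain" above as a
# stage -> example-technique mapping (assignments are a reading of the slide,
# not a finalized framework).
MISINFO_STAGES = {
    "Initial Access": ["account takeover", "create fake group", "parody account"],
    "Create Artefacts": ["steal existing artefacts", "deepfake"],
    "Insert Theme": ["create fake emergency", "create fake argument"],
    "Amplify Message": ["repeat messaging with bots", "buy friends"],
    "Command and Control": ["create fake real-life events", "deep cover"],
}

def techniques_for(stage: str) -> list[str]:
    """Return the example techniques recorded for a named stage."""
    return MISINFO_STAGES.get(stage, [])

if __name__ == "__main__":
    for stage, techniques in MISINFO_STAGES.items():
        print(f"{stage}: {', '.join(techniques)}")
```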
POPULATING THE FRAMEWORK
• Campaigns
• e.g. Internet Research Agency, 2016 US elections
• Incidents
• e.g. Columbian Chemicals
• Failed attempts
• e.g. Russia - France campaigns
HISTORICAL CATALOG
HISTORICAL CATALOG: DATASHEET
• Summary: Early Russian (IRA) “fake news”
stories. Completely fabricated; very short lifespan.
• Actor: probably IRA (source: Recorded Future)
• Timeframe: Sept 11 2014 (1 day)
• Presumed goals: test deployment
• Artefacts: text messages, images, video
• Related attacks: these were all well-produced fake news stories, promoted on Twitter to influencers through a single dominant hashtag each, e.g. #BPoilspilltsunami, #shockingmurderinatlanta
• Method:
1. Create messages. e.g. “A powerful explosion heard from
miles away happened at a chemical plant in Centerville,
Louisiana #ColumbianChemicals”
2. Post messages from fake Twitter accounts; include handles of local and global influencers (journalists, media, politicians, e.g. @senjeffmerkley)
3. Amplify by repeating messages on Twitter via fake Twitter accounts
• Result: limited traction
• Counters: None seen. Fake stories were debunked very
quickly.
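A minimal sketch of how a datasheet like this could be stored as structured data in a shared historical catalog; the field names simply mirror the headings above and are not a finalized schema.

```python
# Sketch only: one historical-catalog entry captured as structured data.
# Field names mirror the datasheet headings above, not a finalized schema.
columbian_chemicals = {
    "summary": "Early Russian (IRA) 'fake news' stories; fabricated, very short lifespan",
    "actor": "probably IRA (source: Recorded Future)",
    "timeframe": "2014-09-11 (1 day)",
    "presumed_goals": ["test deployment"],
    "artefacts": ["text messages", "images", "video"],
    "method": [
        "create messages",
        "post from fake Twitter accounts, tagging local and global influencers",
        "amplify by repeating messages via fake Twitter accounts",
    ],
    "result": "limited traction",
    "counters": "none seen; fake stories were debunked very quickly",
}
```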
FEEDS INTO TECHNIQUES LIST
• Behavior: two groups meeting in same place at
same time
• Intended effect: IRL tension / conflict
• Requirements: access to groups, group trust
• Detection:
• Handling:
• Examples:
Title
Description
Short_Description
Intended_Effect
Behavior
Resources
Victim_Targeting
Exploit_Targets
Related_TTPs
Kill_chain_Phases
Information_Source
Kill_Chains
Handling
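For illustration, here is how one technique might be recorded using the STIX-TTP-style fields listed above. The values are drawn from the example on this slide; the kill-chain phase assignment is an assumption, not part of the source.

```python
# Sketch: a candidate technique entry using the STIX-TTP-style fields above.
# Values are illustrative, taken from the example on this slide.
technique = {
    "Title": "Create fake real-life events",
    "Short_Description": "Arrange for two opposing groups to meet at the same place and time",
    "Behavior": "Two groups meeting in the same place at the same time",
    "Intended_Effect": "Real-world tension / conflict",
    "Resources": ["access to groups", "group trust"],
    "Kill_Chain_Phases": ["Command and Control"],  # assumed phase, for illustration
    "Related_TTPs": [],
    "Information_Source": "historical catalog",
    "Handling": None,  # to be filled in as detections and counters are documented
}
```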
THIS IS WHAT A FINISHED FRAMEWORK LOOKS LIKE
FINDING
TECHNIQUES
Tracking incidents and
artefacts
INCIDENT ANALYSIS
Top-down (strategic): info ops
❏ What are misinformation creators
likely to do? What, where, when,
how, who, why?
❏ What do we expect to see?
❏ What responses and impediments
to responses were there?
Bottom-up (tactical): data science
❏ Unusual hashtag, trend, topic, or platform activity? (see the sketch below)
❏ Content from ‘known’ trollbots, 8chan/4chan, r/The_Donald, Russia Today, etc.?
❏ What are trackers getting excited about today?
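As a minimal sketch of the "unusual hashtag activity" check, the following assumes tweets have already been collected into a pandas DataFrame with created_at and hashtag columns; it only flags crude volume spikes against a trailing baseline, not coordination.

```python
# Bottom-up sketch (assumed data layout): flag hashtags whose hourly volume
# jumps well above their trailing average.
import pandas as pd

def hashtag_spikes(tweets: pd.DataFrame, window: int = 24, factor: float = 5.0) -> pd.DataFrame:
    """tweets needs a datetime 'created_at' column and a 'hashtag' column."""
    counts = (
        tweets.set_index("created_at")
        .groupby("hashtag")
        .resample("1h")            # hourly volume per hashtag
        .size()
        .rename("volume")
        .reset_index()
    )
    # Trailing mean volume per hashtag, excluding the current hour.
    counts["baseline"] = counts.groupby("hashtag")["volume"].transform(
        lambda s: s.rolling(window, min_periods=1).mean().shift(1)
    )
    threshold = factor * counts["baseline"].fillna(0).clip(lower=1)
    return counts[counts["volume"] > threshold]
```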
Top-down analysis
Means of implementing influence strategies
STRATEGIES
Distort
Distract
Divide
Dismay
Dismiss
DISTORTION TECHNIQUES
• Distort facts: alter facts to match the intended outcome
• Exaggerate: use hyperbolic rhetoric and misrepresent facts
• Generate: create realistic false artefacts
• Mismatch: pair links, images, and claims to change the context of information
DISTRACTION TECHNIQUES
• String along: respond to anyone who engages to
waste time
• Play dumb: pretend to be naive, gullible, stupid
• Redirect: draw engagement to your thread
• Dilute: add other accounts to dilute threads
• Threadjack: change narrative in existing thread
DIVISION TECHNIQUES
• Provoke: create conflicts and confusion among community
members
• Dehumanize: demean and denigrate target group
• Hate speech: attack protected characteristics or classes
• Play victim: claim victim status
• Dog-whistle: use coded language to indicate insider status
• Hit and run: attack and delete after short time interval
• Call to arms: make open calls for action
DISMAY TECHNIQUES
• Ad hominem: make personal attacks, insults
& accusations
• Assign threats: name and personalize enemy
• Good old-fashioned tradecraft
DISMISSAL TECHNIQUES
• Last word: respond to hostile commenters
then block them so they can’t reply
• Brigading: coordinate mass attacks or
reporting of targeted accounts or tweets
• Shit list: add target account(s) to insultingly
named list(s)
Bottom-up analysis
Collecting Artefacts to find incidents
MISINFORMATION PYRAMID
RESOURCES
Trollbot lists:
• https://botsentinel.com/
Tools:
• APIs / Python libraries / pandas (collection sketch below)
• https://github.com/IHJpc2V1cCAK/socint
• https://labsblog.f-secure.com/2018/02/16/searching-twitter-with-twarc/
Existing datasets
• https://github.com/bodacea/misinfolinks
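A small collection sketch using twarc 1.x, the library behind the F-Secure post linked above; the credentials and the hashtag are placeholders, not values from the talk.

```python
# Sketch: collect tweets for a hashtag of interest into a JSONL file with twarc 1.x.
import json
from twarc import Twarc

# Placeholder credentials: supply your own Twitter API keys.
t = Twarc("CONSUMER_KEY", "CONSUMER_SECRET", "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")

with open("hashtag_sample.jsonl", "w") as out:
    for tweet in t.search("#ColumbianChemicals"):  # any hashtag under investigation
        out.write(json.dumps(tweet) + "\n")
```

The resulting JSONL file can be loaded straight into pandas (pd.read_json(..., lines=True)) for the analysis steps that follow.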
ARTEFACTS: ACCOUNTS
ARTEFACTS: IMAGES
ARTEFACTS: TEXT (WORDS, HASHTAGS, URLS ETC)
ARTEFACTS: DOMAINS
MOVING UP: CONTENT AND CONTEXT ANALYSIS
• Metadata analysis
• Social network analysis (see sketch below)
• Text analysis (frequency, sentiment
etc)
• Time series analysis
• Visual inspection (Bokeh, Gephi etc)
• Correlation
• Models, e.g. clustering and
classification
• Narrative analysis
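One hedged sketch of the social network analysis step, assuming tweets in the standard Twitter v1.1 JSON layout collected earlier: build a mention network and rank accounts by in-degree centrality as candidate amplification hubs.

```python
# Sketch (assumed Twitter v1.1 JSON layout): build a mention network from the
# collected tweets and surface the most-mentioned accounts.
import json
import networkx as nx

G = nx.DiGraph()
with open("hashtag_sample.jsonl") as f:
    for line in f:
        tweet = json.loads(line)
        source = tweet["user"]["screen_name"]
        for mention in tweet.get("entities", {}).get("user_mentions", []):
            G.add_edge(source, mention["screen_name"])

# Accounts that many others point at are candidate amplification hubs.
centrality = nx.in_degree_centrality(G)
for account, score in sorted(centrality.items(), key=lambda kv: -kv[1])[:10]:
    print(f"{account}\t{score:.3f}")
```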
ANALYSIS: BEHAVIOURS
ANALYSIS: RELATIONSHIPS
EXPERT TRACKERS
@katestarbird #digitalsherlocks @josh_emerson
@conspirator0 @r0zetta @fs0c131y
WHY BUILD
FRAMEWORKS?
… what do we do with
them?
COMPONENTWISE UNDERSTANDING AND RESPONSE
• Lingua Franca across communities
• Defend/countermove against reused techniques, identify gaps in attacks
• Assess defence tools & techniques
• Plan for large-scale adaptive threats (hello, Machine Learning!)
• Build an alert structure (e.g. ISAC, US-CERT, Interpol)
WE NEED TO DESIGN AND SHARE RESPONSES
WE NEED TO BUILD COMMUNITIES
● Industry
● Academia
● Media
● Community
● Government
WE NEED INTELLIGENCE SHARING AND COORDINATION
WE NEED FRAMEWORKS
SPECIAL THANK YOUS
THANK YOU
Sara “SJ” Terp
Bodacea Light Industries
sarajterp@gmail.com
@bodaceacat
CDR Pablo C. Breuer
U.S. Special Operations Command / SOFWERX
Pablo.Breuer@sofwerx.org
@Ngree_H0bit
Community
• Parody-based counter-campaigns (e.g. riffs on “Q”)
• SEO-hack misinformation sites
• Dogpile onto misinformation hashtags
• Divert followers (typosquat trolls, spoof messaging etc)
• Identify and engage with affected individuals
• Educate, verify, bring into the light
Offense: Potentials for Next
• Algorithms + humans attack algorithms + humans
• Shift from trolls to ‘nudging’ existing human communities
(‘useful idiots’)
• Subtle attacks, e.g. ‘low-and-slows’, ‘pop-up’, etc.
• Massively multi-channel attacks
• More commercial targets
• A well-established part of hybrid warfare
Defence: Potential for next
• Strategic and tactical collaboration
• Trusted third-party sharing on fake news sites / botnets
• Misinformation version of ATT&CK, SANS20 frameworks
• Algorithms + humans counter algorithms + humans
• Thinking the unthinkable
• “Countermeasures and self-defense actions”
Non-state
Misinformation
Indexing, not Censorship