Threat analysis-perception


Published in: News & Politics, Technology
  • What is a Threat Analysis?
    - list everything you can think of that could threaten the integrity and/or accuracy of the voting experience
    - analysis: is it plausible? Is it difficult? What damages could occur? What repercussions would follow?
    - countermeasures: what preparations need to be put into place? What plan of action is associated with each realistic threat?
    - future preparations: what can we do in future elections to avoid such problems?
    NIST (National Institute of Standards and Technology): “to allow the US election community to participate in developing an analysis of threats to voting systems”
    - solicit and gather threat analysis and material
    - gather critical analysis of collected threats, plausibility of certain scenarios
    - outline common assumptions made
    - advocate a direction to take in risk management/threat resolution
  • A system of:
    IT: new technologies must be stable, reliable, correct...
    Politics: results must be accurate or chaos would consume the government
    Duty: the government must provide a system in which every citizen can vote—it’s their right
    Trust: citizens must trust the system that the government provides or they won’t use it
    Inclusion: all citizens, no matter who they are, must have the opportunity to vote...introduces many dimensions of interfaces and processes for voting, as well as new holes in security
    Safety: a citizen must not be at risk to vote. Collusion, gangs, threats, terrorism, bioterrorism, bomb threats, financial loss...
    Process: a strict, regimented process must be in place.
    Precedence: if such a wide-scale system could work, it would raise the bar for many such systems worldwide
  • Identifying attackers: who they are, whether they are citizens of this country, whether they act deliberately or by accident, what resources they have...
    Identifying goals: what they can gain
  • Steps to creating an Attack Tree:
    Identify possible attack goals; each is a separate tree
    Refine tree over time with more perspectives and background research
    Fill in node values: costs, likelihood, resources available, setting of voting day…
    Make security decisions to decide which factor(s) are most important, weigh options
    Create approach:
    Is the system’s goal under attack?
    Is the system extremely vulnerable to a certain type of attack? Like password guessing…?
    Can new assumptions be grouped?
    Key security issues: 1024-bit encryption or 2048-bit? It turns out that’s not the issue: the attack tree describes attacks more realistic and feasible than decrypting the passwords
    Documentation for historical, legal purposes, easier to train employees for a worst case scenario type thing
    Knowing the vulnerabilities of a system inside and out provides a great knowledge of the system as a whole, all of its components, all of its agents, etc
    Swimming in documentation for every part of the system from start to finish
    Can only react to attacks they can think of...not the unpredictable
    How to put a number or cost on one factor over another? An attack plan involving weapons of mass destruction has a higher overall cost to the election results and population as a whole, but may have a lower likelihood than a Trojan horse embedded by the developers.
    PICTURE: PGP = Pretty Good Privacy. What if a message had a PGP header? The major branches listed are:
    Decrypt the message itself
    Determine symmetric key used to encrypt the message by other means
    Get recipient to (help) decrypt message
    (RSA = public key algorithm)
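The cost-propagation step in the list above can be sketched as a small tree structure. This is a minimal illustration, not the slides' actual tree: the node names and cost figures below are hypothetical, and the rule used (OR nodes take the cheapest child, AND nodes sum their children) is one common convention for attack-tree costs.

```python
# Minimal attack-tree sketch: OR nodes take the cheapest child,
# AND nodes sum their children (one common cost-propagation rule).
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    name: str
    kind: str = "OR"          # "OR", "AND", or "LEAF"
    cost: float = 0.0         # only meaningful for leaves
    children: List["Node"] = field(default_factory=list)

def propagate_cost(node: Node) -> float:
    """Return the cheapest attacker cost to achieve this node's goal."""
    if not node.children:
        return node.cost
    child_costs = [propagate_cost(c) for c in node.children]
    return min(child_costs) if node.kind == "OR" else sum(child_costs)

# Hypothetical goal with the PGP branches named in the notes; costs invented.
goal = Node("read PGP-encrypted message", "OR", children=[
    Node("decrypt the message itself", "LEAF", cost=1_000_000),
    Node("determine symmetric key by other means", "LEAF", cost=100_000),
    Node("get recipient to (help) decrypt", "AND", children=[
        Node("locate recipient", "LEAF", cost=100),
        Node("coerce or trick recipient", "LEAF", cost=5_000),
    ]),
])

print(propagate_cost(goal))  # cheapest overall path for the attacker
```

With these invented numbers, the social branch (5,100) dominates the cryptographic ones, which is exactly the point of the 1024-vs-2048-bit remark above: the tree surfaces the realistic attack.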
  • Doug Jones’ “Threat Catalog”: attempts to document a list of all threats/attacks
    - for each attack, identify vulnerabilities exploited
    - for each vulnerability, identify the defenses in place
    - if not all attacks are blocked by some defense, ADD DEFENSES
    - the Threat Catalog uses both a Vulnerability Catalog and Defense Catalog
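The Threat Catalog rule above (every attack must be blocked by some defense, else add defenses) can be sketched as a coverage check. The attack, vulnerability, and defense names below are illustrative placeholders, not entries from the actual catalog.

```python
# Sketch of the coverage rule: an attack is unblocked if any
# vulnerability it exploits has no catalogued defense.
# All names here are illustrative, not from the real catalog.
attack_vulns = {
    "chain voting": ["ballot leaves polling place"],
    "disoriented scanner": ["editable scanner config"],
}
vuln_defenses = {
    "ballot leaves polling place": ["serial-numbered ballots"],
    "editable scanner config": [],   # no defense catalogued yet
}

unblocked = [attack for attack, vulns in attack_vulns.items()
             if any(not vuln_defenses.get(v) for v in vulns)]

if unblocked:
    print("ADD DEFENSES for:", unblocked)
```

This also shows why the Threat Catalog needs both a Vulnerability Catalog and a Defense Catalog: the coverage check joins all three.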
  • 20% of computer vulnerabilities are local, meaning they are not exploited over the internet.
    Interconnecting components:
    USER INFO: Registration database in a centralized location -> individual state -> polling precinct location -> poll worker
    VOTE: User -> interface of DRE -> back end of DRE -> physical component connecting DRE to server/tally counter -> tally counter server
    TALLYING: server -> interface of tally counting software -> person who uses the software
    Each is a different user/agent, very hard to predict
    Polling place access (intimidation, violence)
    Voter manipulation (repeat voting)
    Ballot manipulation prior to tabulation
    Threats to the tabulation process itself
    Threats to the result of the tabulation process
    Trusting the different parts that interconnect…especially the user!
    Systems engineering from an IT perspective is relatively new: how do you weave together all components of a system? Nobody quite knows yet. Requirements gathering in software engineering is a dynamic science as well, as are design and determining whether a system is complete.
    How to tell if the system was even tainted?
    What to do if it was? A leak to the press loses voter confidence: the Florida election. No more Leno jokes.
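Each `->` in the chains above is a handoff between different agents, and every handoff is a trust boundary that must be analyzed separately. A trivial sketch of enumerating those boundaries (stage names taken from the chains in the notes):

```python
# Every adjacent pair of stages in a chain is a handoff between agents,
# i.e. a trust boundary to analyze. Stage names are from the notes above.
chains = {
    "USER INFO": ["registration database", "individual state",
                  "polling precinct", "poll worker"],
    "VOTE": ["user", "DRE interface", "DRE back end",
             "DRE-to-server link", "tally server"],
    "TALLYING": ["server", "tally software interface", "software operator"],
}

for name, stages in chains.items():
    boundaries = list(zip(stages, stages[1:]))  # adjacent handoffs
    print(f"{name}: {len(boundaries)} trust boundaries")
    for src, dst in boundaries:
        print(f"  {src} -> {dst}")
```

Even this toy enumeration gives nine distinct boundaries across three chains, which is why "each is a different user/agent, very hard to predict."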
  • 1. CHAIN VOTING: the bad guy gets a blank real ballot (or a counterfeit one, or steals one, or gets an absentee ballot), then subverts a voter by any means necessary: the voter obtains a new blank ballot for himself but casts the already-completed one, and brings the blank ballot back to the bad guy. Rinse and repeat.
    2. VOTES ON A ROLL: on one roll, you can easily see who voted in what order...
    3. DISORIENTED OPTICAL SCANNER: a vote is counted by reading the row and column coordinates from a ballot. Tweak those just slightly and you’ve got a new vote. Easily do this by editing a few numbers in the scanner’s configuration file.
    4. When A Number 2 Pencil Is Not Enough: You can recalibrate a reader to be sensitive about the gradient shading of a bubble/vote. You can discount the ones that are too dark or too light. If you discounted those that are too dark, you could be a poll worker, selectively telling people to make sure they press really hard on the pencil and fill in the entire bubble. Also – smudging/smearing, messy erasers, unidentified substances that are picked up off the ballot...
    5. WHERE DID WE GET THESE POLL WORKERS? Purposefully tainting the election: favoring one party or candidate, wrongfully turning away legitimate voters, wrongfully admitting illegitimate voters, failure to properly administer provisional ballots, failure to give proper instruction to voters that need it, failure to handle spoiled ballots properly. Rates of errors due to poll workers as high as 10% in some precincts.
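The "disoriented optical scanner" attack (item 3 above) can be sketched concretely: if a mark's position is mapped to a candidate through offsets read from a configuration file, shifting one offset by a single row silently remaps every vote. The layout, config keys, and candidate names below are hypothetical, for illustration only.

```python
# Sketch of the "disoriented optical scanner": a mark's (row, col) is
# mapped to a candidate via offsets from a config file, so editing a
# single config number remaps votes. All values here are illustrative.
ballot_layout = {
    (0, 0): "Candidate A",
    (1, 0): "Candidate B",
}

def read_vote(mark_row: int, mark_col: int, config: dict) -> str:
    """Translate a detected mark into a vote using the scanner config."""
    cell = (mark_row + config["row_offset"], mark_col + config["col_offset"])
    return ballot_layout.get(cell, "residual vote")

honest = {"row_offset": 0, "col_offset": 0}
tampered = {"row_offset": 1, "col_offset": 0}   # "tweaked just slightly"

print(read_vote(0, 0, honest))    # the voter's intended candidate
print(read_vote(0, 0, tampered))  # the same mark, now a different vote
```

The same idea underlies the "disoriented touch screen" (item 7 below): a small, plausible-looking calibration change shifts which target a touch or mark lands on.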
  • 6. FELL OFF THE DELIVERY TRUCK: full access to those machines while in transit to the precinct. Who is driving the truck? What if they got fake ones somewhere in-between? What happens if just one machine was stolen? What could they do with the data intercepted from an old machine? Quantity of machines to be delivered...!
    7. DISORIENTED TOUCH SCREEN: recalibration of touch-sensor technology is frequent! Example: palm pilots need to be calibrated every now and then, or the stylus’s pinpointing abilities are very inaccurate. With a system that could get hundreds of pokes a day, can we make sure someone will test its calibration? Or could they throw it away?? Or miscalibrate it, like the case of the Disoriented Optical Scanner?
    8. THE CONFUSING BALLOT: maybe too many boxes and arrows and bubbles to keep the candidates straight...or maybe the voter is not competent enough to understand how the ballot is supposed to work...either way, it’s a tactic used to make more residual or erroneous votes in a certain precinct or jurisdiction
    9. THIRD PARTY WHOOPSIES: running the voting software on top of another OS, or through a COTS product (x window manager) with unknown problems, possible injection of code into THAT program that would affect voting system...
    10. XRAY VISION: bad guy uses an electromagnetic emanation detector that comes from DRE and sits in a van outside. Bad guy could intimidate voters before they go in, says “I can see you and your vote, so if you don’t, I’ll get you...” Lots of costly equipment required, but subtle and covert.
  • 11. OOPS CODE: hopefully (!) accidents in development. Swapping the yes/no bubble by accident (California), trial run of one system, votes cast in Spanish were not counted at all, only those in English
    12. SECURE WIRELESS CONNECTIONS: a Pringles potato chip can has been listed as a “highly effective receiver for wi-fi traffic”...likelihood of detection is very low. Wi-fi is often built in to new laptops, which is what DREs are built on...solution: use of a Faraday cage
    13. WHEN HELP ISN’T HELPFUL: the addition of a new agent in the process introduces MANY MANY holes in security. Looking over shoulders, intimidating the voter, corresponding with the bad guy...disabled voters...
    14. TROJAN HORSE: requires bad guy to be the programmer...or does he?? Wi-fi connection, exposed usb drive...could be in the tally server too!
    15. REPLACEABLE FIRMWARE: could result in a new bootable program, taking over hardware or installing Trojan horse...
  • 16. UNFINISHED VOTE: If a voter walks away, angry at the machine, or goes to answer their cell phone, or runs away to chase their child, or thinks that they are done, that vote is exposed! Anyone can come up behind them and take the vote. The bad guy could very well be a poll worker, who looks like he/she is canceling the vote, but may really be casting it.
    17. I THINK I KNEW WHAT THEY MEANT...Trojan that might swap names and candidates or parties and pictures, swaps indices in backend tallying database, consistency with disabled persons’ ballots
    18. GROUP CONSPIRACY: voters from party B go early to vote at a precinct dominated by party A. They register successfully, since that’s their precinct, but no matter how many times they try to verify their ballot, it never comes up to what they want. No one else can look, so election officials have no choice but to remove the machines from service, shutting down the polling place for the day.
    19. TYPO: Too many people regard typos as just that...may trust their vote rather than the verification. Or a misspelled last name, or the wrong digit of a social security number.
    20. DENIAL OF SERVICE ATTACK: too many packets being sent to the tallying server...or is it? Is someone trying to attack the precinct?
  • “Recommendations of the Brennan Center for Justice and the Leadership Conference on Civil Rights for Improving Reliability of Direct Recording Electronic (DRE) Voting Systems”:
    - Tamper Tape: ensures that a system is up-to-date and pure
    - “independent expert security team” who will inspect the system top to bottom. Full access to:
    - hardware/firmware
    - software code
    - procedural protocols
    - design documentation
    - back-end system details
    - copies of all software design documents (and other docs) to aid in navigation through the source code
    - complete documentation on how the source code is converted to object code: compilers, compiler options used, libraries, configuration parameters
    - complete version history: change log
    - outstanding bugs, known vulnerabilities or limitations
    - documentation on tests: type, results, version of code they were run on
    - program suites – developing environment
    - regression protocols
    RED TEAM: team of analysts who try to attack the system:
    hardware: to avoid attacks that might change critical settings, install malicious drivers, or otherwise tamper with terminals or tally servers, leave exposed drives or insufficient locks...RECOMMENDATION: use of a tamper tape to make sure breaches are DETECTABLE, replacement of hardware is POSSIBLE, and new security procedures to replace hardware flaws will HAPPEN.
    Hardware/firmware configuration assessment: how hardware/firmware components are connected. This includes the ROM, like bootable code...RECOMMENDATIONS: Red Team exercises to make sure of proper locks with unique keys and pwds, make sure network access is not available through modems, Ethernet ports, or other points between hardware components; machines are only bootable off a secure drive, as opposed to a CD or floppy. Use of a tamper tape.
    Software Design: (1) good faith flaws – poor programming practices (pwds or encryption keys not hidden from the everyday user), bad code, (2) malicious code hidden within system – count votes erroneously, purposely leave room for backdoors, record voting or user statistics in an undocumented way...RECOMMENDATIONS: security team – review code with AUTHENTICATION, ENCRYPTION, and ACCESSIBILITY to certain private files in mind
    Software Configuration (ways that software components work together): anti-virus software must be present and up-to-date. RECOMMENDATIONS: an expert team analyzes the entire system to see how data flows from one element to another; review patches to the system and the anti-virus software used in servers and terminals; review the procedures for updating software – auto-updates from anti-virus? Rule all remote software upgrades an unacceptable risk...
    Voting Procedures (not hardware or software...people and process): any procedures used that can facilitate security breaches or machine malfunctions or fail to stop them. Absence of adequate security procedures (using only one encryption key or password for all machines instead of one per machine), poor implementation of adequate procedures for training of poll workers, departures from protocol by unforeseen circumstances. ** Maryland example in this report: RABA investigators found that “all 32,000 of Maryland’s touch screen terminals had the same locks and keys, making every machine accessible to anyone with the keys. The keys could also be easily reproduced at three local hardware stores...”** RECOMMENDATIONS: development of standard operating procedures, respond EARLY to security incidents, alleged or real; these INCREASE CONFIDENCE by “providing factual information to replace rumor, innuendo, fear, uncertainty, and doubt..”
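The recommendations above center on making tampering DETECTABLE (tamper tape, version histories, documented builds). A software analogue, sketched here under the assumption that a known-good fingerprint of each deployed component is recorded in advance, is verifying hashes before the polls open; the file name and manifest below are illustrative.

```python
# A software analogue of "tamper tape": record a known-good SHA-256
# fingerprint of each deployed component, then verify it before use.
# The manifest contents and file name are illustrative.
import hashlib

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Recorded at certification time, stored separately from the machines.
manifest = {"ballot_def.xml": fingerprint(b"precinct 12 layout v3")}

def verify(name: str, data: bytes) -> bool:
    """True only if the deployed bytes match the certified fingerprint."""
    return manifest.get(name) == fingerprint(data)

print(verify("ballot_def.xml", b"precinct 12 layout v3"))    # intact
print(verify("ballot_def.xml", b"precinct 12 layout v3!!"))  # tampered
```

Like tamper tape, this does not prevent a breach; it makes the breach visible, which is what the recommendations say restores confidence with "factual information."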
    1. Threat Analysis – Lunar Security Services
    2. Overview: Definitions; Representation; Challenges; “The Unthinkable”; Strategies & Recommendations
    3. Background: What is threat analysis? Potential attacks/threats/risks; analysis; countermeasures; future preparations. NIST’s “Introduction to Threat Analysis Workshop”, October 2005
    4. Stakes. People: voters, candidates, poll workers, political groups, developers, board of elections, attackers, more... Voting, a system of: IT, American politics, duty, trust, inclusion, safety, process, precedence...if it works
    5. Means of Representation. General tactic: identify possible attackers; identify goals of attacker; enumerate possible ways to achieve goals; locate key system vulnerabilities; create resolution plan
    6. Attack Tree. Bruce Schneier, Dr. Dobb’s Journal, 1999: used to “model threats against computer systems”. Simple example; cost propagation; multiple costs. Continual breaking down of goals and means to achieve them
    7. Attack Tree Evaluation. Creation: refining over time, realistic costs. Advantages: identifies key security issues, documents plans of attack and likelihood, knowing the system. Disadvantages: amount of documentation, can only ameliorate foreseen circumstances, difficult to prioritize/quantize factors. (Shortened version of an attack tree for the interception of a message sent with a PGP header.)
    8. Other Means of Representation. Threat Catalog (Doug Jones): attacks -> vulnerabilities -> analysis of defense; challenges: organization, technology, identity, scale of attack. Fault Tree Analysis: ensures product performance from software; attempts to avoid single-point, catastrophic failures
    9. Challenges. Vulnerabilities: system, process. Variety of possible attacks; new field: systems engineering; attack detection; attack resolution -> too many dimensions to predict all possibilities, but we’ll try to name a few…
    10. “The Unthinkable”, Part 1: 1. Chain Voting; 2. Votes On A Roll; 3. The Disoriented Optical Scanner; 4. When A Number 2 Pencil Is Not Enough; 5. ...we found these poll workers where?
    11. “The Unthinkable”, Part 2: 6. This DRE “fell off the delivery truck”...; 7. The Disoriented Touch Screen; 8. The Confusing Ballot (Florida 2000 Election); 9. Third Party “Whoopsies”; 10. X-ray vision through walls of precinct
    12. “The Unthinkable”, Part 3: 11. “Oops” code; 12. Do secure wireless connections exist?; 13. I’d rather not have your help, thanks...; 14. Trojan Horse; 15. Replaceable firmware on Optical Scanners (Natalie Podrazik)
    13. “The Unthinkable”, Part 4: 16. Unfinished vote = free vote for somebody else; 17. “I think I know what they meant by...”; 18. Group Conspiracy: “These machines are broken.”; 19. “That’s weird. It’s a typo.”; 20. Denial of Service Attack
    14. My Ideas...: write-in bomb threat, terrorist attack, backdoor code; swapping of candidate boxes (developers) at last minute on touch-DRE, voters don’t know the difference; children in the voting booth
    15. Strategies & Recommendations: create Fault Trees to counter Attack Tree goals using the components set forth in the Brennan Study; Tamper Tape; use of “independent expert security team” (inspection, assessment, full access); use of “Red Team Exercises” on: hardware design, hardware/firmware configuration, software design, software configuration, voting procedures (not hardware or software, but people and process)
    16. Conclusions: Attack Trees identify agents, scenarios, resources, system-wide flaws; challenges: dimensions in system analysis; unforeseen circumstances; independent team of experts, but how expert can they be?
    17. Works Cited:
       1. All 20 “The Unthinkable” scenarios. Online.
       2. Goldbrick Gallery’s 25 Best Editorial Cartoons of 2004. Online.
       3. Jones, Doug. “Threat Taxonomy Overview” slides, from the NIST Threats to Voting Workshop, 7 October 2005. Online.
       4. Mell, Peter. “Handling IT System Threat Information” slides, from the NIST Threats to Voting Workshop, 7 October 2005. Online.
       5. “Recommendations of the Brennan Center for Justice and the Leadership Conference on Civil Rights for Improving Reliability of Direct Recording Electronic Voting Systems” (endations.pdf).
       6. Wack, John, and Skall, Mark. “Introduction to Threat Analysis Workshop” slides, from the NIST Threats to Voting Workshop, 7 October 2005. Online.
       7. Wikipedia entry for fault tree.