2008 12 08 2008 Privacy

A general talk on privacy in early 2009, with quite a few slides summarizing the US National Research Council's report "Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Program Assessment" that was issued in late 2008.

Presentation Transcript

  • Privacy
    • Lance J. Hoffman
    • Distinguished Research Professor Computer Science Department
    • The George Washington University
    • Washington, DC
    • [email_address]
  • References (others on the cited websites and other sources)
    • Prof. Lorrie Cranor, CMU
    • Robert Belair, Esq., Oldaker, Biden & Belair, Privacy Consulting Group
    • Kenneth Mortensen and Rebecca Richards, DHS Privacy Office
    • Ann Cavoukian, Information & Privacy Commissioner of Ontario
    • Peter J Reid, EDS Chief Privacy Officer
    • Alan F. Westin, Columbia University and Privacy Consulting Group
    • Robert Ellis Smith. 2000. Ben Franklin’s Web Site: Privacy and Curiosity from Plymouth Rock to the Internet. Providence: Privacy Journal.
    • Alan Westin. 1967. Privacy and Freedom . New York: Atheneum.
  • How Likely Are New Federal Privacy Laws?
    • Expectations of the new Congress
      • Democrats expecting to pick up 10 to 15 House seats, 5 to 10 Senate seats
      • Presidential race closing but looks like Obama (Oct 16) [lots can happen in 3 weeks]
      • With larger Democratic majorities and pent-up demand for privacy legislation, there is an expectation for heightened activity
      • But , all of the obstacles to enactment of privacy legislation will remain in place
    • Obstacles to enactment of privacy legislation
      • Jurisdictional complications
      • Relatively attractive alternatives
        • State legislation
        • State and federal regulatory action
        • Litigation
        • External pressure – advocacy groups, media, international pressure
        • Self-regulatory codes
    • Privacy topics likely to attract legislative activity
      • Health care reform legislation will require privacy action on electronic health records and, perhaps, HIPAA reform
      • Immigration reform will raise significant ID authentication and related privacy issues
      • Numerous types of financial privacy issues will be likely to receive legislative attention
      • The use and availability of intelligence and surveillance type reports will be likely to receive legislative attention
      • Online privacy, behavioral profiling and social networking
      • Public record data
      • Personal tracking data; video surveillance; GPS; and black boxes
  • Alan Westin’s four states of privacy
    • Solitude
      • individual separated from the group and freed from the observation of other persons
    • Intimacy
      • individual is part of a small unit
    • Anonymity
      • individual in public but still seeks and finds freedom from identification and surveillance
    • Reserve
      • the creation of a psychological barrier against unwanted intrusion - holding back communication
  • Privacy Considerations in the New Information World “About 2004, the Information World Began to Change – in Ten Dimensions” – Alan F. Westin
    • The all-pervasive Internet 2.0
    • “Identity crisis” and data breaches
    • Social networking and video posting
    • The Blogosphere
    • Behavioral target marketing
    • The mobile revolution
    • Anti-Terrorist Surveillance
    • Monitoring and photographing public spaces
    • Electronic patient health records
    • In the U. S., a growing culture rejecting privacy constraints
  •  
    • “THE TERM ‘PRIVACY’ CAN BE USEFUL AS A SHORTHAND TO REFER TO A RELATED CLUSTER OF PROBLEMS, BUT BEYOND THIS USE, THE TERM ADDS LITTLE.”
    • “A THEORY OF PRIVACY MUST PROVIDE GUIDANCE AS TO PRIVACY’S VALUE. … PRIVACY… DOES NOT HAVE A UNIFORM VALUE. ITS VALUE MUST BE WORKED OUT AS WE BALANCE IT AGAINST OPPOSING INTERESTS.”
    • “PRIVACY HAS A SOCIAL VALUE AND… ITS IMPORTANCE EMERGES FROM THE BENEFITS IT CONFERS UPON SOCIETY. THE VALUE OF AMELIORATING PRIVACY PROBLEMS LIES IN THE ACTIVITIES THAT PRIVACY PROTECTIONS ENABLE.”
          • -- all from Understanding Privacy , Harvard U. Press, 2008
    A more complex view of privacy makes it even more difficult to regulate (or program for); Solove says privacy has no core characteristics, advocates problem-solving approach
  • Federal and state laws and regulations Warning: IANAL
    • Constitutional law governs the rights of individuals with respect to the government
    • Tort law governs disputes between private individuals or other private entities
    • Federal statutes
      • Tend to be narrowly focused or sector-specific
    • State law
      • State constitutions may recognize explicit right to privacy (California, Georgia, Hawaii)
      • 44 U.S. states now have Identity Theft Notification laws
      • New Data Breach and Encryption Laws (Massachusetts requires encrypting sensitive data on laptops and other portable devices (phones?) effective January 2009)
      • Many states have or are considering laws restricting use of SSN
    • Local laws and regulations
      • Some counties are redacting SSNs in online real estate documents, etc.
  • Privacy laws vary around the world
    • US has mostly sector-specific laws, with relatively minimal protections - often referred to as “patchwork quilt”
    • Fair Credit Reporting Act
    • Privacy Act
    • Freedom of Information Act
    • Family Educational Rights and Privacy Act
    • Right to Financial Privacy Act
    • Cable Communications Privacy Act
    • Electronic Communications Privacy Act
    • Video Privacy Protection Act
    • FCC TCPA & CPNI Rules
    • Driver’s Privacy Protection Act
    • Telecommunications Act
    • Children’s Online Privacy Protection Act
    • Wireless Communications and Public Safety Act
    • Gramm Leach Bliley Act
    • Health Insurance Portability & Accountability Act
    • FTC Do Not Call Registry & Telemarketing Rules
    • CAN-SPAM Act
    • Fair & Accurate Credit Transactions Act (FACTA)
    • Some Legislation Passed in Current Session of Congress
      • May 2008: The Genetic Information Nondiscrimination Act of 2008
      • July 2008: Foreign Intelligence Surveillance Amendments Act of 2008
    U.S. Privacy Laws Place Few If Any Restrictions on Trans-Border Data Flow
  • Privacy laws vary around the world
    • European Data Protection Directive requires all European Union countries to adopt similar comprehensive privacy laws that recognize privacy as fundamental human right
      • Privacy commissions in each country (some countries have national and state commissions)
      • Many European companies non-compliant with privacy laws (2002 study found majority of UK web sites non-compliant)
    • Safe harbor: US companies self-certify adherence to requirements; EU reserves right to renegotiate if remedies for EU citizens prove to be inadequate
    Most International Privacy Laws Place Restrictions on Trans-Border Data Flow
  • COMMON ELEMENTS IN PRIVACY FRAMEWORKS
    • http://usacm.acm.org/usacm/Issues/Privacy.htm
    • http://www.ftc.gov/reports/privacy3/
    • http://www.oecd.org/document/18/0,3343,en_2649_34255_1815186_1_1_1_1,00.html
    • http://aspe.hhs.gov/DATACNCL/1973privacy/tocprefacemembers.htm
    • USACM Policy Recommendations on Privacy (2006) | US FTC Simplified Principles (1998) | OECD 1980 | US Fair Information Practices (HEW 1973)
    • Accountability (4 recommendations) | Recourse and Remedies | Enforcement, Accountability, Recourse | Accountability and Auditing
    • Security (2 recommendations) | Data Security | Security, Information Quality, and Integrity | Data Quality and Integrity; Security
    • Access (3 recommendations), Accuracy (4 recommendations) | Data Quality and Access | Individual Participation and Access | Individual Participation
    • Minimization (5 recommendations), Consent (2 recommendations) | Choice and Consent | Choice and Consent | Minimization; Use Limitation
    • Openness (6 recommendations) | Notice and Disclosure | Notice and Awareness | Transparency; Purpose Specification
  • Privacy policies and related issues
    • Policies let consumers know about site’s privacy practices
    • Consumers can then decide whether or not practices are acceptable, when to opt-in or opt-out, and who to do business with
    • The presence of privacy policies increases consumer trust
    • Policies are often difficult to understand, hard to find, take a long time to read, change without notice
  • Facebook’s Privacy Policy
    • … when printed out is nine pages (3,753 words) long
    • Will someone please tell me what its third-party privacy policy (1,212 words below) means?
    Sharing Your Information with Third Parties Facebook is about sharing information with others — friends and people in your networks — while providing you with privacy settings that restrict other users from accessing your information. We allow you to choose the information you provide to friends and networks through Facebook. Our network architecture and your privacy settings allow you to make informed choices about who has access to your information. We do not provide contact information to third party marketers without your permission. We share your information with third parties only in limited circumstances where we believe such sharing is 1) reasonably necessary to offer the service, 2) legally required or, 3) permitted by you. For example: Your News Feed and Mini-Feed may aggregate the information you provide and make it available to your friends and network members according to your privacy settings. You may set your preferences for your News Feed and Mini-Feed on your Privacy page. Unlike most sites on the Web, Facebook limits access to site information by third party search engine "crawlers" (e.g. Google, Yahoo, MSN, Ask). Facebook takes action to block access by these engines to personal information beyond your name, profile picture, and limited aggregated data about your profile (e.g. number of wall postings). We may provide information to service providers to help us bring you the services we offer. Specifically, we may use third parties to facilitate our business, such as to host the service at a co-location facility for servers, to send out email updates about Facebook, to remove repetitive information from our user lists, to process payments for products or services, to offer an online job application process, or to provide search results or links (including sponsored links). In connection with these offerings and business operations, our service providers may have access to your personal information for use for a limited time in connection with these business activities. Where we utilize third parties for the processing of any personal information, we implement reasonable contractual and technical protections limiting the use of that information to the Facebook-specified purposes. If you, your friends, or members of your network use any third-party applications developed using the Facebook Platform ("Platform Applications"), those Platform Applications may access and share certain information about you with others in accordance with your privacy settings. You may opt-out of any sharing of certain or all information through Platform Applications on the Privacy Settings page. In addition, third party developers who have created and operate Platform Applications ("Platform Developers"), may also have access to your personal information (excluding your contact information) if you permit Platform Applications to access your data. Before allowing any Platform Developer to make any Platform Application available to you, Facebook requires the Platform Developer to enter into an agreement which, among other things, requires them to respect your privacy settings and strictly limits their collection, use, and storage of your information. However, while we have undertaken contractual and technical steps to restrict possible misuse of such information by such Platform Developers, we of course cannot and do not guarantee that all Platform Developers will abide by such agreements. 
Please note that Facebook does not screen or approve Platform Developers and cannot control how such Platform Developers use any personal information that they may obtain in connection with Platform Applications . In addition, Platform Developers may require you to sign up to their own terms of service, privacy policies or other policies, which may give them additional rights or impose additional obligations on you , so please make sure to review these terms and policies carefully before using any Platform Application. You can report any suspected misuse of information through the Facebook Platform and we will investigate any such claim and take appropriate action against the Platform Developer up to and including terminating their participation in the Facebook Platform and/or other formal legal action. We occasionally provide demonstration accounts that allow non-users a glimpse into the Facebook world. Such accounts have only limited capabilities (e.g., messaging is disabled) and passwords are changed regularly to limit possible misuse. We may be required to disclose user information pursuant to lawful requests, such as subpoenas or court orders, or in compliance with applicable laws. We do not reveal information until we have a good faith belief that an information request by law enforcement or private litigants meets applicable legal standards. Additionally, we may share account or other information when we believe it is necessary to comply with law, to protect our interests or property, to prevent fraud or other illegal activity perpetrated through the Facebook service or using the Facebook name, or to prevent imminent bodily harm. This may include sharing information with other companies, lawyers, agents or government agencies. We let you choose to share information with marketers or electronic commerce providers through sponsored groups or other on-site offers. We may offer stores or provide services jointly with other companies on Facebook. You can tell when another company is involved in any store or service provided on Facebook, and we may share customer information with that company in connection with your use of that store or service. Facebook Beacon is a means of sharing actions you have taken on third party sites, such as when you make a purchase or post a review, with your friends on Facebook. In order to provide you as a Facebook user with clear disclosure of the activity information being collected on third party sites and potentially shared with your friends on Facebook, we collect certain information from that site and present it to you after you have completed an action on that site. You have the choice to have Facebook discard that information, or to share it with your friends. To learn more about the operation of the service, we encourage you to read the tutorial here . To opt out of the service altogether, click here . Like many other websites that interact with third party sites, we may receive some information even if you are logged out from Facebook, or that pertains to non-Facebook users, from those sites in conjunction with the technical operation of the system. In cases where Facebook receives information on users that are not logged in, or on non-Facebook users, we do not attempt to associate it with individual Facebook accounts and will discard it. 
If the ownership of all or substantially all of the Facebook business, or individual business units owned by Facebook, Inc., were to change, your user information may be transferred to the new owner so the service can continue operations. In any such transfer of information, your user information would remain subject to the promises made in any pre-existing Privacy Policy. When you use Facebook, certain information you post or share with third parties (e.g., a friend or someone in your network), such as personal information, comments, messages, photos, videos, Marketplace listings or other information, may be shared with other users in accordance with the privacy settings you select. All such sharing of information is done at your own risk. Please keep in mind that if you disclose personal information in your profile or when posting comments, messages, photos, videos, Marketplace listings or other items, this information may become publicly available.
    • Privacy policies typically require college-level reading skills to understand
    • Privacy policies often include legalese and obfuscated language
  • “Short privacy notices” (Hunton & Williams)
    • Reduce privacy policy to at most seven boxes in standard format
    • Privacy advocates prefer check boxes
    • Idea adopted at 2003 International Conference of Data Protection & Privacy Commissioners
    • USG agencies interested for financial privacy notices
    • Example: Acme Company Privacy Notice Highlights (Dated: May 28, 2002)
      • SCOPE: This statement applies to Acme Company and several members of the Acme family of companies.
      • PERSONAL INFORMATION: We collect information directly from you and maintain information on your activity with us, including your visits to our website. We obtain information, such as your credit report and demographic and lifestyle information, from other information providers.
      • USES: We use information about you to manage your account and offer you other products and services we think may interest you. We share information about you with our sister companies to offer you products and services. We share information about you with other companies, like insurance companies, to offer you a wider array of jointly-offered products and services. We share information about you with other companies so they can offer you their products and services.
      • YOUR CHOICES: You may opt out of receiving promotional information from us and our sharing your contact information with other companies. To exercise your choices, call (800) 123-1234 or click on “choice” at ACME.com.
      • IMPORTANT INFORMATION: You may request information on your billing and payment activities.
      • HOW TO REACH US: For more information about our privacy policy, write to: Consumer Department, Acme Company, 11 Main Street, Anywhere, NY 10100, or go to the privacy statement on our website at acme.com.
  • Checkbox proposal (Source: Robert Gellman, July 3, 2003)
    • WE SHARE [DO NOT SHARE] PERSONAL INFORMATION WITH OTHER WEBSITES OR COMPANIES.
    • Collection (YES / NO):
      • We collect personal information directly from you
      • We collect information about you from other sources
      • We use cookies on our website
      • We use web bugs or other invisible collection methods
      • We install monitoring programs on your computer
    • Uses (With Your Consent / Without Your Consent): We use information about you to:
      • Send you advertising mail
      • Send you electronic mail
      • Call you on the telephone
    • Sharing (With Your Consent / Without Your Consent): We allow others to use your information to:
      • Maintain shared databases about you
      • Send you advertising mail
      • Send you electronic mail
      • Call you on the telephone (N/A)
    • Access: You can see and correct {ALL, SOME, NONE} of the information we have about you.
    • Choices: You can opt-out of receiving from (Us / Affiliates / Third Parties):
      • Advertising mail
      • Electronic mail
      • Telemarketing (N/A)
    • Retention: We keep your personal data for: {Six Months, Three Years, Forever}
    • Change: We can change our data use policy {AT ANY TIME, WITH NOTICE TO YOU, ONLY FOR DATA COLLECTED IN THE FUTURE}
  • Towards a privacy “nutrition label”
    • Standardized format
      • People learn where to look for answers to their questions
      • Facilitates side-by-side policy comparisons (see the sketch after this list)
    • Standardized language
      • People learn what the terminology means
    • Brief
      • People can get their questions answered quickly
    • Linked to extended view
      • People can drill down and get more details if needed
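    The standardized, machine-readable format argued for above can be sketched very simply. The following is an illustrative sketch only: the PrivacyLabel schema, its field names, and the compare() helper are invented for this example and are not part of any existing standard (e.g., not P3P). The point is that a fixed schema makes side-by-side comparison and the drill-down link straightforward.

```python
# Hypothetical sketch of a machine-readable privacy "nutrition label".
# Schema and field names are invented for illustration only.
from dataclasses import dataclass
from typing import List

@dataclass
class PrivacyLabel:
    site: str
    data_collected: List[str]            # e.g., ["contact", "billing", "location"]
    shared_with_third_parties: bool
    opt_out_available: bool
    retention: str                       # e.g., "6 months", "3 years", "indefinite"
    full_policy_url: str                 # the "extended view" to drill down into

def compare(a: PrivacyLabel, b: PrivacyLabel) -> None:
    """Print a crude side-by-side comparison of two labels."""
    for fld in ("data_collected", "shared_with_third_parties",
                "opt_out_available", "retention"):
        print(f"{fld:30} {str(getattr(a, fld)):30} {getattr(b, fld)}")

if __name__ == "__main__":
    acme = PrivacyLabel("acme.com", ["contact", "billing"], True, True,
                        "3 years", "https://acme.com/privacy")
    other = PrivacyLabel("example.org", ["contact"], False, True,
                         "6 months", "https://example.org/privacy")
    compare(acme, other)
```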
  • Managing Identity in the Future Much more professional networking
  • Managing Identity in the Future Much more social networking (too much?)
    • Used without asking permission of (that) Lance Hoffman or his friends
    • Hackers' Latest Target: Social Networking Sites, by Brian Krebs, Washingtonpost.com Staff Writer, Saturday, August 9, 2008; D01
    • LAS VEGAS -- Social networking sites such as Facebook, MySpace and LinkedIn are fast emerging as some of the most fertile grounds for malicious software, identity thieves and online mischief-makers. And while some of the talks given here at Black Hat, one of the larger hacker conferences in the country, would probably make most people want to avoid the sites altogether, it turns out that staying off these networks may not be the safest option, either. … Paradoxically, there may be a danger in remaining a social networking site Luddite. After all, if you don't claim a space on these networks, someone else may do it for you as a way of scamming or attacking your friends and business contacts. With the permission and good humor of security pioneer Marcus Ranum, Hamiel and Moyer created a LinkedIn profile on Ranum's behalf, including a photo of him and bits from his résumé to make the profile look legit. In less than 24 hours, more than 50 people had joined his LinkedIn network. Among those taken in by the stunt was Ranum's sister.
  • Building a System that Manages Identity
    • Landau, Susan and Deirdre Mulligan. “I’m Pc01002/SpringPeeper/ED288I.6; Who are you?”, IEEE: Security and Privacy 6.2 (March/April 2008): 13-15.
    • Hansen, Marit, Ari Schwartz, and Alissa Cooper. “Privacy and Identity Management”, IEEE: Security and Privacy 6.2 (March/April 2008): 38-45.
    • Determine whether identity is necessary
      • What is the application?
      • What are its uses?
      • What is the larger context?
    • If identity is necessary,
      • consider identity risks
        • What can go wrong with the system, or what are the initiators or initiating events (undesirable starting events) that lead to adverse consequences?
        • What and how severe are the potential problems or the adverse consequences?
        • How likely to occur are these undesirable consequences?
      • Discourage unnecessary linkages -- Ex: separate medical PII from other PII and from non-PII (see the sketch after this list)
    • Implement privacy and security during design (“build in, don’t bolt on”)
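    To make the “discourage unnecessary linkages” point above concrete, here is a minimal sketch, assuming a design (not described in the slides) in which each data store is keyed by a context-specific pseudonym derived with a keyed hash. The CONTEXT_KEYS names and the two in-memory stores are hypothetical.

```python
# Minimal sketch of de-linking PII stores with per-context pseudonyms.
# Context names, keys, and stores are hypothetical illustrations.
import hashlib
import hmac

CONTEXT_KEYS = {
    "medical": b"medical-context-secret",   # held by the medical system only
    "billing": b"billing-context-secret",   # held by the billing system only
}

def pseudonym(user_id: str, context: str) -> str:
    """Derive a context-specific identifier; different contexts do not link."""
    return hmac.new(CONTEXT_KEYS[context], user_id.encode(), hashlib.sha256).hexdigest()

medical_store = {}   # keyed by the medical pseudonym only: no name, SSN, or billing ID
billing_store = {}   # keyed by the billing pseudonym only

uid = "user-12345"
medical_store[pseudonym(uid, "medical")] = {"allergies": ["penicillin"]}
billing_store[pseudonym(uid, "billing")] = {"plan": "basic"}

# The two stores share no common key, so their records cannot be joined on an
# identifier without access to both context keys.
assert pseudonym(uid, "medical") != pseudonym(uid, "billing")
```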
  • Challenges and Solutions in Identity Management
    • Dhamija, Rachna and Lisa Dusseault. “The Seven Flaws of Identity Management”, IEEE: Security and Privacy 6.2 (March/April 2008): 24-29.
    • Identity management is not a goal in itself (give users what they want)
    • Users follow the path of least resistance (make it the secure path)
    • Reduce cognitive burden -- Think of how your system will be used in the larger context of other systems. Don’t replace one burden with another.
    • Reduce the number of trust decisions users have to make, since repeated user consent could lead to maximum information disclosure
    • Use mutual authentication (not just user authentication). Assume that your systems and users will be attacked and design your systems with that in mind.
    • Trust must be earned, so be trustworthy
  • Building a System that Manages Identity Adopt Trust-Enhancing Measures
    • Be a Trustworthy Gatekeeper so users will choose you over competition
    • Take advantage of previous work (don’t reinvent the wheel)
    • Ex: Microsoft Privacy Guidelines for Developing Software Products and Services
      • http://www.microsoft.com/downloads/details.aspx?FamilyId=C48CF80F-6E87-48F5-83EC-A18D1AD2FC1F&displaylang=en
      • Scenario 1: Transferring PII to and from the Customer’s System
      • Scenario 2: Storing PII on the Customer’s System
      • Scenario 3: Transferring Anonymous Data from the Customer’s System
      • Scenario 4: Installing Software on a Customer’s System
      • Scenario 5: Deploying a Website
      • Scenario 6: Storing and Processing User Data at the Company
      • Scenario 7: Transferring User Data Outside the Company
      • Scenario 8: Interacting with Children
      • Scenario 9: Server Deployment
  • Building a System that Manages Identity Adopt Trust-Enhancing Measures
    • Privacy is in the Security Development Lifecycle for computer software, so get to know and work with your security people; consider using a process like the SDL to build security and privacy together.
  • Privacy Management Insights
    • West, Ryan. “The Psychology of Security”, Communications of the ACM 51:4 (April 2008), pp. 34-40.
    • Work with, not against, human psychology
    • Users routinely multitask and, if bad things have not happened to them in the past, tend not to read relevant text (e.g., privacy statements)
    • Possible solutions:
    • Increase user awareness (outreach program)
    • Create alerts and messages that are distinguishable from other messages and have a higher level of importance when seen
    • These alerts should not get in the way of users’ primary goals (users are often in the middle of a task when the system asks them to make a security and privacy decision that may require diverting their attention to it)
    • Increase awareness that web traffic is being monitored
  • Chief privacy officers
    • Companies are increasingly appointing CPOs to have a central point of contact for privacy concerns
    • Role of CPO varies in each company
      • Draft privacy policy
      • Respond to customer concerns
      • Educate employees about company privacy policy
      • Review new products and services for compliance with privacy policy
      • Develop new initiatives to keep company out front on privacy issue
      • Monitor pending privacy legislation
  • Is this social or professional networking or both, and does it matter, and if so, why? Used with permission of my friend Harriet Pearson USE CDM I
  • How Are Security and Privacy Different?
    • Security (Protection Mechanisms):
      • Authentication
      • Access controls
      • Availability
      • Confidentiality
      • Integrity
      • Retention
      • Storage
      • Backup
      • Incident response
      • Recovery
    • Privacy (Personal Information-Handling Mechanisms):
      • “Individual Rights”
      • Fairness of Use
      • Notice
      • Choice
      • Access
      • Accountability
      • Security
    • Many Privacy Laws Also Restrict Trans-Border Data Flow of Personal Information
    • Attribution?
  • Technical Controls for Security and Privacy -- Authentication
    • Something you know
      • Passwords – traditional
      • Passwords – federated
      • Passphrases, images, challenge responses
    • Something you have
      • Physical (machine-readable) token
    • Something you are
    At Walt Disney World, biometric measurements are taken from the fingers of guests to ensure that the person's ticket is used by the same person from day to day (Source: Wikipedia)
  • Technical Controls for Security and Privacy: Authentication -- Biometrics
    • Wayman, James L. “Biometrics in Identity Management Systems”, IEEE: Security and Privacy 6.2 (March/April 2008): 30-37.
    • Biometrics reduce need for other identifiers
    • Still have to safeguard the data representing the biometrics
    • Helpful when used in conjunction with other items
      • (“multi-factor authentication”)
    • Example
      • Something you know
      • Something you have
      • Your location
      • Something you are
    Multi-factor Authentication (not identification); a minimal sketch follows
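    The multi-factor idea on this slide can be illustrated with a short sketch. Everything below is an assumption-laden illustration, not a production design: verify_password, verify_totp, and verify_biometric are hypothetical stand-ins for a salted password store, a time-based one-time-password token (the truncation follows the RFC 4226/6238 idea), and an external biometric matcher; the "two of three factors" rule is just one possible policy.

```python
# Illustrative sketch of multi-factor authentication (not identification).
# Helpers and thresholds are hypothetical stand-ins for real components.
import hashlib
import hmac
import struct
import time

def verify_password(supplied: str, salt: bytes, stored_hash: bytes) -> bool:
    # Something you know: compare a salted PBKDF2 hash in constant time.
    candidate = hashlib.pbkdf2_hmac("sha256", supplied.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored_hash)

def verify_totp(secret: bytes, code: str, step: int = 30) -> bool:
    # Something you have: time-based one-time password (RFC 4226/6238-style truncation).
    counter = struct.pack(">Q", int(time.time()) // step)
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    value = (int.from_bytes(digest[offset:offset + 4], "big") & 0x7FFFFFFF) % 1_000_000
    return hmac.compare_digest(f"{value:06d}", code)

def verify_biometric(match_score: float, threshold: float = 0.95) -> bool:
    # Something you are: the score is assumed to come from an external biometric matcher.
    return match_score >= threshold

def authenticate(password_ok: bool, token_ok: bool, biometric_ok: bool) -> bool:
    # One possible policy: require at least two independent factors.
    return sum([password_ok, token_ok, biometric_ok]) >= 2
```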
  • Managing Identity in the Future What about privacy in third party applications*?
    • Defaults are typically set to encourage a lot of sharing
    • Because third party application defaults may allow (and even coerce) even more sharing, there are app-checkers for privacy, written by unknown suppliers
    • But even these app-checkers may have permissive defaults
    • And, of course, it’s easy to share these, with their permissive defaults, with your “friends”
    “Third party applications” often involve “generative systems” (Jonathan Zittrain, The Future of the Internet and How to Stop It, Yale University Press, 2008)
  • Identity Management in the Future: More Dynamic Markets
    • Fitzgerald, Michael. “Predicting Where You’ll Go and What You’ll Like”, The New York Times, 22 June 2008.
    • Acquisti, Alessandro. “Identity Management, Privacy and Price Discrimination”, IEEE: Security and Privacy 6.2 (March/April 2008): 46-50.
        • Tracking people’s daily whereabouts using GPS information gathered from cell phones
          • Helps predict economic trends and gives business insight on where to open stores and have sales
          • Can empower buyers to set prices (eBay or by walking by [GPS triggers “opportunity alert” to store and to buyer])
          • Sellers can differentiate using anonymous credentials that allow merchant to price to a particular segment of consumer population (but “Soak the rich” algorithm will require high-valued customers (or their electronic agents) to be smart enough to recognize and deal with adverse price discrimination)
  • Government Surveillance of Citizens, Residents, Everyone?
    • Kafka, The Trial
    • Orwell, 1984
    • Maryland dissidents
    • Skype China room
    • SF AT&T Room
    • Recent NAS report
  • Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Program Assessment
    • National Research Council, October 2008, http://www.nap.edu/catalog.php?record_id=12452
    • Address the challenges of technology for countering terrorists, especially
          • Data mining and information fusion,
          • Available and emerging surveillance technologies and their IT support,
          • Behavioral surveillance,
          • Attendant privacy issues
    • Address “ever-present tension”
      • Protection of our Nation or Privacy and Civil Liberties
      • Protection of our Nation and Privacy and Civil Liberties
      • Committee view:
      • Sometimes “or”, sometimes “and”
  • Basic Premises
    • The United States faces two real and serious threats from terrorists.
      • Terrorist acts themselves, and
      • Inappropriate or disproportionate responses to them.
    • The terrorist threat does not justify government activities or operations that contravene existing law.
    • Terrorist challenges do not warrant fundamental changes in our level of privacy protection.
    • Science and technologies are important dimensions of counterterrorism efforts.
    • Counterterrorist programs should provide other benefits when possible.
    Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Program Assessment
  • In short…
    • We want the counter-terrorism community to have the best possible tools.
      • With realistic assessment of capabilities and effectiveness.
    • We want our privacy protected.
      • Through oversight, assessment, common sense, lawfulness, and continual improvement
    Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Program Assessment
  • The Core of the Report
    • A Framework for Evaluating Information-Based Programs for
      • Effectiveness and
      • Consistency with U.S. Laws and Values
    • Applicable to all information-based programs for specific government purposes, such as counterterrorism, both classified and unclassified.
    • Wanted a framework that was:
      • Realistic;
      • Broadly applicable;
      • Consistent with U.S. laws and values;
      • Based on common sense, best practice, and lessons learned; and
      • Leads to continuous improvement and accountability.
    Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Program Assessment
  • Framework: Effectiveness Programs should have or be:
    • Clearly stated purpose – what are you trying to achieve?
    • Rational Basis—why should we even think it might work?
    • Sound Experimental Basis—is there empirical demonstration that it can work?
    • Scalable—will it work at scale?
    • Operations or Business Processes—how does the program work within itself?
    • Capable of being integrated with other inter- and intra-organizational entities—how does it interact with other elements?
    Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Program Assessment
  • Framework: Effectiveness Programs must have or be:
    • Robust – is it resistant to countermeasures?
    • Appropriate and Reliable Data—is the data good?
    • Data Stewardship – is the data protected properly?
    • Objectivity – who evaluates the program? (not program advocates!)
    • Ongoing Assessment—programs evolve, and the evolved version requires examination as well
    • Documented—are effectiveness and compliance documented? Or merely asserted?
    Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Program Assessment
  • Framework: Consistent with U.S. Laws and Values
    • Data
      • Need—why is personal data needed?
      • Sources—where does data come from? Is it legal?
      • Appropriateness—are data good for the intended use?
      • Third-Party Data require additional protections
        • Repurposed data should be explicitly repurposed
        • Leave 3rd-party data in place if possible
        • Consider adequacy explicitly
    Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Program Assessment
  • Framework: Consistent with U.S. Laws and Values
    • Programs
      • Objective of program - clear and lawful?
      • Compliance with existing law?
      • Effectiveness – scientifically demonstrated to be effective?
      • Frequency of false positives – acceptable?
      • Reporting and redress of false positives – how to report? How to correct?
      • Impact on individuals – what happens to individuals?
      • Data minimization – are data collected in excess of what is necessary?
      • Audit Trail – can users of the data be held individually accountable for abuse or non-compliance?
      • Security and access – are unauthorized users kept out?
      • Transparency – are the impacts and operation of the program understood by those affected by it?
    Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Program Assessment
  • Framework: Consistent with U.S. Laws and Values
    • Administration and Oversight
      • Training – are users properly trained to use the program?
      • Agency Authorization – is the program actually authorized by the agency?
      • External Authorization – are mechanisms for obtaining external authorization in place when necessary?
      • Auditing for Compliance – is compliance reviewed at least annually?
      • Privacy Officer – is a policy-level officer in place to manage privacy issues?
      • Reporting – are all relevant policy makers kept informed and up to date about program operation?
    Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Program Assessment
  • Conclusions: Privacy
    • Privacy protection can be obtained through the use of a mix of technical and procedural mechanisms.
    • Data quality is a major issue in the protection of privacy.
    • Inferences about intent and/or state of mind implicate privacy issues to a much greater degree than assessments or determinations of capability.
  • Conclusions: Assessment of Counterterrorism Programs
    • Program deployment and use must be based on criteria more demanding than “it’s better than doing nothing.”
  • Conclusions: Data Mining
    • Currently, privacy violations arising from information-based programs using data mining and record linkage are not adequately addressed.
    • Data mining has been successful in private sector applications such as fraud detection. However, detecting and preempting terror attacks is vastly more difficult.
  • Conclusions: Data Mining, Cont’d
    • Pattern-based data mining can help analysts determine how to deploy scarce investigative resources and actions; automated terrorist identification is not feasible.
  • Conclusions: Data Mining, Cont’d
    • Systems that support analysts should have features that enhance privacy protection; however, privacy-preserving examination of individually identifiable records is not possible.
    • Data mining R&D using real population data is inherently privacy-invasive.
  • Conclusions: Deception Detection and Behavioral Surveillance
    • Behavioral and physiological monitoring techniques might help detect: (a) individuals whose behavior and physiological states deviate from norms and (b) patterns of activity with well-established links to underlying psychological states.
    • R&D aimed at automated, remote, and rapid assessment of anomalous behavior and activity with well-established links to psychological states relevant to terrorist intent is warranted.
  • Conclusions: Deception Detection and Behavioral Surveillance
    • Technologies and techniques for behavioral observation have enormous potential for violating privacy.
  • Recommendation 1
    • Government agencies using information-based programs for counter-terrorist purposes should follow a systematic process such as the one described in the committee’s framework to evaluate the desirability and feasibility of any given program before such a program is set into motion.
    Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Program Assessment
  • Sub-Recommendations specify:
    • Periodic application of Framework after deployment
    • Use of synthetic population data for R&D
    • Robust, independent oversight of programs and
    • Redress for innocent individuals harmed by programs.
    Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Program Assessment
  • Recommendation 2
    • The U.S. government should periodically review the nation’s law, policy, and procedures that protect the private information of individuals in light of changing technologies and circumstances. In particular, the U.S. Congress should re-examine existing law to consider how privacy should be protected in the context of information-based programs (e.g., data mining) for counterterrorist purposes.
    Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Program Assessment
  • Privacy
    • Lance J. Hoffman
    • Distinguished Research Professor Computer Science Department
    • The George Washington University
    • Washington, DC
    • [email_address]
    • THE END