    Thierer Internet Privacy Regulation: Presentation Transcript

    • Privacy & The Internet: An Overview of Key Issues
      Adam Thierer
      Senior Research Fellow
      Mercatus Center at George Mason University
      May 19, 2011
    • Outline of Presentation
      What do we mean by “privacy”?
      Different approaches to defining / protecting it
      Trade-offs associated with privacy regulation
      The challenge of information control
      Specific regulatory proposals
      An alternative vision / the “3-E Solution”
    • What is Privacy?
      Privacy is a remarkably vague concept
      Means different things to different people
      Varies across cultures
      An ever-changing concept
      Reacts to evolving social norms & technological change
      If it is a “right,” we must determine how it plays alongside other, well-established rights (ex: freedom of speech & press freedoms)
    • Privacy’s Fuzzy Concepts
      “Harm”
      How do we define and measure “harm”?
      Is “creepiness” a harm?
      Should “emotional harms” (feelings) be actionable?
      “Ownership”
      Who owns shared data?
      What is personally identifying information?
      “Informed Consent”
      Are strict contracts possible?
      “Sensitive Data”
      Health, financial, what else?
    • Alan Westin’s 3 Visions / Paradigms
      “Privacy Fundamentalists”: Absolutists about privacy being a “right” & one that trumps most other values / considerations
      “Privacy Pragmatists”: Value privacy to some extent but also see the benefits of information sharing
      “Privacy Unconcerned”: Have little concern about who knows what about them
    • How to Enforce / Protect Privacy? (U.S. vs. E.U. Visions)
      United States
      Privacy not viewed as a fundamental right
      Issue-specific / Sectoral approach
      Bottom-up case law / torts
      States have role; often more stringent than fed law
      More focus on “opt-out”
      “Big Brother” generally = govt
      = a reactive regime
      European Union
      Privacy viewed as a fundamental “dignity” right
      Broad-based approach
      Top-down “directives”
      More focus on “opt-in”
      “Big Brother” = private sector as much as govt
      = a preemptive regime
    • The U.S. Sectoral / Issue-Specific Approach to Privacy Law
      Privacy Act (1974) = govt data collection
      FERPA (1974) = fed-funded education institutions
      Cable Comm. Policy Act (1984) = cable data
      Video Privacy Prot. Act (1988) = video rental records
      Driver’s Privacy Prot. Act (1994) = DMV records
      HIPAA (1996) = health records
      Gramm-Leach-Bliley (1999) = financial records
      COPPA (1998) = kids’ (under 13) online privacy
      CAN-SPAM Act (2003)
      Do Not Call registry (2003)
    • The Battle over Online Privacy
      Policy battle has been raging since late 1990s
      FTC & Congress appeared poised to act around 2000, but...
      Industry self-regulation was given a chance
      9/11 preempted this debate to some extent
      Framework for past decade:
      Focus on Notice / Choice / Access / Security
      Rise of self-regulatory bodies & mechanisms
      Targeted FTC & state enforcement
    • New Fault Lines in the Online Privacy Wars (and the legislative response)
      New activity driven by:
      Fears of “targeting” & “tracking” = “creepy” factor
      General unease with ubiquity of data access & availability
      Proposals:
      “Baseline legislation” / FIPPS (Kerry-McCain, Rush, Stearns)
      “Do Not Track” mechanism + regulation (Speier & Rockefeller bills)
      “Do Not Track Kids” / COPPA expansion (Markey-Barton)
      Internet “Eraser Button” (Markey-Barton)
      Geolocation restrictions (Markey-Barton)
      Data breach disclosure (Kerry-McCain)
      Data minimization requirements (Kerry-McCain, Rush)
      ECPA vs. Data retention laws
    • Privacy Trade-Offs & Opportunity Costs
      The Internet feels like the ultimate “free lunch”; most sites, services & content are free of charge.
      But, in reality, there is no free lunch.
      The implicit quid pro quo of online life: you gotta give a little to get a little (or a lot!). And most people like this deal.
      The Net is powered by advertising & data collection. Information is lifeblood of Digital Economy.
      Info may be collected to facilitate a better browsing experience or to help the site or service remain viable.
      In essence, information used in lieu of payment.
      Regulation could break this system & have other unintended consequences.
    • The Problem of Information Control
      Even if we agree privacy is important and worth protecting, actually protecting it will be very hard.
      “Information wants to be free” - Stewart Brand
      and that includes personal information
      “The Net interprets censorship as damage and routes around it.” - John Gilmore
      and privacy regulation is, at root, a form of data flow censorship
    • 10 Factors That Complicate Information Control Efforts
    • Some Facts (or ‘Why Putting Genies Back in Bottles is So Hard’)
      Facebook: users submit roughly 650,000 comments on the 100 million pieces of content served up every minute on its site.
      YouTube: over 35 hours of video uploaded every minute.
      Twitter: 300 million users produce 140 million Tweets per day, or roughly a billion Tweets every 8 days (about 1,600 per second; a quick arithmetic check follows this slide).
      Apple: more than three billion apps have been downloaded from its App Store by customers in over 77 countries.
      “Humankind shared 65 exabytes of information in 2007, the equivalent of every person in the world sending out the contents of six newspapers every day.” - Hilbert and Lopez
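      As a quick back-of-the-envelope check of the per-second figure above, here is a minimal Python sketch; the only input is the 140-million-Tweets-per-day figure quoted on the slide.

```python
# Sanity check of the Twitter volume figures quoted on the slide.
tweets_per_day = 140_000_000                  # "140 million Tweets per day"

tweets_per_second = tweets_per_day / 86_400   # 86,400 seconds in a day
tweets_per_8_days = tweets_per_day * 8

print(f"{tweets_per_second:,.0f} Tweets per second")  # ~1,620, matching the "about 1,600/sec" figure
print(f"{tweets_per_8_days:,} Tweets in 8 days")      # 1,120,000,000, i.e. roughly "a billion every 8 days"
```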
    • “The Privacy Paradox”
      “People value their privacy, but then go out of their way to give it up.” – Larry Downes, Laws of Disruption
      “We give away information about ourselves—voluntarily leave visible footprints of our daily lives—because we judge, perhaps without thinking about it very much, that the benefits outweigh the costs. To be sure, the benefits are many.” – Abelson, Ledeen & Lewis, Blown to Bits
    • What We Must Learn to Accept
      “Once information is out there, it is very hard to keep track of who has it and what he has done with it.” --David Friedman, Future Imperfect
      Privacy is not “dead” as some have claimed, but it is different than it was in the past
      New realities of info dissemination, accessibility, searchability
      Rushed, heavy-handed solutions will be costly and perhaps not effective anyway
    • Policy Responses (and their problems)
    • “Do Not Track” – The Theory
      Could be voluntary, but might be mandated.
      Would demand that websites honor a machine-readable header indicating that the user did not want to be “tracked.”
      In theory, this will allow privacy-sensitive web surfers to signal to websites that they would like to opt out of any targeted advertising, or not have any information about them collected when visiting sites (a minimal sketch of the header check follows this slide).
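      To make the header mechanism concrete, here is a minimal sketch of a server-side check, assuming a Python/Flask handler; the route and response strings are illustrative assumptions, not part of any bill or of the presentation.

```python
# Minimal sketch: honoring a "Do Not Track" request header (DNT: 1) on the server side.
# Assumes Flask; the route and response strings are illustrative only.
from flask import Flask, request

app = Flask(__name__)

@app.route("/")
def index():
    # Browsers with Do Not Track enabled send the header "DNT: 1".
    if request.headers.get("DNT") == "1":
        # A site honoring the signal would skip tracking cookies and targeted ads here.
        return "Content served without tracking or targeted advertising"
    return "Content served with targeted advertising"

if __name__ == "__main__":
    app.run()
```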
    • “Do Not Track” – Potential Downsides
      Costs: If the law breaks the quid pro quo, something must give…
      Paywalls and higher prices?
      Less relevant or more intrusive advertising?
      Fewer services? Less media content?
      Int’l Competitiveness: Goldfarb & Tucker - “after the [EU’s] Privacy Directive was passed [in 2002], advertising effectiveness decreased on average by around 65% in Europe.” Because regulation decreases ad effectiveness, “this may change the number and types of businesses sustained by the advertising-supported Internet.”
      Practical? Does DNT scale? Apply internationally? To other devices?
      Regulatory creep: Will it serve as a template for other forms of Net regulation?
    • COPPA Expansion – Background
      Special concerns about youth & online marketing
      COPPA (‘98) was first attempt to deal with it
      Requires “verifiable parental consent” for sites “directed at” children that collect info
      FTC defines rules (safe harbors) and enforces
      Never constitutionally challenged
    • COPPA Expansion – Potential Problems
      What works for kids under 13 is not likely to work for teens
      Would basically require mandatory age verification of all web surfers
      COPPA becomes COPA? = unconstitutional
      Serious free speech issues
      Irony = in name of protecting privacy, more info about users would need to be collected!
    • Internet “Eraser Button” Concept
      Goal: Make it easier for people (esp. kids) to delete posted comments or content they later regret
      Practical Problem: Where is this button? Who controls it? What if info is shared content? Back-door to fraud / abuse?
      Principled Problem: Conflicts mightily with freedom of speech & press freedoms
    • A Different Vision for Privacy Protection
    • The Conflict of Visions: Anticipatory Regulation vs. Resiliency
      Long-standing conflict of visions about how to best manage risks:
      Anticipation
      Prevention is prime value
      Focus on the “Precautionary Principle”
      Resiliency
      Experimentation is prime value
      Focus on Learning / Coping
    • Anticipatory vs. Resiliency-Based Solutions
      Anticipatory Reg Approach
      Mandatory “Do Not Track”
      Mandatory “Opt-In” for all data collection
      Bans on apps / functionality
      Restrictions on sharing / all defaults to private
      “Eraser Button” mandates / demands for data deletion
      Resiliency Approach
      Voluntary “Do Not Track”
      Offer opt-outs (encourages experimentation & innovation)
      No preemptive bans on tech
      No restrictions on sharing, but education about downsides
      Voluntary data “purges” & “data hygiene”
    • Constructive Alternatives to Regulation
      Be careful about how “harm” & “market failure” are defined (ex: creepiness is not a likely harm; a data breach likely is)
      Focus on a “3-E Solution” to problems: Education, Empowerment, & (Targeted) Enforcement
      Encourage corporate and personal responsibility
      Think of privacy as an evolving set of norms, interactions & experiments
      Don’t Panic! We can learn to cope with technological change.
    • The “3-E Solution”
    • #1: Educational Solutions
      Education at all levels
      Awareness campaigns from privacy advocates, govt, industry, educators, etc.
      Encouraging better online “netiquette” and “data hygiene”
      Push for better transparency across the board
      Better notice & labeling
      Need more watch-dogging of privacy promises made by companies
    • #2: Empowerment Solutions
      = Helping users help themselves
      User “self-help” tools are multiplying
      Adblock Plus, NoScript, other browser tools
      Industry self-regulation
      More cross-industry collaboration on privacy programs
      More education efforts (better notice)
      Best practices & better defaults
      More and better tools to respond to new developments and needs
    • #3: Enforcement Solutions
      Holding companies to the promises they make
      stepped-up FTC Sec. 5 enforcement
      Demand better notice & transparency
      Mandatory disclosure of data breaches
      Targeted regulation of sensitive data, but with flexibility
    • Conclusion / Key Takeaways
      “Privacy” is incredibly complicated & contentious
      Privacy can conflict with other values / rights
      All regulation entails costs & trade-offs
      There is no free lunch
      Information control is very, very hard
      “Silver-bullet” solutions rarely work
      The more education & transparency the better
      Resiliency is generally a smarter strategy than anticipatory, top-down regulation
      And, once more… don’t panic! We’ll get through and adjust.
    • Further Readings
      Adam Thierer, Filing to Federal Trade Commission in ‘Do Not Track’ Proceeding, February 18, 2011.
      Adam Thierer, “Birth of the ‘Privacy Tax,’” Forbes, April 4, 2011.
      Adam Thierer, “Online Privacy Regulation: Likely More Complicated (And Costly) Than Imagined,” Mercatus on Policy, Mercatus Center at George Mason University, December 6, 2010.
      Adam Thierer, “Erasing Our Past on the Internet,” Forbes, April 17, 2011.
      Adam Thierer, “Unappreciated Benefits of Advertising and Commercial Speech,” Mercatus on Point 86, Mercatus Center, January 2011.
      Berin Szoka and Adam Thierer, “COPPA 2.0: The New Battle over Privacy, Age Verification, Online Safety & Free Speech,” Progress on Point 16, no. 11, The Progress & Freedom Foundation, May 21, 2009.