The document discusses the challenges of applying benefit-cost analysis (BCA) to regulations regarding online safety and privacy. It notes that defining harm is difficult, as many privacy and safety issues involve intangible or subjective harms that are hard to quantify. It also outlines the costs that must be considered in BCA, such as lost opportunities from new services or restrictions on data collection. The document argues that alternatives to regulation, such as education, empowering users with privacy-enhancing tools, and targeted enforcement, should be seriously considered in BCA before preemptive regulation is pursued.
The Challenge of Benefit-Cost Analysis As Applied to Online Safety & Digital Privacy
1. The Challenge of Benefit-Cost Analysis As
Applied to Online Safety & Digital Privacy
Adam Thierer
Senior Research Fellow
Mercatus Center at George Mason University
January 17, 2012
Presentation before George Mason University Law &
Economics Center conference on Privacy, Regulation, &
Antitrust
2. Purpose of Talk / Paper
• Explore challenges of applying benefit-cost analysis
(BCA) to privacy & online safety
– in many ways, they are really the same debate
• Discuss particular problem of defining harm (and,
correspondingly, the benefits of regulation)
• Outline the range of costs that must be considered
(even if they prove difficult to quantify)
• Focus on the many alternatives to regulation
• Explain how the toolbox of solutions we apply to
online safety can work for privacy
3. Prefatory Note:
Strict Rights Approach Limits BCA
• If privacy & safety are “dignity” rights, then they trump
all other considerations & BCA goes out the window
• But at least here in the U.S. the calculus has been
more utilitarian in character
• Better to think about privacy & safety the way we
think about happiness: You have the right to pursue
it, not so much a strict right to it
• So, BCA remains important
5. The Triumph of Benefit-Cost Analysis
“It is not exactly news that we live in an era of
polarized politics. But Republicans and Democrats
have come to agree on one issue: the essential need
for cost-benefit analysis in the regulatory process. In
fact, cost-benefit analysis has become part of the
informal constitution of the U.S. regulatory state.
This is an extraordinary development.”
- Cass Sunstein (2012)
6. The Basics of RIA / BCA
• Since early 1980s, regulatory impact
assessments (RIAs) have been required
• Executive Order 12291 (Reagan)
• Executive Order 12866 (Clinton)
• Executive Orders 13563 & 13610 (Obama)
• OMB Circular A-4
– 3 core elements of an RIA…
7. RIA Step #1:
Statement of need for the regulatory action
• a clear explanation of need for the regulatory
action
• “Agencies should explain whether the action is
intended to address a market failure or to
promote some other goal, such as improving
governmental processes, protecting privacy,
or combating discrimination.”
8. RIA Step #2:
Identify range of regulatory approaches
• agencies must describe “range of alternative
regulatory approaches, including the option of not
regulating.”
• “should consider flexible approaches that reduce
burdens and maintain freedom of choice”
• “should specify performance objectives, rather than
specifying the behavior or manner of compliance that
regulated entities must adopt.”
9. RIA Step #3:
Conduct the Benefit-Cost Analysis
• estimates the benefits & costs associated with each
alternative approach.
• costs should be quantified and monetized to the
extent possible
• when quantification of a particular benefit or cost is
not possible, it should be described qualitatively.
• where relevant, consider “values such as equity,
human dignity, fairness, potential distributive impacts,
privacy, and personal freedom.”
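The comparison across alternatives in Step #3 can be summarized in standard net-benefit form (textbook BCA notation consistent with Circular A-4, not taken from the slides; B and C are the monetized benefits and costs of alternative i in year t, and r is the discount rate):

```latex
% Net benefits of regulatory alternative i, discounted at rate r:
NB_i = \sum_{t=0}^{T} \frac{B_{i,t} - C_{i,t}}{(1+r)^{t}},
\qquad
i^{*} = \arg\max_{i} \; NB_i
```

Benefits or costs that cannot be monetized (the qualitative category the slide mentions) sit outside the sum and must be weighed alongside the computed net benefits.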
10. Possible Privacy Regs to Consider
• “Do Not Track” mandate
• Mandatory default switches
– Opt-in mandates
• Bans on specific practices
– 3rd party data sharing / aggregation
– Deep packet inspection
• Targeted privacy laws
– COPPA
Note: Much harder to do BCA when these things are
being “nudged” instead of formally mandated.
11. The Problem of Defining the Problem
(and how that complicates the question of
benefits of regulation)
12. Privacy & Safety Metrics
= A Nightmare
There are no good, widely agreed upon
metrics by which to measure online safety &
digital privacy “harms.”
13. How Privacy & Safety Harms
are Described in Popular Culture
Online Safety “Harms”
• Offensive
• Smutty
• Filthy
• Hurtful / Hateful
Privacy “Harms”
• Invasive
• Manipulative
• Annoying
• Creepy
** None of these things are even remotely quantifiable **
14. How Privacy & Safety Harms
are Described in the Literature
• Tangible vs. Intangible
• Objective vs. Subjective
• Direct vs. Indirect
• Monetary vs. Non-Monetary
• Quantifiable vs. Unquantifiable
• Physical vs. Psychological
15. In a perfect world…

                     Intangible Harm   Tangible Harm
  Monetizable Harm                     Easier case
  Non-Monetary Harm  Hard case
16.

                     Intangible                       Tangible
  Monetizable Harm   Data Breach                      Invasion / Trespass
  Non-Monetary Harm  “Creepiness” / “Offensiveness”   Stalking

and sometimes that works…
… but most of the time it doesn’t.
17. The Key (very, very hard) Question
To what extent are harms that are purely
psychological in character really harms at all?
• Clearly, some can be
– Incessant digital harassment / cyber-bullying
• But most not harmful in a traditional sense
– If “creepiness” is an actionable harm, then the Net
as we know it would have to be shut down
18. The hopeless subjectivity of it all…
• One person’s “creepy” is another person’s
“killer app.”
• Worse yet, even our own actions don’t match
up with our professed values!
• Stated preferences vs. revealed preferences problem
19. What’s Your Privacy Indifference Curve
Look Like?

[Chart: indifference curves trading off Privacy on one axis against
Sociability and/or Services on the other, with users ranging from
“privacy unconcerned” to “privacy fundamentalist.” Annotation: many
people who say they are near the fundamentalist end actually seem to
be nearer the unconcerned end.]
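The indifference-curve idea on this slide can be made concrete with a standard textbook utility function (my illustrative assumption; the deck itself only sketches the curves). Let p be privacy and s be sociability/services, with α the weight a given user places on privacy:

```latex
% Illustrative Cobb-Douglas preferences over privacy p and services s;
% \alpha indexes how heavily the user weights privacy.
U(p, s) = p^{\alpha} s^{1-\alpha}, \qquad 0 < \alpha < 1
% An indifference curve at utility level \bar{U}:
s = \left( \bar{U} \, p^{-\alpha} \right)^{\frac{1}{1-\alpha}}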
20. The Privacy Paradox
“Ask 100 people if they care [about
privacy] and 85 will say yes. Ask those
same 100 people if they’ll give you a DNA
sample just to get a free Big Mac, and 85
will say yes.”
- Austin Hill, 2002
21. What about “peace of mind” arguments?
• “peace of mind” / “user trust” = primary benefit?
– i.e. more privacy & safety regs make consumers more
comfortable getting online
– Analogy: post-9/11 security mandates
• But they had the opposite impact! (see J. Law & Econ. 2007)
• Problem with “peace of mind” = again, how to prove it
– More people online than ever
• Plus, could cut other way (i.e., “free” services drive
subscribership / usage)
• Natural experiments? (US vs. EU?)
– Problem = many confounding variables
22. A Value Exchange without
Formal Contracting
• Popular belief = If you’re not buying the
product, you are the product.
• Only partially correct.
• Reality = You can be both part of value
exchange & the recipient of the benefits of
that value exchange.
• Past examples = traditional broadcast &
newspaper media advertising
23. The Current Value Exchange
• Almost all online media / services driven by a
simple quid pro quo…
– Data collection / targeted advertising in exchange
for “free” (or cheap) content and services
– But hard to precisely quantify this value exchange
24. Market Failure = Lack of Options?
• Why no “Facebook Private” or “YouTube
Family Friendly”?
• Actually, those sites do exist (by other names
& from other vendors), but there is very little
demand for them
+ (as summarized later) lots of PETs exist
+ no requirement you use any of these
services (Facebook isn’t a forced labor camp!)
25. How do we factor social adaptation
into BCA?
• People adapt! (sometimes quicker than we imagine)
• Today’s “technopanic” is tomorrow’s widely accepted
technology / business practice
– Examples: photography, caller ID, Gmail
• 2 general lessons:
1. Cautions against knee-jerk regulatory responses
2. BCA should account for rapid social / cultural adjustment
to new info-tech
26. When All Else Fails…
Claim False Consciousness as Harm!
• Manipulation! Mind control!
Siva Vaidhyanathan:
– consumers are being tricked by the “smokescreen” of “free”
online services and “freedom of choice.”
– “We are conditioned to believe that having more choices…
is the very essence of human freedom. But meaningful
freedom implies real control over the conditions of one’s
life.”
• Such “people are sheep” arguments can’t stand in a
free society (or be used in BCA)
28. What Data Collection Has Enabled
• Transport yourself back a decade and consider how
far we’ve come
• Things that used to cost $$ and were capacity-capped:
– Email
– Data & document storage
– Photo hosting
– Mapping services
– Security / anti-virus software
– Online bulletin boards / hobby pages
– Most online news
29. The Magic of Data & Advertising
• Lowers Price: None of those services cost us a dime
now
• Expands Quantity: There are more of those services
today
• Improves Quality: All of those services are more
diverse or innovative than before
Data collection & advertising made it all possible
= a huge boon for consumer welfare.
30. Possible Regulatory Costs BCA Must Consider
• Will regulation tip the balance between business models?
• instead of ad & data supported online sites and services, we
could get…
1) More intrusive but untargeted ads? (banner, pop-up, etc.)
2) Pay-per-use or subscription-based?
3) Restricted output / lower quality sites?
4) Mix of all of the above?
• This has both micro and macro consequences
– For industry: If not targeted data collection / advertising, what will
fund online content and services in the future?
– For consumers: If those other methods fail to work = less content &
services or more expensive content & services
• As always, there is no free lunch
31. Some Good Studies Showing Trade-offs
• Beales
– targeted advertising commands 2.68 times the price of
run-of-network advertising
• Tucker & Goldfarb
– EU Privacy Directive = 65% decrease in advertising
effectiveness in Europe
– Because regulation decreases ad effectiveness,
“this may change the number and types of
businesses sustained by the advertising-supported
Internet.”
32. WTP Literature is Very Thin
• Perhaps not surprising since
– People aren’t paying anything in the real world!
– But polls / surveys make crappy proxies
– Stated preferences vs. revealed preferences problem
• WTP study by CMU (2007)
– price matters a lot, but some consumers willing to
pay more for privacy when better informed
33. ENISA Study
• European Network & Information Security
Agency (ENISA) “Study on Monetizing
Privacy” (Feb. 2012)
– Pointed out the lack of real-world experiments
as a major problem
• “a large share of literature is devoted to [surveys]” &
“economic experiments that implement real purchase
transactions are rather scarce”
• “there are no works in economics that combine theoretical
and experimental methods for the analysis of the interplay of
privacy concerns, product personalization and competition.”
34. ENISA Results
• combined lab & field experiments (cell phone number for
movie ticket discount)
Results:
• Price matters!
– Most consumers bought from the “privacy-invasive” provider when its
price was lower (by 50 Euro cents)
• Privacy matters (at least a little)!
– 29% would pay extra to avoid giving up cell number; 9% would pay more
to avoid giving up email.
– And fully 80% preferred the privacy-friendly option when there was
no price difference at all.
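The ENISA pattern can be reproduced with a minimal threshold model (a stylized sketch; the willingness-to-pay distribution below is reverse-engineered from the slide’s 80% and 29% figures, not taken from the study): each consumer picks the privacy-friendly provider only when their privacy WTP exceeds its price premium.

```python
# Stylized sketch of the ENISA result: a consumer buys privacy-friendly
# only when their willingness to pay (WTP) for privacy exceeds the
# price premium. The WTP distribution is hypothetical, chosen to match
# the slide's 80% (no premium) and 29% (50-cent premium) figures.

def choose(privacy_wtp: float, premium: float) -> str:
    """Pick the privacy-friendly provider iff WTP strictly covers the premium."""
    return "friendly" if privacy_wtp > premium else "invasive"

# 100 stylized consumers: 20 value privacy at zero, 51 a little, 29 a lot.
wtps = [0.00] * 20 + [0.25] * 51 + [0.60] * 29  # WTP in euros

def share_friendly(premium: float) -> float:
    """Fraction of consumers choosing privacy-friendly at a given premium."""
    return sum(choose(w, premium) == "friendly" for w in wtps) / len(wtps)

print(share_friendly(0.00))  # no price difference -> 0.8
print(share_friendly(0.50))  # 50-cent premium -> 0.29
```

The cutoff structure is the point: because most consumers’ privacy valuations are small, even a modest price premium flips the majority to the “privacy-invasive” option, which is exactly the “price matters” result.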
35. In sum, the problem we face (re:
determining costs / WTP) …
“Empirical research on the subject is still in its
infancy. Most studies ask for personal opinion, rather
than measure the digital choices people make, and
even there, the results usually find a gap between what
people say and what they do about their privacy
online.”
- Somini Sengupta, NYT (3/19/12)
>> BCA has to account for this & try to better measure
real-world choices & trade-offs, not polls & surveys.
36. Other Costs / Lost Outputs to Consider
• International competitiveness
– Has privacy regulation limited Europe’s online market?
– Could regulation here diminish U.S. competitive advantage
relative to world?
• Market structure / competition
– Privacy mandates could lead to industry consolidation
• Speech considerations
– Many online safety & privacy regulations raise thorny free
speech / 1st Amendment issues
– Privacy & online safety regs could limit aggregate amount
of speech produced
38. Less Restrictive Alternatives to Regulation
• Education
• Empowerment
• (Targeted) Enforcement Efforts
• Evolving Norms / Social Adaptation
This is the model we use today for online safety
concerns.
– Why not apply it to privacy concerns as well?
– BCA must at least take these alternatives into account
39. Education / Awareness-Building
• Education & Awareness campaigns at all levels
(from privacy advocates, govt, industry,
educators, etc.)
• Encouraging better online “netiquette” and “data
hygiene”
• Push for better transparency across the board
– Better notice & labeling
– Need more watch-dogging of privacy promises made
by companies
• Govt can play key role here (ex: PSAs, help sites)
42. Best Practice Guidelines
Query: Are such “non-law law” nudges & agency threats legal / sensible?
(see forthcoming Randy Picker paper)
43. Empowerment / Privacy-Enhancing
Technologies (PETs)
= Helping users help themselves
• User “self-help” tools are multiplying (next slide)
• Industry self-regulation
– More cross-industry collaboration on privacy
programs
– Better notice
– Best practices & smarter defaults
– More and better tools to respond to new
developments and needs
44. Other Tools
• AdBlockPlus
• Ghostery
• NoScript
• Cookie Monster
• Better Privacy
• No More Cookies
• CCleaner
• Flush
• Priveazy
• Privacyfix
• PrivacyWatch
Ad Preference Managers
• Google, MS, Yahoo
• DuckDuckGo
Private Browsing Mode
• Google “Incognito”
• IE “InPrivate Browsing”
• Firefox “Private Browsing”
• Safari “Private Browsing”
Do Not Track
• Now in all major browsers
• + DNT add-ons from others
46. The AdBlock Plus Story
• Most demanded browser add-on in history
• roughly 175 million people downloaded the
Firefox version (as of Oct. 2012)
• Shows a clear demand for PETs
• also shows that there are powerful ways for
privacy-sensitive users to handle the problem
• (Implications for contracting debate?)
47. Enforcement (Targeted)
• Holding companies to the promises they make
– stepped-up FTC Sec. 5 enforcement
• Demand better notice & transparency
• Mandatory disclosure of data breaches
• Targeted regulation of sensitive data, but with
flexibility
49. The Legal Standard That Should
Govern for Both Safety & Privacy
• In 2000, the Sup. Ct. struck down a requirement that
cable companies “fully scramble” video signals that
include sexually explicit content (U.S. v. Playboy)
– “Simply put, targeted blocking is less restrictive than
banning, and the Government cannot ban speech if
targeted blocking is a feasible and effective means of
furthering its compelling interests.”
– “It is no response that voluntary blocking requires a
consumer to take action, or may be inconvenient, or may
not go perfectly every time. A court should not assume a
plausible, less restrictive alternative would be ineffective;
and a court should not presume parents, given full
information, will fail to act.”
50. In other words…
• If effective privacy-enhancing tools & options
exist, they must be factored into the BCA
process
• Weighs heavily against use of preemptive
regulation, especially in the absence of more
concrete harms
• But other govt action still possible
51. Other Govt Options
• Privacy awareness-building (PSAs + websites)
• Digital literacy & labeling programs
– nutritional labeling model
• Govt-created privacy & safety apps?
• Govt-created / underwritten search engine?
– PBS model for online media / services
• FTC oversight
– Sec. 5 enforcement
– best practice guidance?
52. How is the Govt Doing On This Front?
• Not enough focus on BCA in privacy reports,
consent decrees, or COPPA revisions
– Regulation treated as a costless exercise
• Strong focus on best practices / nudging
industry (but still little discussion of costs)
• Good awareness-building efforts: best practices
reports, online advice (“OnGuardOnline” PSA
efforts)
• No govt-provided privacy apps / tools
53. Going Forward…
Is Formal Contracting Possible?
• Could users contract with sites for privacy / PII “rights”?
• If users started using more PETs & actively thwarted data
collection / advertising, then perhaps some sort of Coasean
bargaining would develop
– Ex: if 25% were using DNT & AdBlockPlus = tipping point?
• But is it possible without formal “propertization” of PII?
– Doubtful that more sophisticated contracting schemes develop
without formal propertization
– Licensing likely to have same problems / limitations as seen in
copyright context
• Transaction costs matter! (could hinder positive
developments)