1.
PrivacyGrade and
Social Cybersecurity
Jason Hong
Federal Trade Commission
July 9, 2015
Computer Human Interaction: Mobility Privacy Security (CHIMPS Lab)
2.
Talk Overview
• PrivacyGrade
– Analyzing the privacy of 1M
smartphone apps
• Social Cybersecurity
– Using social psych to influence
people’s cybersecurity behaviors
3.
What Are Your Apps Really Doing?
• Shares your location, gender, unique phone ID, and phone # with advertisers
• Uploads your entire contact list to their server (including phone #s)
4.
Many Smartphone Apps Have "Unusual" Permissions
• Location data, unique device ID
• Location data, network access, unique device ID
• Location data, microphone, unique device ID
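As a minimal illustration (not PrivacyGrade's actual rule set), such combinations can be flagged by intersecting an app's requested permissions with a list of sensitive ones; the SENSITIVE set below is a hand-picked subset:

```python
# Illustrative sketch: flag sensitive permissions an app requests.
# SENSITIVE is a hand-picked subset for illustration, not a vetted list.
SENSITIVE = {
    "ACCESS_FINE_LOCATION",    # location data
    "ACCESS_COARSE_LOCATION",  # location data
    "READ_PHONE_STATE",        # unique device ID
    "RECORD_AUDIO",            # microphone
    "READ_CONTACTS",           # contact list
}

def flag_unusual(requested_permissions):
    """Return the sensitive permissions among those an app requests."""
    return sorted(SENSITIVE & set(requested_permissions))

# Example: a simple game that also wants location and the device ID.
print(flag_unusual(["INTERNET", "ACCESS_FINE_LOCATION", "READ_PHONE_STATE"]))
# -> ['ACCESS_FINE_LOCATION', 'READ_PHONE_STATE']
```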
5.
What Do Developers Know
about Privacy?
• Interviews with 13 app developers
• Surveys with 228 app developers
• What tools? Knowledge? Incentives?
• Points of leverage?
Balebako et al, The Privacy and Security Behaviors
of Smartphone App Developers. USEC 2014.
6.
Summary of Findings
Third-party Libraries Problematic
• Use ads and analytics to monetize
7.
Summary of Findings
Third-party Libraries Problematic
• Use ads and analytics to monetize
• Hard to understand their behaviors
– A few didn’t know they were using
libraries (inconsistent answers)
– Some didn’t know they collected data
– “If either Facebook or Flurry had a
privacy policy that was short and
concise and condensed into real
English rather than legalese, we
definitely would have read it.”
8.
Summary of Findings
Devs Don’t Know What to Do
• Low awareness of existing privacy
guidelines
– Often just ask others around them
• Low perceived value of privacy
policies
– Mostly protection from lawsuits
– “I haven’t even read [our privacy
policy]. I mean, it’s just legal stuff
that’s required, so I just put in there.”
9.
PrivacyGrade.org
• Improve transparency
• Assign privacy grades to
all 1M+ Android apps
10.-13.
(Screenshots of PrivacyGrade.org)
14.
Expectations vs Reality
15.
Privacy as Expectations
Use crowdsourcing to compare what
people expect an app to do vs what
an app actually does
• App behavior: what the app actually does
• User expectations: what people think the app does
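A minimal sketch of that comparison, using illustrative (permission, purpose) labels rather than PrivacyGrade's real data model:

```python
# Each behavior is a (permission, purpose) label, elicited from users
# or produced by static analysis; the values here are illustrative.
expected = {("location", "maps")}                  # what users think the app does
actual = {("location", "maps"),
          ("location", "ads"),
          ("device_id", "analytics")}              # what the app actually does

surprises = actual - expected                      # behaviors users did not expect
print(surprises)  # {('location', 'ads'), ('device_id', 'analytics')}
```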
16.
How PrivacyGrade Works
• Libraries follow a long-tail distribution
• We focused on top 400 libraries
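Assuming we already have a mapping from apps to the third-party libraries found in them by static analysis, ranking libraries by app count is straightforward; the data below is made up:

```python
from collections import Counter

# apps maps package name -> third-party libraries found by static
# analysis; the entries here are made up for illustration.
apps = {
    "com.example.game":  ["com.flurry", "com.admob"],
    "com.example.torch": ["com.admob"],
    "com.example.notes": [],
}

library_counts = Counter(lib for libs in apps.values() for lib in libs)
top_libraries = [lib for lib, _ in library_counts.most_common(400)]
print(top_libraries)  # ['com.admob', 'com.flurry']
```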
17.
How PrivacyGrade Works
• We crowdsourced people's expectations for a core set of 837 apps
– Ex. “How comfortable are you with
Drag Racing using your location for ads?”
• Created a model to predict people’s
likely privacy concerns
• Applied model to 1M Android apps
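A deliberately simplified stand-in for that model: average the crowd's comfort ratings for each (permission, purpose) pair, then score an unseen app by its least comfortable pair. The real model is richer (see Lin et al., SOUPS 2014, in the notes); the ratings here are made up:

```python
from collections import defaultdict

# Comfort ratings from the crowd, say -2 = very uncomfortable
# ... +2 = very comfortable; the values are made up.
ratings = [
    ("location", "maps", 1.5),
    ("location", "ads", -1.2),
    ("location", "ads", -0.8),
    ("device_id", "analytics", -0.5),
]

by_pair = defaultdict(list)
for permission, purpose, comfort in ratings:
    by_pair[(permission, purpose)].append(comfort)
avg_comfort = {pair: sum(cs) / len(cs) for pair, cs in by_pair.items()}

def score_app(behaviors):
    """Score an app by its least comfortable (permission, purpose) pair."""
    return min(avg_comfort[b] for b in behaviors)

print(score_app([("location", "maps"), ("location", "ads")]))  # -1.0
```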
18.
Overall Stats on PrivacyGrade (April 2015)
• Using no sensitive permissions means an A+
• Other grades are set at quartiles of the grade range
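A sketch of that grading rule; the score range and exact cutoffs are illustrative assumptions, following the quartile scheme described above:

```python
def letter_grade(score, uses_sensitive_permissions, lo=-2.0, hi=2.0):
    """Map a privacy-concern score to a grade. No sensitive permissions
    means A+; otherwise bucket at quartiles of the score range.
    The [-2, 2] range here is an illustrative assumption."""
    if not uses_sensitive_permissions:
        return "A+"
    quartile = (score - lo) / (hi - lo)  # 0.0 = worst, 1.0 = best
    if quartile >= 0.75:
        return "A"
    if quartile >= 0.50:
        return "B"
    if quartile >= 0.25:
        return "C"
    return "D"

print(letter_grade(-1.0, uses_sensitive_permissions=True))  # 'C'
```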
19.
Changes in Grades Over Time
October 2014 to April 2015
20.
Changes in Grades Over Time
Most Grades Remained the Same
21.
Changes in Grades Over Time
A Fair Number of Apps Improved
22.
Changes in Grades Over Time
Lots of Apps Deleted
• Not yet sure why they were deleted
– Some apps were later re-uploaded
23.
Impact of this Research
• Popular Press
– NYTimes, CNN, BBC, CBS, more
• Government
– Earlier work helped lead to FTC fines
– Scared some Congressional staffers
• Google
• Developers
24.
Social Cybersecurity
• New work looking at changing
people’s awareness, knowledge,
and motivation to be secure
• A tool for the FTC and companies to use to improve privacy and security
25.
Social Proof
26.
• Baseline effectiveness (a standard environmental message) was 35%
27.
(image)
28.
• “showing each user pictures of friends who
said they had already voted, generated
340,000 additional votes nationwide”
• “they also discovered that about 4 percent of
those who claimed they had voted were not
telling the truth”
29.
Adoption of Cybersecurity
Features is Very Low
• Typically single digits
– Two-factor authentication
– Login notifications on Facebook
– Trusted contacts on Facebook
30.
Insight from Interviews
Observability of Adoption Is Low
• One person was stopped in a coffee shop and asked about the Android 9-dot unlock pattern:
"We were just sitting in a coffee shop and I wanted to show somebody something and [they said], 'My phone does not have that,' and I was like, 'I believe it probably does.'"
31.
Diffusion of Innovations
• Five major factors
for successful
innovations:
– Relative Advantage
– Trialability
– Complexity
– Compatibility
– Observability
32.
Social Proof + Making
Cybersecurity Observable
• Variants (message framings sketched below)
– Control
– Over # / %
– Only # / %
– Raw # / %
– Some
Das, S., A. Kramer, L. Dabbish, J.I. Hong. Increasing Security Sensitivity
With Social Proof: A Large-Scale Experimental Confirmation. CCS 2014.
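A hedged reconstruction of what those message framings might look like; the exact wording used in the experiment is in the Das et al. CCS 2014 paper, and the numbers here are placeholders:

```python
def social_proof_message(variant, n_friends=0, pct=0):
    """Illustrative message framings; not the experiment's exact wording."""
    if variant == "over":
        return f"Over {n_friends:,} of your friends use extra security settings."
    if variant == "only":
        return f"Only {pct}% of your friends use extra security settings."
    if variant == "raw":
        return f"{n_friends:,} of your friends ({pct}%) use extra security settings."
    if variant == "some":
        return "Some of your friends use extra security settings."
    return "You can use extra security settings to protect your account."  # control

print(social_proof_message("over", n_friends=108000))
```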
33.
Method
• Controlled, randomized study
with 50k active Facebook users
– 8 conditions, so N=6,250 per condition
• Part of annual security awareness
campaign Facebook was going to
run anyway
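A minimal sketch of the even split (50,000 users / 8 conditions = 6,250 per condition): hashing each user ID gives a stable pseudo-random assignment. The condition names are shorthand for the variants listed earlier; this is not Facebook's actual experiment infrastructure:

```python
import hashlib

# Shorthand names for the 8 conditions (control + over/only/raw as
# number or percentage + "some"); illustrative, not the paper's labels.
CONDITIONS = ["control", "over_num", "over_pct", "only_num",
              "only_pct", "raw_num", "raw_pct", "some"]

def condition_for(user_id: str) -> str:
    """Deterministically assign a user to one of the 8 conditions."""
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return CONDITIONS[int(digest, 16) % len(CONDITIONS)]

print(condition_for("user-12345"))
```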
34.
Results of Experiment
35.
Summary
• PrivacyGrade
– Analyzing the privacy of 1M apps
• Social Cybersecurity
– Social proof + observability to improve
cybersecurity behaviors
36.
Thanks!
Collaborators: Shah Amini, Kevin Ku, Jialiu Lin, Song Luan, Bharadwaj Ramachandran, Norman Sadeh
Special thanks to: Army Research Office, National Science Foundation, Alfred P. Sloan Foundation, Google, CMU CyLab, NQ Mobile
37.
How PrivacyGrade Works
38.
Limitations of Current Approach
• PrivacyGrade works for most apps
– But popular apps tend to have lots of custom code
– Also can't analyze backend servers
• Only free apps
– Limitations on downloading paid apps
• We assume most libraries have one purpose
– True for the vast majority
– But analytics + advertising combinations are becoming more common
39.
Talk Overview
• Interviews and surveys of app
developers
• PrivacyGrade.org
• Using text mining to infer
privacy-related app behaviors
• Reflections on privacy ecosystem
40.
Reflections on Privacy
Consider entire ecosystem
• End-users
– Most research has focused here
– But this puts too much of the burden on them
– It is really hard to improve their awareness, knowledge, and motivation
41.
Reflections on Privacy
Consider entire ecosystem
• End-users
• Developers
• Third-party developers
• Markets
• OS
• Third-party advocates
– Ex. FTC, Consumer Reports
42.
Reflections on Privacy
Helping Developers
• Point of greatest leverage
• Examples:
– Better understanding of 3rd party libs
– Better design patterns for privacy
– Better APIs (sketched below)
• "Home" or "work" vs precise location
– Better reusable components
• Databases and ACID properties
• Make the path of least resistance privacy-sensitive
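A sketch of the "better API" idea above: return a semantic place instead of raw coordinates, so the privacy-preserving call is also the easiest one. This is an illustration, not a real Android API; the places, coordinates, and threshold are made up:

```python
from math import hypot

# Known places for this user (illustrative coordinates).
PLACES = {"home": (40.4435, -79.9435), "work": (40.4443, -79.9455)}

def semantic_location(lat, lon, radius_deg=0.001):
    """Return 'home' or 'work' if within roughly 100 m of a known place,
    else just 'other'; never the precise coordinates."""
    for name, (plat, plon) in PLACES.items():
        if hypot(lat - plat, lon - plon) <= radius_deg:
            return name
    return "other"

print(semantic_location(40.4436, -79.9436))  # 'home'
```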
43.
Mobile App
• Scans the apps on your phone and gets grades from our site
• We just need to add it to the Google Play store
Speaker notes and references:
Lin et al., Expectation and Purpose: Understanding Users' Mental Models of Mobile App Privacy through Crowdsourcing. Ubicomp 2012. http://www.cmuchimps.org/publications/expectation_and_purpose_understanding_users_mental_models_of_mobile_app_privacy_through_crowdsourcing_2012/pub_download
Moto Racing / https://play.google.com/store/apps/details?id=com.motogames.supermoto
http://www.cmuchimps.org/publications/the_privacy_and_security_behaviors_of_smartphone_app_developers_2014/pub_download
On the left is a Nissan Maxima gear shift. It turns out my brother drove in 3rd gear for over a year before I pointed out to him that 3 and D are separate. The older Nissan Maxima gear shift on the right makes this mistake hard to make.
INTERNET, READ_PHONE_STATE, ACCESS_COARSE_LOCATION, ACCESS_FINE_LOCATION, CAMERA, GET_ACCOUNTS, SEND_SMS, READ_SMS, RECORD_AUDIO, BLUETOOTH, and READ_CONTACTS
Lin et al, Modeling Users’ Mobile App Privacy Preferences: Restoring Usability in a Sea of Permission Settings. SOUPS 2014.
The draw of the crowd is devilishly strong. Studies have demonstrated that if you have lots of people looking up, pretty much every passerby will look up too.
http://www.carlsonschool.umn.edu/assets/118359.pdf
Das, S., A. Kramer, L. Dabbish, J.I. Hong. Increasing Security Sensitivity with Social Proof: A Large-Scale Experimental Confirmation. In The 21st ACM Conference on Computer and Communications Security (CCS 2014). [19.5% acceptance rate] http://www.cmuchimps.org/publications/increasing_security_sensitivity_with_social_proof_a_large_scale_experimental_confirmation_2014
The results are more subtle than presented in the table; see the CCS 2014 paper for details. The basics hold, though: the social conditions worked better than control in almost every case.
Additional thanks: DARPA, Google, CMU CyLab.