We need to move beyond a black-and-white understanding of “privacy” as absolute information lock-down, where the level of information shared is inversely related to “how much” privacy you have. People today want to share information. Lots of it. We get that. But people also want to share information in trusted environments. How do we build trusted environments? With transparency, accountability and choice. This is why we need to understand “privacy” as a system that supports these principles, not as an absolute quality. Some people will share a lot if given the opportunity; some will share very little. But in a system where these principles are supported, all have “privacy”. “Truth in privacy” is a commitment to building trusted systems that provide users with transparency, accountability and choice.
In February 2010 Google introduced “Buzz”, a social-networking extension of its popular email service, Gmail. By default, Buzz publicly disclosed a Gmail user’s most frequently emailed contacts. There was a prominent news story of a woman whose most frequently emailed contacts were exposed to her abusive ex-husband. Google Buzz failed to provide adequate choice, because it was choice after the fact, rendering the choice meaningless. Google also had to learn the opt-out/opt-in rule: if you’re going to change the visibility or sharing settings for previously collected personal data (e.g. email addresses), you need to make it an opt-in process for users. Within weeks Google had updated Buzz to make contact-list sharing an explicit opt-in feature.
In April 2010 Facebook released Instant Personalization, an Internet-wide “Like” button and the concept of the Social Graph. Facebook also made users’ “Likes” public information by default. These changes provoked a fierce and largely negative privacy reaction from many of its users, as well as intense scrutiny from the media and regulators, who argued that the site’s privacy settings were too obtuse or inadequate to allow consumers to make meaningful choices about the privacy of their information. A month later Facebook responded with a simplified privacy control panel, reduced the amount of basic information that must be visible to everyone, and gave users greater control over third-party applications and websites. Facebook also provided users with a global opt-out choice for the Instant Personalization feature.
Source: TRUSTe Brand Survey 2009. Not publicly available.
Slide #5 Source: “Future of Privacy Forum Online Behavioral Advertising ‘Icon’ Study”, January 25, 2010. http://futureofprivacy.org/final_report.pdf
Source: Ibid.
Source: http://www.truste.com/about_TRUSTe/press-room/news_truste_smb_neglect_privacy.html
These icons were designed by Aza Raskin of Mozilla as part of the company’s privacy icon project. See: http://www.azarask.in/blog/post/privacy-icons/#
Source: KPMG Mobile Banking Survey 2009 of 4,190 mobile device users
Source: Ibid.
Source: Geolocation Survey conducted by Webroot on July 13, 2010
1. Source: We tested TRUSTed Ads, TRUSTe’s behavioral advertising notice-and-choice product, on a Publisher’s Clearing House webpage over the course of a few months. Over millions of impressions we measured an opt-out rate of less than 1 percent. Feedback we collected from a consumer survey suggests that what consumers really want is transparency and choice; they don’t actually want to opt out. Is this counterintuitive? No. It tells us that what consumers really want is to feel in control again, a control they feel they’ve lost over the years as online tracking activities have proliferated in number and complexity. So the lesson is that consumers don’t need to opt out to feel in control. The choice to opt out is itself powerful enough to restore that sense of control and build trust. The most common feedback we received from the survey was that consumers don’t like ads. Yeah, we already knew that. You know what they do like, though? Free, ad-supported content, services, and entertainment.
1. <ul><li>Truth in Privacy: the Year Ahead </li></ul><ul><li>Chris Babel </li></ul><ul><li>CEO </li></ul><ul><li>TRUSTe </li></ul>
2. What is “Truth in Privacy”? <ul><li>Definition </li></ul><ul><li>Providing transparency, choice and accountability when collecting and using personal information. </li></ul><ul><li>Transparency – what are you going to do with my data? </li></ul><ul><li>Choice – where can I make decisions about my data use before, not after the fact? </li></ul><ul><li>Accountability – how will you protect my data and respond in the event of breach or misuse? </li></ul>
3. What Does It Look Like In Practice? Transparency Accountability Choice
4. Lessons Learned In 2010: Google Buzz Before After What they fixed: Choice
5. Lessons Learned In 2010: Facebook’s Social Graph Before After What they fixed: Transparency and Choice
6. Looking Forward <ul><li>Why and </li></ul><ul><li>where can we </li></ul><ul><li>better implement </li></ul><ul><li>these principles </li></ul><ul><li>in 2011? </li></ul>Transparency Accountability Choice
10. 2011: Mobile Privacy Concerns Are Prevalent 87% of consumers are concerned about privacy and security on mobile devices 1 55% of consumers fear loss of privacy through mobile apps and geo-location services 2 66% of consumers are not comfortable using their mobile device for financial transactions 1 <ul><li>KPMG Mobile Banking Survey 2009 of 4,190 mobile device users </li></ul><ul><li>Geolocation Survey conducted by Webroot on July 13, 2010 </li></ul>
11. 2011: Offer Choices Around Mobile Location Data Use
12. 2011: Adapt Privacy Notices To The Mobile Screen
13. 2011: With Apps, It’s Not Just About Mobile Social Apps Browser Apps Desktop Apps
14. 2011: Advertising Transparency And Choice Advertising Option Icon
15. 2011: Provide Advertising Privacy Choices
16. 2011: Don’t Fear The Opt-out <ul><li>What happens when millions of consumers are offered the choice to opt out of online tracking? </li></ul><ul><li>~1% clicked the icon 1 </li></ul><ul><li>~1% changed preferences 1 </li></ul><ul><li>Results from testing of TRUSTed Ads on www.pchlotto.com </li></ul>