Intelligent Agents for Helping Humanity Reach Its Full Potential, by Jason Hong
Within fifty years, we will build and deploy highly personalized intelligent agents that can help us find, set, and meet hard goals to improve our lives in meaningful ways that we choose. Think of it as a cross between a lifelong coach, a caring uncle, and an honest and supportive friend. Or, if you are into science fiction, consider it as a combination of Samantha in the movie Her, the Young Lady’s Primer from the book The Diamond Age, and Minds from Iain Banks’s The Culture series. Let’s call this agent Maslow.
An Architecture for Privacy-Sensitive Ubiquitous Computing, at Mobisys 2004, by Jason Hong
Some older research I did looking at one way of building privacy-sensitive apps for ubiquitous computing environments. The core idea is to focus on locality, where all of the data is sensed and processed locally as much as possible.
Privacy is the most often-cited criticism of ubiquitous computing, and may be the greatest barrier to its long-term success. However, developers currently have little support in designing software architectures and in creating interactions that are effective in helping end-users manage their privacy. To address this problem, we present Confab, a toolkit for facilitating the development of privacy-sensitive ubiquitous computing applications. The requirements for Confab were gathered through an analysis of privacy needs for both end-users and application developers. Confab provides basic support for building ubiquitous computing applications, providing a framework as well as several customizable privacy mechanisms. Confab also comes with extensions for managing location privacy. Combined, these features allow application developers and end-users to support a spectrum of trust levels and privacy needs.
Authors are Jason Hong and James Landay
Topiary: A Tool for Prototyping Location-Enhanced Applications, at UIST 2004, by Jason Hong
A tool we created for rapidly prototyping location-enhanced apps. The key idea is to use a few basic abstractions at design time to support location features, and then to use a Wizard of Oz approach at run time to help with testing.
Location-enhanced applications use the location of people, places, and things to augment or streamline interaction. Location-enhanced applications are just starting to emerge in several different domains, and many people believe that this type of application will experience tremendous growth in the near future. However, it currently requires a high level of technical expertise to build location-enhanced applications, making it hard to iterate on designs. To address this problem we introduce Topiary, a tool for rapidly prototyping location-enhanced applications. Topiary lets designers create a map that models the location of people, places, and things; use this active map to demonstrate scenarios depicting location contexts; use these scenarios in creating storyboards that describe interaction sequences; and then run these storyboards on mobile devices, with a wizard updating the location of people and things on a separate device. We performed an informal evaluation with seven researchers and interface designers and found that they reacted positively to the concept.
Authors are Yang Li, Jason Hong, and James Landay
Increasing Security Sensitivity With Social Proof: A Large-Scale Experimenta..., by Jason Hong
One of the largest outstanding problems in computer security is the need for higher awareness and use of available security tools. One promising but largely unexplored approach is to use social proof: by showing people that their friends use security features, they may be more inclined to explore those features, too. To explore the efficacy of this approach, we showed 50,000 people who use Facebook one of 8 security announcements (7 variations of social proof and 1 non-social control) to increase the exploration and adoption of three security features: Login Notifications, Login Approvals, and Trusted Contacts. Our results indicated that simply showing people the number of their friends who used security features was most effective, and drove 37% more viewers to explore the promoted security features compared to the non-social announcement (thus, raising awareness). In turn, as social announcements drove more people to explore security features, more people who saw social announcements adopted those features, too. However, among those who explored the promoted features, there was no difference in the adoption rate of those who viewed a social versus a non-social announcement. In a follow-up survey, we confirmed that the social announcements raised viewers’ awareness of available security features.
Teaching Johnny not to Fall for Phish, at APWG CeCOS 2009, by Jason Hong
An overview of our group's work on teaching people not to fall for phishing attacks, using simulated phish. The summary is that simulated phish work surprisingly well, in terms of learning and retention.
Introduction to User Experience and User Interface Design: A One-Hour Crash C..., by Jason Hong
A one-hour crash course on UX design and User Interface Design. I talk about methods for understanding users (contextual inquiry, diary studies, bodystorming), basic design principles (layout, color, mental models, grid), rapid prototyping (building user interfaces quickly, paper prototypes), and evaluation (heuristic evaluation).
User-Controllable Security and Privacy for Pervasive Computing, at Hotmobile..., by Jason Hong
We describe our current work in developing novel mechanisms for managing security and privacy in pervasive computing environments. More specifically, we have developed and evaluated three different applications, including a contextual instant messenger, a people finder application, and a phone-based application for access control. We also draw out some themes we have learned thus far for user-controllable security and privacy.
CEO Flavio Cattaneo presented the results for the first nine months and the third quarter of 2010, which were examined and approved by the Board of Directors of TERNA SpA (“Terna”), meeting today under the chairmanship of Luigi Roth.
Soave: recognition from the President of the Republic and the President of the Senate for the national pilgrimage dedicated to the fallen. A commemoration 70 years after the tragic battle fought in January 1943 that made the retreat from Russia possible. The monument reproduces the railway underpass at Nikolajewka, the starting point of the historic retreat. Those present included the Undersecretary for Defence, Gianluigi Magri; Senator Cinzia Bonfrisco; MP Alberto Giorgetti; regional councillor Massimo Giorgetti; and the President of the Province, Giovanni Miozzi.
Software runs our world: the cars we drive, the phones we use, the websites we browse, the entertainment we consume. In every instance, privacy risks abound. How do software development teams design and build software to ensure personal data is protected?
Attend this webcast to learn practical tips for building software applications that protect personal data. Understand the requirements of new laws such as GDPR and the impact they have on software development.
Topics covered:
• Designing for Privacy: least privilege and compartmentalization
• Creating a privacy impact rating
• Implementing application privacy controls
• Techniques for effective privacy testing
Data science and visualization lab presentation, by iHub Research
The Data Science and Visualization Lab! This product is based on a component of research that delves into and innovates on the processes of data science: collection, storage/management, analysis, and visualization. You have probably come across one of our amazing infographics. What else can you do with data?
A management introduction to IoT - Myths - Pitfalls - Challenges, by Sven Beauprez
An introduction to the Internet of Things, based on an overview of our society, what an implementation of the Internet of Things looks like from a bird's-eye view, and some of the pitfalls and challenges that come with IoT.
This presentation was given on several occasions to C-Level management, lawyers, students, techies,...
Workshop session given at the Institutional Web Management Workshop 2012 (IWMW 2012) event held at the University of Edinburgh on 18th - 20th June 2012.
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
Climate Impact of Software Testing at Nordic Testing Days, by Kari Kakkonen
My slides at Nordic Testing Days, 6.6.2024.
The talk discusses the climate impact and sustainability of software testing. ICT and testing must carry their part of the global responsibility to help with climate change. We can minimize the carbon footprint, but we can also have a carbon handprint, a positive impact on the climate. Sustainability can be added to the quality characteristics and then measured continuously. Test environments can be used less, at smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
Generative AI Deep Dive: Advancing from Proof of Concept to Production, by Aggregage
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
Epistemic Interaction - tuning interfaces to provide information for AI support, by Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
DevOps and Testing slides at DASA Connect, by Kari Kakkonen
My and Rik Marselis's slides at the DASA Connect conference, 30.5.2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps is. We ended with a lovely workshop in which participants tried to find different ways to think about quality and testing in different parts of the DevOps infinity loop.
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -..., by DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor..., by Neo4j
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
Communications Mining Series - Zero to Hero - Session 1, by DianaGray10
This session provides an introduction to UiPath Communication Mining, its importance, and a platform overview. You will acquire a good understanding of the phases in Communication Mining as we go over the platform with you. Topics covered:
• Communication Mining Overview
• Why is it important?
• How can it help today’s business and the benefits
• Phases in Communication Mining
• Demo on Platform overview
• Q/A
Securing your Kubernetes cluster: a step-by-step guide to success!, by KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
Context Fabric: Privacy Support for Ubiquitous Computing
1. Context Fabric: Privacy Support for Ubiquitous Computing
Jason I. Hong
Group for User Interface Research
University of California, Berkeley
2. Ubiquitous Computing Scenario (Apr 24 2003)
Diversity of devices
Mobile and embedded
Many kinds of interactions
Many kinds of sensors
All networked together
3. Privacy and Ubicomp
• Tension: information can be used for great benefit and great harm
• Privacy is the most often-cited criticism of Ubicomp
– “The Boss That Never Blinks” [San Jose Mercury News 1992]
• What is new here is the scale of Ubicomp
– Past: costly to collect, store, and use info
– Future: everywhere, always on, far easier to collect data
4. Problem
• Hard to create privacy-aware Ubicomp systems
– Hard to analyze privacy
• What should the privacy goals be?
• Which system interactions should we focus on?
– Hard to implement privacy-aware systems
• What are the basic abstractions?
• What are the privacy mechanisms?
5. Solution Overview
• Approximate Information Flows
– Framework for analyzing privacy in terms of info flow
– Minimize flow out of sensitive data, maximize flow back in of how that data is used
• Context Fabric, infrastructure for privacy-aware apps
– InfoSpaces, repositories of personal data
– Operators, reusable mechanisms for managing info flow
• Evaluation through building apps
– Person Finder & Building Emergency Response
6. Talk Outline
Motivation
Privacy and Managing Information Flows
Architectural Overview of Context Fabric
Applications Built with Context Fabric
Conclusions
7. Defining Information Privacy
• Different kinds of privacy
– Territorial, Bodily, Communications, Information
• Information Privacy conflates many issues
– Security, Confidentiality, Anonymity
• Defining Information Privacy [Westin 1967]
– “Privacy is the claim of individuals, groups or institutions to determine for themselves when, how, and to what extent information about them is communicated to others”
• My work is on providing end-users with greater control and understanding
8. Privacy & Managing Information Flows
• Control & understanding hard due to how info flows
• Examples
– Collecting info without person knowing
– Sharing (or selling) info without person knowing
• Design Goal: Manage information flows by
– Minimizing flow of outgoing sensitive data (control)
– Maximizing flow of incoming data about use (understanding)
[Diagram: sensitive data flows out from You to Service Providers, and information about its use flows back in]
9. Example of Managing Information Flow
Alice, a Tourist
• First time touring France
• PDA, GPS, maps, wireless
Bob, Provides Real-time Tourist Info
• Lines at Museums
• Current Events
• Recommendations
• Route Finder
Carol, a Tour Operator
• Sets up tour packages
• Wants demographics
• Wants places visited
10. Example of Managing Information Flow
Alice, a Tourist
• Reads a review
• Finds Bob's website
• Skims privacy policy
• Decides to try
Bob, Provides Real-time Tourist Info
11. Example of Managing Information Flow
Bob, Provides Real-time Tourist Info
• Basic service
– Demographics + City
– Events, museum lines
• Gold service
– Demographics + GPS
– Recs, route finder
• Will sell aggregated data
12. Example of Managing Information Flow
Alice, a Tourist
• Opts for Basic Service
• Logs outgoing data
Bob, Provides Real-time Tourist Info
13. Example of Managing Information Flow
Bob, Provides Real-time Tourist Info (sharing with Carol, a Tour Operator)
• Lower Precision
• Aggregate
• Garbage Collect
• Log outgoing
14. Approximate Information Flows
• Approximate Information Flows is a framework for analyzing information flows in Ubicomp systems
• Two questions:
– When does data flow to others?
– Strategies for protecting that data?
15. When Does Data Flow to Others
• Collection, when data is gathered
– Ex. When Alice gets her location data (ex. GPS)
• Access, when data is first requested or provided
– Ex. Alice sends her location data to Bob
• Second use, sharing data after access
– Ex. Bob shares data with Carol
16. Strategies for Protecting Data
• Prevent privacy violations from occurring
– Ex. Refuse request, Turn off device
– Minimizing flow out of sensitive data
• Avoid potential privacy risks
– Ex. Lowering precision, Notification
– Minimizing flow out & maximizing flow in
• Detect any privacy violations
– Ex. Internal and third-party audits of Bob and Carol
– Maximizing flow in about how data is used
• I am focusing on Avoidance & Detection
17. Technical & Non-Technical Solutions
• Privacy cannot be managed by Technology alone
• Appropriate Technology can make it easier for other forces to act
[Diagram: Privacy at the center of four forces, Social, Market, Legal, and Technology; after Lessig, “Architecture of Privacy”]
18. Talk Outline
Motivation
Privacy and Managing Information Flows
Architectural Overview of Context Fabric
Applications Built with Context Fabric
Conclusions
19. Assumptions
• Pessimistic case: Designers and service providers don't care or are trying to violate users' privacy
• My work is on Optimistic case: Designers and service providers trying to deploy privacy-aware apps
– Minimize privacy risks (perceived and real) for their users
• Market, Social, Legal Forces support optimistic case
– Market: Toysrus.com
20. Building Privacy-aware Apps Today
• P3P (Platform for Privacy Preferences)
– Focuses on communicating policy and obtaining consent
• Privacy Mirrors
– GUIs for helping people understand how the system is tracking them
– No control over how information flows or how to build
• Cricket Location Beacons
– Does not deal with sharing of information
• Ubicomp infrastructures [ParcTab system, Context Toolkit]
– No support for privacy or end-user control
21. Architectural Requirements
• Easy to create privacy-aware Ubicomp apps
• Low barrier to entry
– Make it simple for programmers, admins, end-users
• Easy to add or modify app-specific privacy controls
• Easy for end-users to control and understand
• Easy to share info at a level users are comfortable with
22. High-Level Architecture
[Diagram: a GPS sensor adds a Loc (GPS) tuple to Alice's Information (her InfoSpace); Operators mediate the flow between Alice's Information and Bob's Information, which serves Bob's Service and the Tourguide App]
23. High-Level Architecture
[Diagram: an operator lowers Alice's Loc (GPS) tuple to Loc (City) before it flows from Alice's Information to Bob's Information, Bob's Service, and the Tourguide App]
24. High-Level Architecture
[Diagram: in exchange for Alice's Loc (GPS) tuple, Events tuples flow from Bob's Service back into Alice's Information through the Operators]
25. InfoSpaces
• Key abstraction is the InfoSpace
– Represents data about a single entity
– Like an object with dynamic properties
– Decentralize data, put in user’s hands
– Implemented as a TupleSpace
• InfoSpaces contain Tuples
– Represents single piece of data
– Sensors & Apps can add or query Tuples
– Default value is UNKNOWN
– Tuples can point to other InfoSpaces
[Diagram: Alice’s InfoSpace holds Loc, Activity, Name, and Room tuples; the Room tuple links to Room 525’s InfoSpace, which holds Temp and Sound Level tuples]
26. Tuples
• Metadata
– Data type (ex. "location")
– Data format (ex. "edu.berkeley.soda.room")
• Values
– Value (ex. "525" with 88% confidence)
– Link to InfoSpace (ex. "http://guir.berkeley.edu/rooms/525/")
• Privacy Tag
– When this Tuple should be garbage collected
– Who to notify on second use
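The slides describe InfoSpaces and Tuples only at the conceptual level. As an illustration only, here is a minimal Python sketch of those abstractions; all class and field names are hypothetical, not Context Fabric's actual API. A tuple carries metadata, a value with confidence, an optional link to another InfoSpace, and a privacy tag, while an InfoSpace is a tiny tuplespace that sensors and apps can add to and query.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

UNKNOWN = object()  # sentinel: a tuple's default value is UNKNOWN

@dataclass
class PrivacyTag:
    """Privacy metadata carried along with a tuple."""
    expires_at: Optional[datetime] = None       # when to garbage collect
    notify_on_second_use: Optional[str] = None  # who to notify, e.g. "mailto:alice@example.org"

@dataclass
class Tuple:
    """A single piece of data about an entity."""
    data_type: str                  # ex. "location"
    data_format: str                # ex. "edu.berkeley.soda.room"
    value: object = UNKNOWN         # ex. "525"
    confidence: float = 1.0         # ex. 0.88
    link: Optional[str] = None      # link to another InfoSpace
    tag: PrivacyTag = field(default_factory=PrivacyTag)

class InfoSpace:
    """Repository of tuples about a single entity."""
    def __init__(self, owner: str):
        self.owner = owner
        self.tuples: list[Tuple] = []

    def add(self, t: Tuple) -> None:
        self.tuples.append(t)

    def query(self, data_type: str) -> list[Tuple]:
        return [t for t in self.tuples if t.data_type == data_type]
```

For example, a GPS sensor would `add` a location tuple to Alice's InfoSpace, and an app would `query("location")` for it.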
27. Operators
• Pieces of chainable code for manipulating Tuples
– Designed for reusability and extensibility
28. Operators
• Pieces of chainable code for manipulating Tuples
– Designed for reusability and extensibility
• In-Operators modify incoming tuples
– Ex. Check that we are only receiving data we are allowed to see ("please don't pass on to other people")
[Diagram: tuples from a Source pass through In-Operators on their way into Alice’s InfoSpace]
29. Apr 24 2003 29
Operators
• Pieces of chainable code for manipulating
Tuples
– Designed for reusability and extensibility
• In-Operators modify incoming tuples
• Out-Operators modify outgoing tuples
– Ex. Lowering precision of data
Sink
Alice’s
InfoSpace
Alice’s
InfoSpace
TupleTupleTuple
TupleTupleTuple
Out Operators
30. Apr 24 2003 30
Operators
• Pieces of chainable code for manipulating tuples
– Designed for reusability and extensibility
• In-Operators modify incoming tuples
• Out-Operators modify outgoing tuples
• On-Operators run periodically on tuples in
InfoSpace
– Ex. Garbage Collection
On OperatorsAlice’s
InfoSpace
Alice’s
InfoSpace
TupleTupleTuple
TupleTupleTuple
31. Apr 24 2003 31
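A minimal sketch of the chaining idea, assuming tuples are modeled as string maps; Confab's real operator interfaces may differ. The example out-operator coarsens a location field before data leaves an InfoSpace.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.UnaryOperator;

// Illustrative operator chain: apply each operator in order; an
// operator may return null to drop the tuple entirely.
public class OperatorChain {
    public static Map<String, String> run(
            List<UnaryOperator<Map<String, String>>> operators,
            Map<String, String> tuple) {
        for (UnaryOperator<Map<String, String>> op : operators) {
            if (tuple == null) break; // a previous operator dropped the tuple
            tuple = op.apply(tuple);
        }
        return tuple;
    }

    // Example out-operator: replace a precise GPS value with a coarse one.
    public static UnaryOperator<Map<String, String>> lowerPrecision(String coarseValue) {
        return t -> {
            Map<String, String> copy = new HashMap<>(t);
            copy.put("loc", coarseValue);
            return copy;
        };
    }
}
```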
Suite of Privacy Techniques
• Privacy Techniques for Managing Info Flows
– Lowering Precision
– Access Control
– Logging and Periodic Reports
– Privacy Tags
– Garbage Collection
• All implemented as in-, out-, or on-operators
Privacy Technique
Lowering Precision
• Problem: Some tuples provide too much info
• Solution: Lower precision of data
– Minimize flow of outgoing data by reducing quality
• Tourist Example
– "Alice is at 56°N 36°E" => "Alice is in Paris"
– Implemented as out-operator using a region lookup
[Map: precise coordinates generalized to city regions such as Paris and Marseilles]
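The region lookup can be sketched as a bounding-box table; the coordinates below are rough illustrative boxes, not real city limits, and the method name is invented.

```java
// Hedged sketch of an out-operator's region lookup: map a precise
// lat/lon to a city name, or to UNKNOWN (Confab's default value).
public class PrecisionLowering {
    // Each region: {minLat, maxLat, minLon, maxLon} (illustrative boxes)
    static final double[][] BOXES = {
        {48.8, 48.9, 2.2, 2.5},   // Paris (rough)
        {43.2, 43.4, 5.3, 5.5},   // Marseilles (rough)
    };
    static final String[] NAMES = {"Paris", "Marseilles"};

    public static String toCity(double lat, double lon) {
        for (int i = 0; i < BOXES.length; i++) {
            double[] b = BOXES[i];
            if (lat >= b[0] && lat <= b[1] && lon >= b[2] && lon <= b[3])
                return NAMES[i];
        }
        return "UNKNOWN";
    }
}
```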
Privacy Technique
Access Control
• Problem: Want to provide different info in different
situations
• Tourist Example
– Let Bob see my location at city level only
• Emergency Response Example
– Let firefighters see my room location when I am in Soda
Hall
• Other Examples
– Let all people in Soda Hall see my location at floor level
– Let co-workers see my location if between 9AM and 5PM
Privacy Technique
Access Control
• Solution: Fine-grained control through Conditions
• Conditions
– Age of data, Data Format, Data Type
– Requestor Domain, Requestor ID, Requestor Location
– Current Time
• Actions
– Allow
– Deny (forbidden)
– Hide (data is removed)
– Invisible (no outgoing data)
– Lower Precision
– Set (fake value)
– Timeout (fake network load)
– Interactive
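A toy evaluation of condition-to-action rules, using the examples from the previous slide; the rule encoding and requestor naming scheme here are my own simplification, not Confab's rule language.

```java
// Illustrative access-control decision: conditions on requestor
// identity and current time map to one of the slide's actions.
public class AccessControl {
    public static String decide(String requestorId, int hourOfDay) {
        // "Let Bob see my location at city level only"
        if (requestorId.equals("bob")) return "LOWER_PRECISION:city";
        // "Let co-workers see my location if between 9AM and 5PM"
        if (requestorId.startsWith("coworker:") && hourOfDay >= 9 && hourOfDay < 17)
            return "ALLOW";
        // Default: pretend the data does not exist
        return "INVISIBLE";
    }
}
```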
Privacy Technique
Logging and Periodic Reports
• Problem: Need better understanding of who knows what about you for auditing purposes
• Solution: Logs and periodic reports
Privacy Technique
Logging and Periodic Reports
• US Federal Trade Commission recently established National Do-not-call list for telemarketers
James Haverly 404-333-3456
Jason Hong 510-345-3456
Tommy Horn 212-567-8910
• Question: What guarantees do we have that telemarketers will not use this list to spam people?
• Answer: Seed with fake data, monitor phone calls to fake entries, punish violators (detection)
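The detection step above can be sketched in a few lines: any observed contact to a seeded fake entry is evidence of misuse. The class and method names are invented for illustration.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

// Sketch of seeded-fake-data detection: compare observed contacts
// against the set of fake entries that were planted in the list.
public class SeedDetector {
    public static List<String> violations(Set<String> fakeEntries,
                                          List<String> observedContacts) {
        List<String> hits = new ArrayList<>();
        for (String contact : observedContacts) {
            if (fakeEntries.contains(contact)) hits.add(contact);
        }
        return hits; // any hit means the list was misused
    }
}
```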
Privacy Technique
Seeding Fake Data and Periodic Reports
Alice, a Tourist
Bob, Provides Real-time Info
• Alice, EPIC, or Consumer
Reports sends fake data
• Checks they are receiving
periodic reports properly
• Also monitors for spam
and other misuses
Privacy Technique
Seeding Fake Data and Periodic Reports
Carol, a Tour Operator
Bob, Provides Real-time Info
• Wants assurances
Carol won't abuse data
• Notifies Carol
• Create fake people
with email addresses
• Monitors results for
abuses
Privacy Technique
Privacy Tags
• Problem: Need a way of controlling data after it has left one’s InfoSpace (second use)
• Solution: Tag each tuple with usage preferences
• Email Analogy
– “Please don't forward this to anyone else”
– “Please delete this in three days”
• Example Privacy Tags
– For Bob and only Bob
– Garbage collect if data over week old
– Garbage collect if user leaves Soda Hall
– Who to notify on violations (Ex. jasonh@cs.berkeley.edu)
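The age-based tag above pairs naturally with an on-operator that periodically sweeps expired tuples. A minimal sketch, assuming each tuple is reduced to a {createdAt, maxAge} pair of timestamps; the representation is illustrative only.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of a garbage-collection on-operator: keep only tuples whose
// privacy-tag lifetime has not yet expired.
public class TupleGarbageCollector {
    // Each tuple: {createdAtMillis, maxAgeMillis}
    public static List<long[]> sweep(List<long[]> tuples, long nowMillis) {
        List<long[]> kept = new ArrayList<>();
        for (long[] t : tuples) {
            long createdAt = t[0], maxAge = t[1];
            if (nowMillis - createdAt <= maxAge) kept.add(t);
        }
        return kept;
    }
}
```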
Privacy Technique
Peer Enforcement of Privacy Tags
[Diagram: Alice's Loc tuple, tagged "delete in 7 days", flows from Alice's InfoSpace to Bob's and on to Carol's; Bob holds data he shouldn't, and peers enforce the privacy tag.]
Architectural Details
• InfoSpaces
– Leverages web servers, HTTP
– Ex. http://www.cs.berkeley.edu/~jasonh/infospace
– Push data to edge, to where end-users have control
– Low barrier to entry for admins, programmers, & end-users
• Operators
– Separate functionality into composable components
– Easy for programmers & end-users to add or modify
• Tuples
– Uses XML docs vs. mobile objects or RPC (hidden state)
Implementation Details
• All written in Java 1.4
– http://sourceforge.net/projects/confab
– 410 Source classes
– 18500 Lines of Code
• Uses Apache Tomcat web server
• Other infrastructure has been built using my
APIs
– Liquid Distributed Querying package
– C++ Confab-lite PDA version (subset of full version)
Talk Outline
Motivation
Privacy and Managing Information Flows
Architectural Overview of Context Fabric
Applications Built with Context Fabric
Conclusions
Applications
• Person Finder
– Like AT&T m-life's Find Friends app but for web
• Building Emergency Response Service
– Based on field studies with firefighters
• Currently using Wizard of Oz simulations for data
rather than actual sensors
– Make sure on right track before devoting time to
sensors
– Sensors being deployed in Berkeley now
Building Emergency Response Service
• One of a suite of applications we are
developing
• Keep track of people in a building
– Help building managers check if a building is clear in
the event of an evacuation
– Help firefighters understand where people are
• Also provide reasonable privacy protection
– People don't like to be tracked
– Emergency situations relatively rare
– Ensuring that data is used properly
Field Study and Iterative Design
with Firefighters
• What are big problems sensors can help with?
– Four month field study, 30+ hours
– Iterative prototyping and evaluation with firefighters
– Gives the tools we build a better chance of succeeding
• Firefighters said knowing where people were in a building would help determine their strategy
Building Emergency Response Service
Software Prototype 1
[Diagram, shown as a progressive build: Location Beacons feed Alice's personal info into Alice's InfoSpace. An out-operator sends Alice (Room) to the Emergency Response InfoSpace, which notifies Alice, and Alice (Floor) to the Building General Use InfoSpace. Emergency Response Apps see Person (Room) entries; General Purpose Apps use the Building General Use InfoSpace.]
Architecture Evaluation Plan
• First iteration of infrastructure done
– Two prototype apps with simulated data
– C++ Confab-lite for PDAs (3 weeks by 1 person)
– Liquid Distributed Querying (4 weeks by 3 people)
• Low barrier to entry
– Person Finder and Emergency Response ~ 1 week each
– Further evaluation by having others create simple apps
• Easy to add or modify app-specific privacy controls
– Each Privacy Operator ~ 5 days each
• Easy for end-users to control and understand
• Easy to share info at level users comfortable with
– Running preliminary user studies
Long-term Evaluation of Privacy
• How effective are the privacy techniques?
– Requires long-term deployment of sensors and
apps
• Surveys since 1990s on privacy have shown three basic groups [Westin]
– Fundamentalists (25%), Pragmatists (63%), Unconcerned (12%)
• Risk / benefit sweet spot?
– Privacy for Safety
– Privacy for Convenience
• My work makes this possible
Contributions
• Framework: Approximate Information Flows
– Minimize flow out, maximize flow in
• Architecture: Context Fabric, infrastructure for
privacy-aware apps
– Privacy Operators for controlling information flow
– Privacy Tags, for limiting dissemination of information
– Peer enforcement for privacy
• Evaluation
– Applications: Person Finder and Emergency Response
– Ease of creating apps by others, in progress
Future Work
Iterative Design of Ubicomp Applications
• Iterative design is the best practice for creating
UIs
• Getting it right the first time is hard
• Lots of experience in iterative design for the Web
that can apply to Ubicomp
[Diagram: Design → Prototype → Evaluate cycle]
Future Work
Design of Ubicomp Applications
• Co-authored Book on
Web Design Patterns
– Used in several classes
– E-commerce sites,
Shopping Carts, Action
Buttons
• What are Design Patterns
for Ubicomp?
– Context-sensitive I/O
– …
• Which existing GUI/Web
patterns apply to Ubicomp?
• Can patterns improve the
speed with which we can
build Ubicomp apps?
Future Work
Prototyping of Ubicomp Applications
• Developed SATIN toolkit
– Infrastructure for sketching
apps
– Ink, interpretation, & zooming
– Downloaded over 1200 times
• Georgia Tech, PARC, NRL,
UCB
• What are the
infrastructure needs of
Ubicomp apps?
– Privacy
– Scalability
• How do you simulate
services that are not yet
ubiquitous?
• What types of higher level
inference services
needed?
UIST 2000
Future Work
Prototyping of Ubicomp Applications
• Co-developed DENIM
– Informal Web prototyping tool
– Downloaded over 13,000
times
• What do rapid prototyping
tools for Ubicomp apps look
like?
– Sketch-based
– Multimodal
– Wizard of Oz
– Programming by
Demonstration
CHI 2000
Future Work
Evaluation of Ubicomp Applications
• Started WebQuilt Project
– Remote Web site usability
testing & analysis tool
– Downloaded over 600 times
• How can we evaluate
Ubicomp apps?
• What are new
methodologies & tools?
• Ubicomp apps often
mobile, so remote
evaluation tools may work
well!
WWW10
Conclusions
• Approximate Information Flows Framework for
analyzing privacy in ubicomp systems
• Context Fabric architecture for privacy-aware
apps
• Evaluation with two applications
• Privacy just one aspect of Ubicomp
– Future Work lies in better tools and methods
needed to Design, Prototype, and Evaluate
Ubicomp
• Ubicomp is coming
– Let’s guide it in the right directions
Verifying Trust Online
• Seals of Approval, Branding
• Audits
• Consumer Reports, Epinions for ratings
• Open Source code that is visible to all
• Web of Trust
– A trusts B, B trusts C, A has a reason to trust C
• Extend research on Web site credibility to
Ubicomp
Incentives for Service Providers
• Market, Social, and Legal forces
– Lower costs for “good guys”, raise costs for “bad guys”
– Make it harder for “bad guys”, more money by being “good”
• We are in the early phases of Ubicomp
– We can help set people’s expectations high
– Make it easy for service providers to do, no excuses
• Bias the system towards privacy
– Even without privacy operators, still pushes data to edge
• Educate future designers and engineers
• Tipping Point for privacy?
– How much buy-in before peer enforcement really effective?
Categorizing Privacy Techniques
[Diagram: privacy techniques organized by strategy (Avoid, Prevent, Detect) against the data lifecycle (Collection, Access, Second Use). Examples include Anonymization, Pseudonymization, and P3P (avoiding collection); Access Control, Lowering Precision, and Location Support (preventing access); Privacy Mirrors, Wearables, and user interfaces for feedback, notification, and consent; and Logging and Periodic Reports, Audits, Privacy Tags, and Garbage Collection (detecting second use).]
Personal Insights
• When starting, I focused on prevention
– Digital Rights Management, Encryption, Mobile Code
– But prevention only goes so far
• Now, focused more on avoidance and detection
– Increase transparency of Ubicomp systems
– "Trust but verify"
Yoda
Beware the Dark Side
of Ubicomp you must!
Jedi Master
Kickass Dude
Context Data Model
Division of Responsibilities
• InfoSpace Server
– Analogous to web servers
– Manages a collection of InfoSpaces
– Unit of administration
– Unit of deployment
• InfoSpace
– Analogous to a web site / homepage
– Represents context data about an entity
– Represents zone of protection
– Manages collection of context tuples
– Unit of ownership and addressing
• Tuple
– Analogous to individual web page
– Represents single piece of context data
– Contains privacy preferences and metadata
– Unit of storage
Architecture Recap
• InfoSpaces represent entities and contain Tuples
• Operators modify flow of Tuples, primarily for
privacy
– Lowering Precision
– Access Control
– Logging and Periodic Reports
– Privacy Tags
– Garbage Collection
[Diagram: tuples flow out of Alice's InfoSpace through Out-Operators to a Sink.]
Thinking about Privacy and Ubicomp
• “The problem, while often couched in terms of
privacy, is really one of control. If the
computational system is invisible as well as
extensive, it becomes hard to know:
– what is controlling what
– what is connected to what
– where information is flowing
– how it is being used
– what is broken
– what are the consequences of any given action
(including simply walking into a room)”
[Weiser 1999]
Context Data Model
InfoSpaces
• TupleSpace meets Web
• TupleSpace
– A shared data space
– add(), remove(), query(), subscribe(), unsubscribe()
– Complexity shifted into data model and query language
• Web
– Leverages existing technology (ex. firewalls)
– Leverages well-understood models for administration,
deployment, authoring, and programming
– End-user mental model
– Independent deployment & anarchic scalability
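The TupleSpace operations listed above can be sketched as a toy in-memory space; subscribe/unsubscribe are omitted, tuples are modeled as string maps, and the class name is my own, not the toolkit's.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Toy TupleSpace with add / remove / query, where query() performs
// template matching: a tuple matches if it contains every field in
// the template.
public class SimpleTupleSpace {
    private final List<Map<String, String>> tuples = new ArrayList<>();

    public void add(Map<String, String> tuple) { tuples.add(tuple); }

    public boolean remove(Map<String, String> tuple) { return tuples.remove(tuple); }

    public List<Map<String, String>> query(Map<String, String> template) {
        List<Map<String, String>> matches = new ArrayList<>();
        for (Map<String, String> t : tuples) {
            if (t.entrySet().containsAll(template.entrySet())) matches.add(t);
        }
        return matches;
    }
}
```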
Quotes on Privacy
• “You know it when you lose it”
• “My own hunch is that Big Brother, if he comes to
the United States, will turn out to be not a greedy
power-seeker but a relentless bureaucrat
obsessed with efficiency” [Vance Packard]
• Privacy relatively new concept in society,
“ultimately a psychological construct, with
malleable ties to specific objective conditions”
[Grudin 2001]
Backup Slides
Personal
Perspectives
My Personal Perspectives on Privacy
Privacy will be a Continuous Struggle
• Privacy will never be “solved”
– Ongoing struggle about relation of individual and
society
• Old issues are now manageable risks
– Photography
– Telephone
• New issues will arise
– Digital rights management
– Genetic databases and genetic profiling
– Detecting AIDS by shaking someone’s hands
• But fundamentals are still the same
– Flow of personal information
My Personal Perspectives on Privacy
Our Role as Researchers
• Core problem is rate of change
– Social, legal, and market can’t adjust fast enough
• Our job as researchers should be:
– Identifying new privacy risks
– Figuring out better legal, social, market, and technical
approaches for managing these risks
• Better architectures and UIs
– Better education – we’re the ones that build these!
• Professionalization of engineers
My Personal Perspectives on Privacy
Terrorism and Privacy
• Do it if:
– It does not unduly treat average people as suspects
– Benefits far outweigh costs
• Total Information Awareness fails b/c of questionable
technology
– Ensure Transparency, Accountability, and Oversight
• Focus on things that have multiple benefits
– Ex. Monitoring of national medical system
• Let’s not forget other issues too
– ~43000 people die every year due to car accidents
– $1 billion plus damage due to fire every year
Why Privacy?
Idealistic Reasons
• UN Universal Declaration of Human Rights Article 12
– "No one shall be subjected to arbitrary interference with his
privacy, family, home or correspondence, nor to attacks
upon his honour and reputation. Everyone has the right to
the protection of the law against such interference or
attacks."
• Old Hippocratic Oath
– "What I may see or hear in the course of the treatment or
even outside of the treatment in regard to the life of men,
which on no account one must spread abroad, I will keep
to myself, holding such things shameful to be spoken
about."
Why Privacy?
Pragmatic Reasons
• Identity theft
• Stalking
• Excessive monitoring
• Data for one purpose tends to be used for others
– Ex. SSN
• The Trackable Society [Kling]
– Stronger enforcement of laws that cannot be done
today
– Ex. Speeding, but better tech could enforce it in
theory
Why Privacy?
Cannot Always Reject Technology
• Oakland nurses successfully rejected active
badges
• Stakeholders
– Admin wanted it for efficiency and accountability
– People at desk liked it to find people
– Nurses hated it because no benefit to them
• However, nurses could reject only because they
had economic upper hand, ie a shortage of
nurses
• We build these systems, so we have a responsibility to address privacy
Why Privacy?
Privacy and Technology, Gary Marx
• Anonymity important for encouraging honesty
and risk-taking
• Confidentiality can improve communication
flows
– Doctors, lawyers
• Resource in inter-personal relations
• American ideal of starting over
• Some information can be used unfairly
– Ex. Religious discrimination
• Mental health and creativity
• Totalitarian systems lack respect for individuals
Medical Record Risks
1997 National Research Council Report
• Insiders who make innocent mistakes and
cause accidental disclosure of confidential
information
• Insiders who abuse their record access
privileges
• Insiders who knowingly access information for
spite or for profit
• An unauthorized physical intruder who gains
access to information
• Vengeful employees and outsiders
Backup Slides
Arguments
Against Privacy
Arguments Against Privacy
• “I have nothing to hide”
• Transparent Society
• Communitarian argument
• We’ll adapt
Arguments Against Privacy
“I have nothing to hide”
• So why close the door when changing clothes?
• Real issue is civil rights and human dignity
– Surveillance gives impression that activity is not
proper
– Surveillance can be a pervasive form of repression
– Privacy also protects us from excessive norms
[Goffman]
– Empower people to choose what is disclosed and
when
Arguments Against Privacy
The Transparent Society, by David Brin
• Openness and accountability
are key to a democratic society
– The technology is coming,
– Let’s opt for complete
transparency
• “In all of human history, no
government has ever known
more about its people than our
government knows about us.
And in all of human history, no
people have ever been
anywhere near as free.”
• Quis Custodiet Ipsos
Custodes?
Arguments Against Privacy
The Limits of Privacy, by Amitai Etzioni
• Communitarian argument
• Ex. Public safety and Health
– HIV testing for newborns
• Ex. Megan’s laws
• Communities and Ubicomp
– Can modify flow of info to match
community and individual needs
– But, not enough experience with
communities & Ubicomp to
judge
– Strong potential for abuse in
Ubicomp, let’s be conservative
Arguments Against Privacy
We’ll adapt
• Warren and Brandeis’ quote, that privacy is “the
right to be left alone”, was about photography (!)
• “One common complaint… was that the telephone
permitted intrusion… by solicitors, purveyors of
inferior music, eavesdropping operators, and even
wire-transmitted germs. Messages come
unbidden; background sounds reveal intimacies of
the home to the caller…” [Fischer 1994]
Arguments Against Privacy
We’ll adapt
• Credit card slips in restaurants easily stolen
• Cell Phones provide rough location
• Fraudulent credit card charges limited to $50
• Let’s take it slow and be conservative here, can
remove privacy over time, can’t put it back in
Related Work
P3P
• Platform for Privacy Preferences Project
– Standard machine-readable format for defining
privacy practices on web sites
– Designed to be integrated into Web Browsers
• Orthogonal to Confab
– Confab focuses on several techniques for privacy
– P3P could be integrated as one of them
• Related Work
– Langheinrich looking at integrating P3P into Ubicomp
Related Work
Research on Lying (1/2)
• Lies are pretty common [DePaulo and Kashy 1996]
– 77 university students and 70 community members
– 1-2 lies daily, 1 in 3 interactions (students)
– Quality of relationships with same gender => fewer
lies
– Kind of lie related to gender
• Self-centered lies told to men
• Other-centered lies told to women
– Socially adroit people told more lies
– Easier to lie when not face-to-face
Related Work
Research on Lying (2/2)
• Cultural differences in lying [Aune and Waters 1994]
– Collectivist (Samoan) vs individualist (USA) societies
– Collectivists more likely to attempt to deceive when
related to group / family or authority-based concerns
– Individualists more likely to lie to protect privacy or the
feelings of the target person
Related Work
Fair Information Practices
• Notice - Notice of data collection
• Choice - Consent over collection
• Onward Transfer - Consent over secondary
use
• Access - See data about self
• Security - Reasonable safeguards
• Data Integrity - Data is accurate
• Enforcement - Enforcing policies and redress
Related Work
OECD Fair Information Practices
• Collection Limitation - Limited collection with
consent
• Data Quality - Relevant and up-to-date
• Purpose Specification - Purpose at time of collection
• Use Limitation - Restrict use to said purposes
• Security Safeguards - Reasonable security
• Openness Principle - Existence of data known
• Individual Participation - Obtain and correct the
data
• Accountability - Someone accountable
Related Work
Commentary on Fair Information Practices
• FIPs meant for governments and corporations
– Need a framework that also deals with individuals
– Also wide range of trust, from family to friends to co-workers
• Spectrum of apps require different kinds of practices
– Commercial apps vs. Firefighter apps vs. National Security apps
– App running at home vs. App running at work
• Notification and Consent impractical in some cases
– Cannot always readily notify (ex. traffic monitoring)
– Possibly no alternatives (cannot opt out of building security
cameras)
– Pervasive sensors significantly increases scale
• Need a framework that considers:
– Risks / Benefits, Identifiability, Quality, Quantity, and Scope of
data
Related Work
Smart Dust / TinyOS / TinyDB
• Small, reconfigurable wireless sensors
• Strongly driven from systems perspective
– Power management
– Dead nodes
– Continuous and adaptive query processing
• This work starts from a human-centered
perspective
– Privacy concerns
– Cost / benefit of ubicomp
– Providing people control over their data
Related Work
Context Toolkit
• Uniform abstraction for sensors
– GPS, beacons, Active Badges map to Location widget
– Interpreters for transforming data
• Confab has different focus
– Data model and data management rather than
sensors
– Privacy focus
Related Work
HP CoolTown
• Web presence for people, places, things
– Associate dynamically updated web pages with
entities
– Beacon out or scan in URLs
• Confab
– Focuses on context for machines vs for people
– Focuses on privacy issues
– Can leverage technologies for transmitting URLs
Related Work
ParcTab System
• Confab is an evolution of ParcTab system
– Splits ParcTab Dynamic Environments into multiple
independent InfoSpaces
• Tries to push data to the edge, to end-users
– Confab focuses on privacy and data manipulation
Related Work
Event Heap / iRoom
• iRoom uses TupleSpace to coordinate events
for a Smart Room
– Level of indirection provides fault-tolerance and
uniform level of abstraction
• Confab also uses TupleSpace
– For same reasons, simple and uniform API
– Focuses on privacy and sharing of data
Related Work
Semantic Web/DAML
• Semantic Web
– Markup for computers rather than for people
– Rich (and complex!) language for modeling
(motherOf subProperty parentOf)
(Mary motherOf Bill)
(Mary parentOf Bill)
• Semantic Web has no story for:
– Privacy, ie individuals managing personal data
– Handling sensor data and dynamic updates
• Confab aims for a simpler model
– Complexity of Semantic Web is huge barrier to entry
– Start simple
Related Work
Web Services
• Three parts to web services
– SOAP, remote procedure call over web
– WSDL, description of service API
– UDDI, registry for web services (like white pages)
• Strength and weakness of web services is
specialized API
– Pro: Lots of semantics, highly tuned for specific app
– Con: No network effects, need new apps for each
service
– Key insight to web is only few methods on lots of
datatypes (like TupleSpace)
Related Work
Grid Computing
• Similar constraints but very different goals
– Scale
– Heterogeneity
– Unpredictable structure
– Multiple administrative domains
• Grid is focused on creating a networked virtual
supercomputer rather than ubicomp services
Future Work
Liquid Distributed Queries
• Querying across multiple InfoSpaces
– Ex. “Average age of all people in the room”
• average (room.people.age)
• Update as people go in and out
– Ex. “Average temperature for division X”
• average (division.members.temperature)
– Ex. “Monitor water pressure for all companies”
• company.water-pressure
• Update as companies go in and out
• Prototype Liquid already works
– Lots of weird cases that need to be handled, though
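A Liquid-style aggregate like average(room.people.age) can be sketched by flattening the nested InfoSpaces into a map from each person's InfoSpace address to the relevant tuple value; the real system would resolve these links dynamically as people come and go. The class and keys here are illustrative.

```java
import java.util.Map;

// Sketch of average(room.people.age): each entry maps a person's
// InfoSpace address to their current "age" tuple value.
public class LiquidAverage {
    public static double averageAge(Map<String, Integer> peopleAges) {
        if (peopleAges.isEmpty()) return Double.NaN; // no one in the room
        double sum = 0;
        for (int age : peopleAges.values()) sum += age;
        return sum / peopleAges.size();
    }
}
```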
Future Work
Query Planning
• Went for simple case first
– Hardwired chain of operators
• Really need query planning here
– Which operators to apply and in what order
– Some weird cases when processing privacy operators
• Should take same approach as databases
– Generate several plans
– Avoid really bad ones
– Execute it
Future Work
Imperfect Mirror Worlds
• Perfect Mirror Worlds may not be desirable
– Highly reliable, highly connected, continuously
updated
• Reasons
– No room to hide
– No room for ambiguity and deniability
– No freedom of action
• Better metaphors?
– Cell phones or Instant Messenger?
– End-user is in control
• How to build physical layer to support this?
Future Work
Physical Layer Privacy
• Inspired by Peter Swire’s “TrustWrap”
– Johnson and Johnson built trust into every transaction: “Customers use their own senses to reaffirm that the Tylenol is safe. They touch the plastic wrap, and they see both the plastic wrap and the foil before they take a pill.”
• How to build trust into sensors?
– People can see and understand the privacy model
– Ex. Cameras that have translucent plastic in front
– Ex. Simple motion sensors (familiarity)
– Local scope, local and immediate feedback
Future Work
Matching Social Expectations
• Berkeley Sproul Plaza
• Seen by hundreds, but no privacy worries,
why?
– Low identifiability
– Limited scope of data
– Reciprocity
– Forgetfulness
– Expectations
• What kinds of applications does this suggest?
Future Work
Trust Mechanisms
• Privacy depends on trust that people will do the
right thing
• Need better trust mechanisms
– Web of Trust (“I trust Alice, Alice trusts Bob”)
– Transparency via Audits (“Trust but verify”)
• Also a problem encountered on e-commerce
sites
– Branding
– Well-designed professional sites
– Ease-of-use
• Need this for Ubicomp
Future Work
Perceptions of Privacy Change Over Time
• Perceptions change with experience [Pew Internet]
– Lowest privacy concerns with white males (first
adopters)
– Internet privacy concerns greater among novices,
parents, elderly, and women (middle-late adopters)
– Online experience increases number of trusting
activities and commercial activities (small sample set
though)
• Likely that this will apply to Ubicomp
• However, due to pervasiveness of Ubicomp,
really should be conservative here
– Early failures could cost us a lot here
Future Work
Designing Context-Aware Systems
• Minimize automatic actions
– Calculate cost-to-benefit, both statically and dynamically
• Feedback
– What is being captured?
– Why did the system do that?
• Feed-forward
– If you do that, then the system will do this
• Confirmation
– The system just did the following action
• Less identifiability
– Model Places and Things rather than people
– Less identifiable entities, ex. “Some person” rather than “Jason”
• Endpoint
– How much context is for people, how much for computers?
Editor's Notes
People have different intuitions about privacy, in this talk, I’m going to focus on a particular type of privacy.
Point of this slide: Ubicomp can be powerful helping us do incredible things that we simply cannot do today.
I'd like to start by describing a scenario that fleshes out many of the key characteristics of ubiquitous computing. So imagine that we have a "smart building" of the future, with a lot of people entering and exiting, and their location is stored by this building for emergency response purposes. For example, suppose there was a fire and a bunch of people are trapped on the 6th floor and they're running out of oxygen, or there was an earthquake, and it could tell you that three people who were in the basement are missing. This kind of information would really help emergency responders mount rescue operations.
Firefighters arrive, wearing heads-up displays for navigating through the building, thermal imaging to help see through the smoke, proximity networking that helps them stay near their companions and even find other firefighters in case they got lost, and sensors that can help warn them of serious dangers, such as flashovers, a special condition when everything in the room suddenly ignites all at once, putting firefighters at great risk.
Firefighters also have a mobile command post to help coordinate all of the information from the firefighters inside. Also has sensors for tracking what firefighters are there and where they are.
Uses a portable electronic whiteboard to display the floorplan, the location of firefighters, even seeing what other firefighters are seeing.
Firefighter example
Wireless, sensors, mobile, wearable
Big displays, electronic paper
Tracking people inside
Photos, firefighter w/ eqpmt, building with fire, people in and out a door, command center
Point of this slide: But privacy is a serious obstacle to long-term success of ubicomp.
Because of all of the privacy risks with ubicomp, both real and perceived, privacy may be the greatest barrier...
Information can be used for benefit and for harm
“Orwellian Dream Come True” [New York Times 1992]
Privacy is a well-known issue with existing tech
Big Brother as well as lots of Little Brothers
How do we know location data is used only for emergency response purposes?
Especially since emergencies are rare
What about other apps built on top of same sensor infrastructure?
Too expensive to deploy sensors just for one app
Will likely need different levels of privacy protection for apps where privacy vs utility tradeoff not as obvious
There are real fears about misuse of data
X10 ad with camera spying on people who don't know they are being tracked
Ubicomp makes these worse
Merge this and next slide together
Privacy is not a new problem, goes back all the way to Hippocratic Oath
Old Hippocratic Oath
"What I may see or hear in the course of the treatment or even outside of the treatment in regard to the life of men, which on no account one must spread abroad, I will keep to myself, holding such things shameful to be spoken about."
Privacy is not new to technology either, many well-known issues
One potential fear is Orwellian Big Brother, omnipresent force always watching you
However, perhaps an even greater danger are Little Brothers, all the people and organizations around you
BusinessWeek http://www.businessweek.com/technology/content/jun2002/tc2002065_2710.htm
Faceless Snoopers Have the Upper Hand
"The issue is that information can be used as a weapon," says Gartner analyst Hunter. He points to the Colombian guerilla group Revolutionary Armed Forces of Columbia, which allegedly searches credit histories to locate and target rich kidnap victims. "We haven't seen this kind of abuse [in America], but that's an indicator of how bad things can get," Hunter warns.
Point of this slide: So I am trying to tackle two related issues, how to analyze privacy, and how to implement privacy-sensitive systems.
First part is to analyze privacy.
Second is, even if we know what we want to build, how do we actually build it?
Too much of the focus has been on getting sensors to work
What about the data? What about privacy?
Problems:
Too easy to collect information about people without their knowledge or consent
Too easy to share that information with others
Too easy for others to collect large amounts of information about people
Make it clear that I don’t believe in purely technological solutions
Lessig framework
Make it easier to do the right thing
Regulate the flow of information
Limiting quantity, quality, and scope of information
Greater feedback and control to you about how your information is being used
Applications
Person Finder & Building Emergency Response Service
Evaluation
Ease of building applications
Point of this slide: To do this, AIF to analyze, and Confab middleware to build
Tie it back to the firefighter example
InfoSpaces, repositories ex. My name, location, etc
Mechanisms, controlling flow between infospaces, ex. Firefighter emergency response location
Point of this slide: Information privacy is the issue here, which is really about control and understanding, not secrecy (which people often confuse privacy with).
The most useful framework I've seen separates privacy into four different types.
The first is Territorial Privacy, privacy of physical places. For example, it is inappropriate for others to just wander into your home, and for government officials to enter your home without just cause.
The second is Bodily Privacy, privacy of the physical person. For example, it is socially unacceptable for strangers to physically touch others without consent.
The third is Communications Privacy, privacy when parties are communicating with one another. For example, I have a reasonable expectation of privacy when I talk to my brother on the phone and when I send him email.
It is the fourth kind of privacy we are most interested in, Information Privacy, or privacy about individuals. This includes things like my name, my address, my current location, and my current activity.
"Determine for themselves"
Territorial and Bodily and Communications privacy mostly covered by 4th amendment (unreasonable searches and seizures)
“Information privacy," an individual's right to control his or her personal information held by others
Four kinds of privacy from David Banisar of EPIC
Defining Information Privacy
Really about control and feedback
Controlling when information about you is disclosed
Feedback about who knows what about you
Civil right
UN Declaration of Human Rights, Hippocratic Oath
Point of this slide: But control and understanding are hard due to asymmetrical information flows. One-way information flows take away control and understanding at the individual level.
Examples:
Collecting info without person knowing. In the firefighter scenario, an example would be if a person who entered the building didn't know that their location was being stored. This might be because the sensors are so small that people can't see them, or that there is no form of notification telling people that they are being tracked.
Sharing info without person knowing. In the firefighter scenario, an example would be if the person thought the info would only be used for emergency response, but turns out that their boss has also been using the info to keep tabs on them.
Information flow a specific case of information asymmetry
Modify diagram to show three entities?
For info asymmetry example, tie together with firefighter story
Too easy to collect, for example newcomer doesn't realize they are being tracked
Too easy to share, for example building manager might share with others
Information Asymmetry developed in the 1970s by George A. Akerlof, A. Michael Spence, and Joseph E. Stiglitz
Bob provides Dynamic information, how long the lines are, what's the traffic, current events
Carol is a tour operator, sets up new tour packages, wants to know a lot about who went where
Better targeting for customer needs
Need a plausible story about degraded data
Why would the advertiser want degraded data?
Demographics, tourist packages
Government too (who goes where)
Sweeney examples?
Setup this scenario better
Add picture of Cyberguide picture etc
Do animations here of GPS, Cyberguide, etc
Who is Alice? What is she doing? She has a PDA, wants to tour Paris
Here is another ubicomp scenario, Alice has GPS, PDA, wireless networking
Give Alice more understanding of what data, and how data will be used
Also, she chooses whether to try it
Give Alice more control over what data flows out when
Again, give Alice more control over the flow of data (what data, when)
Minimizing flow out, even when Alice is not involved directly
He might do this because of a contract (when Alice first signed up)
Legally required
Or because competitors are doing it as well
A bunch of techniques, is there a systematic way of analyzing asymmetries so that you can minimize it?
Can we provide basic operations for manipulating data?
These would reside in the InfoSpace and in the data
To do this, we need to take into account three different questions
Where does the data live?
What happens with the data?
What can people do to protect the data?
These perspectives of data flow are relative to the end-user
Same here, we need a motivation, tying this in to everything we had before
Easiest to describe this with email example
Prevent – Anonymize email, email remailers, blotting out identifiability of people in message
Avoid – Asking other people if they trust the recipient of the email to “do the right thing”, or putting a message at the top saying “please don’t forward”
Detect – Google search seeing that someone posted your email
When starting, I focused on prevention
Digital Rights Management, Encryption, Mobile Code
But prevention only goes so far
Now, focused more on avoidance and detection
Increase transparency of Ubicomp systems
"Trust but verify"
Point of this slide: Information Privacy will need these four forces to succeed, but asymmetry in ubicomp seriously impedes these other forces. Or, a given architecture makes it easier or harder to do certain things. Give lots of examples here to make these abstract concepts concrete.
The key insight is that privacy cannot be managed by tech alone
I have created technology that allows these other forces to be brought to bear on the privacy problem
Focus on prevent, avoid, detect // collect, access, second use
Foreshadow privacy techniques from later on
Fidelity of what is captured, transparency (i.e., a notetaking device that captures just notes, and displays as it captures)
Controlling scope
Push data to edge
Bugs in data? Seed with fake data to detect?
How Information Privacy is shaped by these four forces
Market, people less likely to buy products or use services from untrustworthy providers. Example is Toysrus.com, fed shopper data to a data analysis firm without notification or consent, but once word got out, customer complaints forced them to stop this.
Social, self regulation. Example is how doctors and lawyers have a code of ethics, saying what is and is not acceptable, and they help enforce this code on one another. Doctors and lawyers are expected not only to follow this code, but to make sure that their colleagues do as well. Another example is with us. I'm sure many of us, if we really wanted to, could figure out how to read each other's email. Why don't we? Socially unacceptable, and if caught, would face serious social sanctions.
Legal, a Federal order prohibits US government web sites from using cookies. Another legal force is the idea of treating personal information as property owned by individuals that can be sold only by that individual.
Technology, IE6 and cookies, empowers end-users with greater control over what information about them is collected and when
But Asymmetry impedes these forces
With respect to Market, asymmetry makes it harder for people to make informed choices. Because people didn't know how Toysrus was using their info, or that Toysrus was actually misusing their info, they couldn't make a better choice.
With respect to Social, asymmetry makes it hard to apply social sanctions to people that violate privacy. So, if people don't know that their doctor or lawyer is violating their confidentiality, or if people don't know that others are reading their email, they can't do anything about it.
With respect to Legal, asymmetry makes it hard to pass and enforce new privacy laws. For example, one reason why personal information isn't being treated as property is because it's hard to detect cheaters.
A given design can make it easier or harder to apply these forces.
One example is with cyberstalking. There was a recent story about a stalker that bought information about where a woman worked, and then went there and killed her. One can imagine a different architecture, one that notifies people that someone is asking for their information. That person can then choose what to do, and can also apply social forces or even legal forces to help address the problem.
Different ubicomp designs also have different levels of asymmetry.
So the question here is, what kind of ubicomp design should we have?
Another example is with us. I'm sure many of us, if we really wanted to, could figure out how to read each other's email. Why don't we? Socially unacceptable, and if caught, would face serious social sanctions.
Why are asymmetrical information flows a problem?
Examples of how these forces are used to help manage privacy but are inhibited because of technology
If the technology is not built right, it makes it harder to do certain things
Ubicomp often exhibits asymmetry, in fact sometimes no information
Technology influencing privacy (Phil Agre too?)
Yahoo IP and France
Technology influences Legal, Market, and Social forces
Depending on how they are built, legal forces can be enforced (Napster ex) or cannot (ex. Gnutella or Freenet)
Who the users are
For example, Napster and Gnutella, the way they were built influences what can and cannot be done
We want to build it so that these other forces can be brought to bear
E911 example for ubicomp asymmetry
Low barrier to entry
Easy to deploy, easy to program
Talk about why we are not handling the pessimistic case, rather than just saying we're not doing it: it's nearly impossible if no one cares
Online community for social privacy?
When building and deploying ubicomp apps, there are at least two different cases.
First one is pessimistic case…Second is optimistic case.
Research Idea: Getting things to talk with one another
Big stick principle, possession is 9/10s of the law, he who has physical access has control
These objects have rotating codes, rotating barcodes for example
Codes last for variable time, ex. 1 hour, 24 hours, month, year, forever
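The rotating-code idea can be sketched as an HMAC of the current time bucket, so a code captured now is useless once its bucket expires. This is a hypothetical illustration, not from the talk; the key, lifetimes, and code length are made up.

```python
# Hypothetical sketch of rotating codes: an object's visible code is an
# HMAC of the current time bucket, so it changes every lifetime period.
import hashlib
import hmac
import time

def rotating_code(secret, lifetime_seconds, now=None):
    # Integer time bucket; all moments within one lifetime share a bucket.
    bucket = int((time.time() if now is None else now) // lifetime_seconds)
    return hmac.new(secret, str(bucket).encode(), hashlib.sha256).hexdigest()[:8]

key = b"object-secret"
hourly = rotating_code(key, 3600, now=1_000_000)
assert rotating_code(key, 3600, now=1_000_100) == hourly   # same hour, same code
assert rotating_code(key, 3600, now=1_003_600) != hourly   # next hour, new code
```

A longer lifetime (24 hours, a month, forever) just means a larger bucket size, trading linkability for convenience.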
Come back to these on evaluation
Make all arrows go up
Alice and Bob example or firefighter example
Use a generic version of this slide for architectural overview, more clouds, data moving between spaces
Information flows from sensors to spaces to apps
For example, in emergency response scenario, xxx
Privacy techniques implemented as operators
Who controls the ontology of data formats?
Decentralization of data is a key design decision, makes some apps harder to implement actually, but important for privacy
You know where your data lives, it belongs with you
Roughly speaking, can consider it as an Object with properties that can be queried
Default value for all properties is UNKNOWN
Sources
History of tuple (where it came from, where it has flowed)
Operators can chain together
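A minimal sketch of these ideas, assuming nothing about Confab's real API: a context tuple whose properties default to UNKNOWN, which carries its own history of where it came from and has flowed, and which passes through a chain of privacy operators.

```python
# Hypothetical sketch (not the actual Confab API) of a context tuple
# with UNKNOWN defaults, a flow history, and chainable operators.
UNKNOWN = object()  # sentinel for "no value disclosed"

class ContextTuple:
    def __init__(self, **props):
        self.props = props
        self.history = []  # where this tuple came from and has flowed

    def get(self, name):
        # Any property not explicitly set is UNKNOWN by default.
        return self.props.get(name, UNKNOWN)

    def log(self, hop):
        self.history.append(hop)
        return self

def chain(*operators):
    """Compose operators so each one's output feeds the next."""
    def run(tup):
        for op in operators:
            tup = op(tup)
        return tup
    return run

# Two toy operators: strip the user's name, coarsen location to city level.
def anonymize(tup):
    tup.props.pop("name", None)
    return tup.log("anonymize")

def coarsen_location(tup):
    if "location" in tup.props:
        tup.props["location"] = tup.props["location"].split(",")[-1].strip()
    return tup.log("coarsen")

t = ContextTuple(name="Alice", location="5th Ave, Paris")
out = chain(anonymize, coarsen_location)(t)
assert out.get("name") is UNKNOWN        # name no longer disclosed
assert out.props["location"] == "Paris"  # only city-level fidelity remains
assert out.history == ["anonymize", "coarsen"]
```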
What do these have in common?
Reducing quality of information out
Increasing quantity and quality of information
Scoping and control over flow of information
Not just prevention, but also avoidance and detection (transparency of system)
Tourguide example again here
Address in Paris, Neighborhood, Paris
Why is it still useful to Alice?
Alice gets her exact location (put this on the original Alice / Bob example)
Bob gets what neighborhood she is in, what events etc, tied to Alice
Carol gets aggregated data
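The three fidelities for Alice, Bob, and Carol can be sketched as follows; the place names, granularity levels, and data shapes are illustrative, not Confab's actual ontology.

```python
# Hypothetical sketch of per-recipient location fidelity.
from collections import Counter

def degrade(location, level):
    """location is (address, neighborhood, city); disclose only as much
    as the recipient's fidelity level permits."""
    address, neighborhood, city = location
    return {"address": address, "neighborhood": neighborhood, "city": city}[level]

def aggregate(locations):
    """Carol, the tour operator, sees only counts per city, never individuals."""
    return Counter(city for (_, _, city) in locations)

alice = ("12 Rue de Rivoli", "Le Marais", "Paris")
assert degrade(alice, "address") == "12 Rue de Rivoli"  # Alice: her exact location
assert degrade(alice, "neighborhood") == "Le Marais"    # Bob: neighborhood only
assert degrade(alice, "city") == "Paris"                # lowest fidelity: city only
assert aggregate([alice, ("8 Rue Oberkampf", "Oberkampf", "Paris")])["Paris"] == 2
```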
Similar DRM techniques
Backup slide, DRM is one piece, here’s the rest of it (and define DRM)
Sort of mimic cell phones, someone asks "where are you?"
Enforcing and verifying location harder, going to need hardware support to do this
Insight: I know they should be sending this data to me, but they aren't, they should have deleted this, but they didn't, go to court
Inference techniques to highlight unusual entries
Highlight new interesting research areas
Information overload problems
Can seed with fake people, to make sure that other InfoSpaces are doing the right thing
Set the bar high
Eliminate excuses service providers have for not doing this by making it easy
Legally required
Defaults
Logging
Even if they collude, the FTC can detect it
Pessimistic case? Bob and Carol can collude
Can seed with fake data
Combination with watchdog
Techniques like watchdog agencies, consumer reports
Also have to do this for every instance of Carol
Isn't this easy to defeat?
Fake data to detect violators
If Carol detects that she should not have the Tuple, then deletes it
If Carol detects that Bob should not have had it either, then notifies Alice
An example of peers checking each other, like doctors and lawyers (code of ethics, honor code)
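Seeding with fake data can be sketched like this (a hypothetical illustration, not Confab code): a decoy identity exists nowhere else, so any decoy that later shows up in someone else's data set proves a leak.

```python
# Hypothetical sketch of seeding an InfoSpace with fake people to
# detect violators.
import secrets

def make_decoys(n):
    # Random identities that were never given to anyone legitimately.
    return {f"decoy-{secrets.token_hex(4)}" for _ in range(n)}

def leaked(decoys, observed_records):
    """Return the planted identities that showed up somewhere they shouldn't."""
    return decoys & set(observed_records)

decoys = make_decoys(3)
# Simulate Carol's data set containing one of our planted identities.
carols_data = ["alice", "bob", next(iter(decoys))]
assert len(leaked(decoys, carols_data)) == 1  # violation detected
```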
Confab Core
12 KLOC (10KLOC) / 13 KComments
261 classes (200)
Libraries (Shrew / parts of Guirlib used)
6500 LOC / 10000 LOComments
148 classes
Describe person finder better, maybe AT&T cell phone friends service, m-friends, m-life
http://www.instantmessagingplanet.com/wireless/article.php/1454981
Two mMode users can find one another, too, using the system's locator-based feature. At the demo, the wireless network located our location to within a quarter mile of our actual location. Privacy is a top priority with AT&T Wireless, too -- a user can become "invisible," so that no one can determine his or her physical location.
http://www.wirelessnewsfactor.com/perl/story/18391.html
There You Are
Find Friends is available as part of the AT&T Wireless mMode service. Company spokesperson Danielle Perry told Wireless NewsFactor users receive notification that someone is trying to locate them, and parties can then call, send a text message, or arrange a meeting using location-based information.
Perry said users of the service could prevent being located by turning their phones off or using an "invisible" setting. Customers can also change their lists or "revoke a friend," Perry added.
Need much better reason for not using real sensors
Need a good story
Want the first prototype to be on the right track before spending the time and effort to build and deploy
Future work is deploying
Another reason is the field's focus on sensors
We built a basic prototype of this service
Process of evaluating it with firefighters and end-users and iterating on the design
Show the InfoSpaces, the operators, the data flowing
Get rid of service descriptions
Teach me the right way of doing HCI
Field study Iterative design picture
First step is field studies, we rode out on truck
We did this first before designing the app
Look at Anind's paper on middleware for roadmap
2 slides with details
How easy?
LOC for apps
Now others are creating apps
Other apps by other people (Jeff Heer and Alan Newberger)
C++ PDA Suite of Emergency Response apps
And running apps by users
Spend more time on this, more details
How will I do this? This is a reasonable plan, more details
My work makes this long-term evaluation possible
We don't have to satisfy the fundamentalists to succeed, we should target the pragmatists and the unconcerned instead
http://www.aicpa.org/download/webtrust/priv_rpt_21mar02.pdf
Note that this data is from 2001, actually a spike in 2002 in fundamentalists to ~33%, not clear if that is short-term or long-term
Also, not clear how this might apply to ubicomp, since this is really about commerce and government
ie this is a best guess
So what is the right level of asymmetry?
Well, it depends on the task, the people, and the application
Some applications would require very little asymmetry (ecommerce), while others would have lots (national security)
Also depends on audience, given that largest segment is the pragmatists, most likely need to satisfy those
http://energycommerce.house.gov/107/hearings/05082001Hearing209/Westin309.htm
Committee Hearing
The House Committee on Energy and Commerce
W.J. "Billy" Tauzin, Chairman
Opinion Surveys: What Consumers Have To Say About Information Privacy
Subcommittee on Commerce, Trade, and Consumer Protection, May 8, 2001, 3:00 PM, 2123 Rayburn House Office Building
We don’t have the tried and true theories of other fields. Can’t follow a book, build it, and expect it to work.
The best tools let you enter a design, create an interactive prototype, & support testing it
This is the philosophy that our work supports!
Here are some other areas I'd like to work in
Future research plans
Recycle from research summary, keep moving in these area
Eval
Design methodologies, design patterns for other domains
How patterns work
Toolkits
Design tools
Emergency Response
Evaluation
SATIN, WebQuilt, Denim, Book (I've got breadth)
Here's an example, not a real UI
Mockup of a real UI, learns?
Overload?
Part of this insight came from limitations of preventions.
Other part of this insight came from observation that technology cannot manage privacy alone, and avoidance and detection would be more useful for other forces
Email example
Outside of administrative domain == no control
Two problems
Prevention only goes so far
Prevention doesn’t provide transparency
Privacy is primarily about trust
Security is primarily about prevention on access
Email example, send it to a person, what assurances are there?
Eventually, they will need to see the data to provide their service to you
When I first started, I thought, yeah, we can solve privacy
Trusted Computing Base, DRM, but unlikely, outside of administrative domain == no control
Secure Mobile Code, unlikely
Encryption, still need to decrypt the data to use it to provide you my service
Insight: rather than focusing on prevention, focus more on avoidance and detection mechanisms
Similar to how Western governments work, transparency is key here
Greater data collection => greater oversight and transparency
We as researchers should focus more on avoidance and detection mechanisms
Make it really easy, eliminate excuses
Defaults are set for high privacy
Also, early in ubicomp, great opportunity to help establish norms of usage. By setting people's expectations high, make it easier for legal and social and market forces to come into play.
Peer enforcement
Minimizing quality and quantity
Two insights:
Privacy will not be solved, instead managed
Solve via market, legal, social forces in addition to technology
Less prevention, more on avoidance and detection (transparency) makes it easier to apply these forces
What’s nice about this is it gives a systematic way of thinking about privacy in systems
Prevent, Avoid, and Detect inspired by Dijkstra's deadlock handling strategies (prevention, avoidance, detection)
Also related to Schneier’s outlook on security, need plans before, during, and after a break-in
Privacy mention
InfoSpace image with techniques and operators modifying flow
List of operators on side if possible
Real issue with privacy is one of control, feedback, and understanding
The origins of ubiquitous computing research at PARC in the late 1980s
by M. Weiser, R. Gold, and J. S. Brown
http://www.research.ibm.com/journal/sj/384/weiser.html
Decouples sources from sinks
Versus RPC
Level of indirection
Asynchronous and connectionless helps scale
Communication between sources and sinks separated in space and time
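The decoupling can be sketched with a toy event channel (not Confab's actual implementation): the source publishes without knowing who, or whether anyone, is listening, and late sinks can still receive buffered events.

```python
# Minimal sketch of an event channel that separates sources from sinks
# in space (neither names the other) and time (events are buffered).
class Channel:
    def __init__(self):
        self.queue = []        # buffered events: separation in time
        self.subscribers = []  # added or removed freely: separation in space

    def publish(self, event):
        self.queue.append(event)
        for deliver in self.subscribers:
            deliver(event)

    def subscribe(self, deliver):
        self.subscribers.append(deliver)
        # Late subscribers replay what they missed.
        for event in self.queue:
            deliver(event)

chan = Channel()
chan.publish("alice@room-525")  # source publishes before any sink exists
seen = []
chan.subscribe(seen.append)     # sink attaches later, still gets the event
assert seen == ["alice@room-525"]
```

Contrast with RPC, where the caller must name the callee and both must be running at the same time.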
Ruple diagram
“You know it when you lose it”
David Flaherty, reminiscent of Justice Potter Stewart's attempt to define obscenity: "I know it when I see it."
Efficiency quote Vance Packard, NYTimes Magazine article
Kant describes people as ends in themselves, rather than means towards an end
The Trackable Society, Arnold Kling
http://www.techcentralstation.com/1051/techwrapper.jsp?PID=1051-250&CID=1051-102102E
Gary Marx, not Karl Marx
Privacy and Technology
http://web.mit.edu/gtmarx/www/privantt.html
Protecting Electronic Health Information
Committee on Maintaining Privacy and Security in Health Care Applications of the National Information Infrastructure, National Research Council (Washington, DC 1997)
In his book Privacy and Social Freedom, Dr. Schoeman argues against the idea that the realm of privacy is or should be a domain in which people are free from social institutions and control. On his view, social norms pervade all aspects of life, and a person's ability to achieve his or her goals depends on these norms and institutions.
Low barrier to entry
Easy to deploy, easy to program
Privacy
Scoping and control over flow of information
Aune, R Kelly; Waters, Linda L
Cultural differences in deception: Motivations to deceive in Samoans and North Americans.
International Journal of Intercultural Relations, 1994, 18, 2, spring, 159-172
Examines variance in motivations for deception arising from cultural differences inherent in collectivistic & individualistic cultures, using questionnaire data collected in North America & American Samoa (N = 41 respondents [Rs] in each). The more collectivistic Samoan participants indicated they would be more likely to attempt to deceive another when the deception was related to group/family or authority-based concerns. US Americans indicated they would be more likely to lie to protect their privacy or the feelings of the target person. 1 Table, 1 Appendix, 19 References. Adapted from the source document.
cost of buy-in high (barrier to entry)
complexity high
AI approach, cost-to-benefit
kitchen-sink
problems it solves?
Lee Rainie testimony, Pew Internet
http://energycommerce.house.gov/107/hearings/05082001Hearing209/Rainie308.htm
Sarah Spiekermann, Jens Grossklags and Bettina Berendt (2001) E-privacy in 2nd generation E-Commerce: privacy preferences versus actual behavior, in: Proceedings of ACM EC'01: Third ACM Conference on Electronic Commerce, Association for Computing Machinery, Tampa, Florida, US, pp. 38-47. (Paper Acceptance rate < 20%)