I’m Katie Shilton, assistant professor at the University of Maryland. I'm talking today about *participatory* personal data – what it is, its relationship to mediated social participation, and some of the challenges and concerns it raises.
Broadly, I'm interested in emergent types of personal data and in new data collection practices and technologies: familiar data tools like Facebook, Twitter, and Google services, as well as new data collection applications built by amateur enthusiasts and small research labs. My recent work investigates the design of emerging technological platforms that allow people to collect, analyze, and make decisions based on data about themselves and their communities – participation with data, or “participatory personal data.” Collecting this new kind of personal data is enabled by increasingly available personal devices, including mobile phones and tablets. These devices create a sensor network that allows individuals and groups to gather, analyze, and report on data about their environment and lives. Mobile phones have billions of users, are familiar tools, can collect and upload images, audio, and location data, and can connect to other types of sensors using Bluetooth and similar protocols. Mobile phones can be used for systematic data gathering by one person or by thousands – a new kind of participatory research.
An example for context: the Boyle Heights development project, in a mixed-income neighborhood in Los Angeles. It was initiated by a coalition of community groups working on a community needs assessment. Community organizations recruited residents to carry phones running specialized software. Participants used a phone app that delivered location-based surveys: questions popped up based on location and asked about the neighborhood, daily routes, and work, home, and school environments. The app collected location traces, and users collected images. The image here shows the data entry screen and a map of a location trace. The groups involved in the project are using the data to contribute to a larger community development plan.
A different example: individual health monitoring tools – mobile health, or mHealth. These use the mobile phone to send and record targeted, location- and time-based experience samples. They are used by patients to manage chronic illness, or by researchers to track interventions and outcomes. Samples can be user-initiated (for example, if you remember to enter one while you’re eating) or prompted (if you haven’t entered one in a few hours). For instance, a sleep study might use the phone’s alarm to detect when you wake and send a sleep survey. A food survey might be triggered when you enter a restaurant, or when you arrive home after your commute. Specialized software helps analyze the data (meals, locations, and self-entered data like moods, stress levels, and sleep quality). Patterns in the data might help adjust medication, plan relaxation techniques or exercise, or plan interventions like a text message from a buddy or self-recorded reminders. To do this, participants collect a GPS location trace, text data, and images of food or the people you are with. Data might be shared with clinicians, coaches, medical researchers, family, or friends.
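To make the prompted-versus-user-initiated distinction concrete, here is a minimal sketch of how trigger rules for experience sampling might look. The event names, survey names, and the four-hour threshold are all hypothetical illustrations, not the actual CENS or mHealth software:

```python
from datetime import datetime, timedelta

def should_prompt(event, last_entry_time, now):
    """Decide whether to push an experience-sampling survey.

    Returns the name of the survey to prompt for, or None.
    All event names and thresholds are illustrative assumptions.
    """
    if event == "alarm_dismissed":      # phone's wake-up alarm went off
        return "sleep_survey"
    if event == "entered_restaurant":   # e.g. from a geofence match
        return "food_survey"
    # Fallback rule: re-prompt if the participant hasn't entered
    # anything in the last few hours.
    if now - last_entry_time > timedelta(hours=4):
        return "general_checkin"
    return None

now = datetime(2012, 8, 1, 12, 0)
print(should_prompt("alarm_dismissed", now, now))               # sleep_survey
print(should_prompt("idle", now - timedelta(hours=5), now))     # general_checkin
```

User-initiated samples would simply bypass `should_prompt` and write an entry directly; the rules above only cover the prompted case.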
As these examples suggest, participatory data applications can capture extremely useful, but also extremely sensitive, information. A recent response to a white paper we published put it nicely, I think. An EPA employee responded to a listserv posting with: “…[Participatory sensing] is either terrifying or electrifying, depending on who does it and how the data are processed….” And she’s exactly right. The two huge “ifs” in mobile sensing are the who – someone you trust? The government? Advertisers? Doctors? University researchers? – and the how is it processed? What can be inferred from this data?
What differentiates participation in data collection and use from surveillance? Or: when does recording data about yourself and your community shift from social participation to surveillance? I suggest at least four conditions (drawn from the surveillance studies literature) that distinguish participatory data collection from surveillance. The first is making decisions about when and how to collect data – consent. Do individuals know they are participating, and do they know what that means? The challenge: what does “informed consent” mean when data collection and devices are so close to our everyday lives? The second is having access to our own data and controlling its flow – privacy. Who can see the data, and when? How much can participants hide? The third is preventing people with power over you – corporations, law enforcement – from using your data against you – power and equity. Collecting and analyzing data about others is a source of power in an information economy. Who benefits? Will your data be used in hiring decisions, insurance decisions, product pricing decisions? The fourth is being able to wipe your slate clean, or begin again – forgetting. What does total memory mean for accountability for our actions and routines?
There is no question that continuously recording location and activities is invasive, and there are few or no laws about how to collect and protect such data. This is a question of social values: what moral, human values are important here? The conditions I named – privacy, consent, equity, forgetting – are all values that can guide design. There is a larger movement of people trying to figure out what values are, or should be, designed in, and my dissertation work is part of this trajectory. These social and ethical challenges led me to want to know: how do they affect technology development? What makes social values a part of the design discussion?
My dissertation project was an ethnographic investigation of social values in the design of technology at CENS, a center devoted to developing sensor networks and engaged in the design of software and infrastructure for collecting data about people using phones. I spent two years as a participant observer in this lab, taking fieldnotes, conducting interviews, and recording impressions of design practices. I had access to CENS because I worked there – I was invited to “deal with privacy” – so this was participant observation with a focus on “participant.” I began by getting oriented to CENS design: roughly 30 researchers – faculty, staff, students – with at least one full-group meeting most weeks, two or three smaller project-driven meetings, and multiple ad-hoc meetings per day, where people gathered to hash out problems.
I looked for activities that encouraged designers to pay attention to, talk about, and design for privacy, consent, equity, and forgetting. I named these activities “values levers,” because they pried open new conversations about social concerns and helped build consensus around social values as design criteria. As I looked for values levers, I noticed their position within design: some were activities inherent to design; others were introduced by agents such as leaders or the institution.
Some levers were already part of design – inherent to the setting. My first a-ha moment came in meetings with statisticians: their comments and interests continually referred designers back to issues inherent in the data. These were mathematical discussions, but also ethical debates about data representation, sharing, and security. When designers work across disciplines, they need a common reference point, and data is the lingua franca of design. Working on interdisciplinary teams was thus a values lever. Another lever related to data was funding: bigger teams meant more interdisciplinary conversations, more hired programmers, and more attention to data. Another lever was experiencing internal testing. When developers tracked themselves, they started to notice privacy, consent, and equity challenges.
Internal pilots were common practice at CENS, meant to test for bugs and usability. In interviews, designers expressed hesitation to test sleep or food studies when peers were analyzing the data. Laboratory pilot tests were a values lever. [QUOTES] One evocative example was a border crossing by the director – a super secret quick route. She sent it to me: “you have to use this!” It is evocative of self-testing, and of the things designers might not want to share. Self-testing, like interdisciplinary teams, fosters a focus on the data. The kinds of data under request made surveillance and privacy concerns concrete.
Some levers were introduced deliberately – not accidental. For example: user feedback. CENS leaders made an explicit decision to involve users. User concerns about privacy and consent carried lots of weight – the postdoc and students who worked with health pilots, for example, took user privacy concerns about medical information very seriously. But users express privacy concerns in contexts where you’d expect them (medical); user concerns are contextual. Sometimes indifference carried too much weight: local campus users had few concerns, and students took this to mean “no one cared about privacy.” User feedback was a less explicit lever, and so less effective. More effective were mandates imposed by UCLA’s IRB, and values advocacy by leadership, which normalized values conversations at CENS.
Unsurprisingly, leader advocacy can play a huge part in helping successful levers flourish. The three main participatory sensing leaders at CENS took privacy, consent, equity, and forgetting very seriously. They would raise these concerns during meetings, when reviewing which projects would go forward, and so on. But they also set lab procedures that functioned as values levers. Leadership mandated an online form for all data collection and assigned an administrator to enforce it; the report is filled out whenever someone requests phones. These procedures formalize anti-surveillance values, make clear that values are important and taken seriously, and help advocates and mentors look out for possible conflicts.
Analyzing my field notes showed that the process of moving from relatively abstract social values to design decisions is complex. At CENS it had roughly three phases. First: agreeing on the values of importance – this is the role of the values levers. Then: operationalizing values as design principles – targeted, concrete goals for the system. Then: operationalizing those principles as tech features and affordances. Each step is a cooperative process of translation between lab leadership and the development team.
Values levers made discussion of values – privacy, consent, equity, forgetting – part of the daily work of design. Translating those values into design principles and then into features created some values-based technology solutions. One example solution was the Personal Data Vault – a secure repository for all data collected with personal devices. The PDV is a kind of bank account for your data – a place to keep it and monitor it over time. Data goes there first; participants can then use a set of sophisticated filters to share subsets of the data with third parties. This idea is popping up in several places – most recently, in the commercial start-up Personal.com.
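To illustrate the vault-plus-filters architecture, here is a minimal sketch of the idea: data lands in the vault first, and the owner attaches per-recipient filters that decide which subset of records a third party may see. The class name, record fields, and filter interface are illustrative assumptions, not the actual CENS prototype:

```python
class PersonalDataVault:
    """Sketch of a data vault: all data stays local; sharing is filtered."""

    def __init__(self):
        self.records = []   # all raw data stays here
        self.filters = {}   # recipient -> predicate function

    def deposit(self, record):
        self.records.append(record)

    def grant(self, recipient, predicate):
        """Allow `recipient` to see only records matching `predicate`."""
        self.filters[recipient] = predicate

    def share_with(self, recipient):
        predicate = self.filters.get(recipient)
        if predicate is None:
            return []       # default deny: no filter, no data
        return [r for r in self.records if predicate(r)]

vault = PersonalDataVault()
vault.deposit({"type": "location", "place": "home"})
vault.deposit({"type": "mood", "value": "calm"})

# A clinician gets mood entries only; location traces stay in the vault.
vault.grant("clinician", lambda r: r["type"] == "mood")
print(vault.share_with("clinician"))   # [{'type': 'mood', 'value': 'calm'}]
print(vault.share_with("advertiser"))  # [] - no grant, no access
```

The design choice worth noting is the default-deny rule: a third party with no explicit grant sees nothing, which mirrors the anti-surveillance values the levers surfaced.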
A values-based design process can inspire technical creativity, but more than technical solutions are necessary to protect participatory personal data. There is a real need for new legislation to protect personal data flowing through the mobile data ecosystem. Data collected by mobile apps and phone companies is not protected under current law. Even health data isn’t covered by HIPAA, because it isn’t created by medical providers. And if data vaults are going to be part of the solution, they are going to become real honeypots; we need to protect this data from subpoena. There is precedent for protective laws that encourage participation. For example, the Genetic Information Nondiscrimination Act encourages people to participate in genomics projects by protecting them against discrimination based on their genetic data. Perhaps we should have similar protections for personal data. The Federal Trade Commission is currently starting a multi-stakeholder process to discuss privacy for mobile application data. I’ll be watching and participating.
Both of these issues – technical and policy solutions – emphasize the need to understand an entire data ecosystem. There are dozens of players in this space: mobile app start-ups, telecom providers, regulators, users. I am beginning an ecosystem study of start-ups, telecommunications companies, federal regulators, and users: interviews and participant observation focused on privacy challenges and decision-making among these groups. In particular, I am investigating when privacy concerns hinder new innovations, and when they inspire new creativity or new innovations. There is also a need for research on users in this space. What are their data collection practices, how long do they participate, and what are their motivations for doing so? What are their values and concerns? I am conducting a survey of privacy concerns in context, and with colleagues Beth St. Jean and Brian Butler, beginning interviews to study motivations for participation. But lots more is needed.
For researchers of the sociotechnical – those interested in technologically mediated social participation – there are big questions at the intersection of new technological trajectories, such as mobile data collection, and the social changes they may engender. There’s a lot of room to investigate and understand how these technologies affect users as well as institutions, and how we can best design technologies to reflect fairness and social justice. And researchers can be part of design. We bring understandings built on professional ethics and research ethics – many in the humanities and social sciences have a real values tradition. Defining and operationalizing these values can be a skill set for design.
Katie Shilton, "Participatory Personal Data"
PARTICIPATORY PERSONAL DATA
MEDIATED PARTICIPATION & SOCIAL CONCERNS
2012 Summer Social Webshop
Katie Shilton
Assistant Professor
University of Maryland
Example: Mobile Health
Data collection for participant self-care, clinical care, & research
http://openmhealth.org/
Social Concerns
As an Environmental Protection Agency (EPA) employee put it:
…[Participatory sensing] is either terrifying or electrifying, depending on who does it and how the data are processed….
Surveillance or Participation?
What differentiates participation with data & surveillance?
• Making decisions about when, how to collect (consent)
• Accessing data and controlling its flow (privacy)
• Preventing powerful corporations or law enforcement from using it against you (equity)
• Being able to have a clean slate (forgetting)
Study: Designing for Values
Ethnography, action research at the Center for Embedded Networked Sensing (CENS)
Qualitative study of design:
• Participant observation
• Interviews
• Document analysis
Findings: Values Levers
Activities that build consensus around social values as design criteria
• Some levers are activities already present within design;
• Others introduced to design by leaders, advocates.
Experiencing Internal Testing
Grad student G: If I’m going somewhere and I don’t want anyone to know, I think: I need to turn off this application, or leave my cell phone…
Grad student L: I did data collection for T., like for a week or something. Then I felt like, not privacy ... but I felt that I wanted to go out more actively. I felt like: oh, they are watching me, I need to be more active.
Leader Advocacy
Advocacy during discussions, setting project directions, etc.
But also procedures:
• Participatory sensing data collection form
• Administrative support
Fitting Values Into Design
Agree on Values → Operationalize Design Principles → Operationalize Tech Features
• Values: Privacy, Consent, Equity, Forgetting
• Design principles: Local control, Legibility, Long-term engagement, Parsimony
• Tech features: Architecture, UI design, Affordances, Data retention procedures
Values-Based Solutions: Personal Data Vaults
Secure storage: like a bank account for data
Prototypes built at CENS; Stanford; AT&T Labs
Commercial products emerging (e.g. Personal.com)
Values-Based Solutions: Policy
Need for regulation in mobile data ecosystem
• No legislation protects this data – even health data!
Need to protect data vaults
Protection can encourage participation
Future Work: The Mobile Data Ecosystem
Ecosystem study of start-ups, telecoms, regulators, users
Focus on privacy as a lever in decision-making:
• When does it hinder design and policy?
• When does it inspire new creativity?
Better understanding users:
• Survey of privacy concerns in context
• Study of participation motivations & practices, values & concerns
Takeaways
Big questions at the intersection of:
• technological trajectories (mobile data collection)
• social change (values, ethics, justice)
We need to understand:
• how these techs affect users, institutions
• how to design socially just technologies
And we can be part of design:
• Professional ethics, research ethics = values tradition
• Defining and operationalizing values = a skill set for design
Thank you!
email@example.com
Copyright Bill Watterson, 1991