New digital welfare (2011-2015)
Welfare technology is vital to future welfare services. In recent years, the public sector has invested in welfare technology and is now in a strong position to exploit IT and new technology more intensively to modernize and optimize public services such as schools, the health service and eldercare. Providing good service does not necessarily require a face-to-face meeting and, in many cases, digital solutions can provide citizens with a more modern and effective service.
Privacy & Digital Identity
Anja Bechmann, Assoc. Professor & Head of Digital Footprints Research Group, AU
Venue: Invited speaker for internal workshop at Danish Agency for Digitisation, Copenhagen, May 2, 2013
• 1. Who am I, and how do we investigate digital footprints?
• 2. Lessons learned: what is personal and what should be protected? How?
• 3. Danish Agency for Digitisation projects - discussion
• Digital Footprints is a research group (at the Centre for Internet Research, Aarhus University, Denmark) interested in the data that users share, expose or trade when communicating through the internet. The research group is dedicated to collecting, analyzing and understanding digital footprints: the character of the footprints, the context(s) they form and in which they are given, and the purpose of the individual/group in sharing, exposing and trading data.
My API Research Projects
• The Social Library (Danish Agency for Culture in collaboration with REDIA)
• Trust and Privacy on Facebook in DK & IT (in collaboration with the Data-Mining group in Italy led by Matteo Magnani)
• Personal data sharing on Facebook among high school/college students (18-20) in DK & US (DigiHumLab Denmark)
• In preparation (unfinanced): Optimizing digital identity online
“Ethno-mining is unique in its integration of ethnographic and data mining techniques. This integration is carried out in iterative loops between the formation of interpretations of the data and the development of processes for validating those interpretations. (…) There are two key characteristics of the iterative loops in ethno-mining. First, they can be separated into three categories based on the amount of a priori knowledge used to find and validate interpretations of the data. Second, the results of the iterative loops are frequently, although not exclusively, represented in visualizations. Visualizations have two basic affordances: they can represent both quantitative and qualitative analyses, and they exploit the visual system to support more comprehensive data analysis, particularly pattern finding and outlier detection. (…) Our method seeks to expose and explicitly address the selection biases in both qualitative and quantitative research methods by checking them against one another. Ethno-mining extends its scrutiny of these biases beyond simply comparing the biases embedded in standard qualitative and quantitative techniques. It does so by tightly integrating the techniques in loops, generating mutually informed analysis techniques with complementary sets of biases.”
“Ethno-Mining: Integrating Numbers and Words from the Ground Up” by R. Aipperspach, T. Rattenbury, A. Woodruff, K. Anderson, J. Canny, P. Aoki.
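The iterative loop the quotation describes can be illustrated with a minimal, hypothetical sketch: a quantitative pass groups informants from logged behaviour, a qualitative pass attaches ethnographic interpretations, and a validation pass surfaces contradictions to revisit in the next loop. All data, thresholds, labels and function names below are toy assumptions for illustration, not the authors' actual implementation.

```python
# Hypothetical sketch of one ethno-mining iteration.
# Quantitative pass -> qualitative interpretation -> validation,
# with contradictions feeding the next loop. Toy data throughout.
from statistics import mean

# Toy "digital footprints": posts per week logged for each informant.
posts_per_week = {"ida": 25, "lars": 3, "mette": 22, "soren": 1, "nina": 4}

# Toy ethnographic field notes for the same informants.
field_notes = {"ida": "shares openly", "mette": "calls profile private",
               "lars": "lurker", "soren": "rarely online", "nina": "lurker"}

def quantitative_pass(data):
    """Split informants into high/low sharers around the sample mean."""
    cutoff = mean(data.values())
    return {user: ("high" if v > cutoff else "low") for user, v in data.items()}

def qualitative_pass(clusters, notes):
    """Pair each quantitative cluster label with the matching field note."""
    return {user: (cluster, notes.get(user, "")) for user, cluster in clusters.items()}

def validate(interpretations):
    """Flag informants whose notes contradict the quantitative label;
    these drive the next iteration of the loop."""
    return [user for user, (cluster, note) in interpretations.items()
            if cluster == "high" and "private" in note]

clusters = quantitative_pass(posts_per_week)
interp = qualitative_pass(clusters, field_notes)
print(validate(interp))  # informants to revisit in the next loop
```

Here a heavy sharer whose field notes call the profile "private" is flagged, mirroring how the method checks quantitative and qualitative selection biases against one another.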
Source: Bechmann, A. Internet Profiling: The economy of data intraoperability on Facebook and Google, Mediekultur (submitted).
Lessons learned: what is personal and what should be protected? How?
“It's because it's our private space [Danish: forum]. We cannot see each other every day, so this is a way of keeping up to date with each other... with things not everyone should know... if you have met a guy or if we are going to meet.”
• Not personal data (on timeline): “I only post things that I am not ashamed of [fit self-image, ed.], and then I don't care who sees it”
• Personal data: “sad things” (death) and “things that you do not want to be confronted with”, private address, face (pictures are more personal than information) and account info (in the US also drinking and religion); also, do not comment on historical data so that it resurfaces
Control and secure the digital identity
The students were very interested in seeing their own Facebook data as shown to third parties. They are annoyed by the lack of transparency in data flows to third-party apps, which makes them unable to manage their identity properly.
How does that correspond with the legislative definition?
“Personal data” shall mean any information relating to an identified or identifiable natural person (“data subject”); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity.
Informed consent: “the data subject's consent” shall mean any freely given, specific and informed indication of his wishes by which the data subject signifies his agreement to personal data relating to him being processed.
Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data
• In particular, they want to protect:
• identifiable pictures
• respect for ‘do not profile/confront me with this info' tags
• private address
The use of data
• What do they think of companies accessing data through apps (average 60 apps in the sample)?
• (Not oriented towards apps as companies - more as services or employers trying to profile them - not concerned)
• Showing them what we draw -> “it is taken out of context... it is a little bit silly now [facerape prank, ed.]”
• Things that they think are private should not be made public
• Understanding (transparency) leads to acceptance, according to the users, and when in doubt about what an app does they avoid it (unless referred to it many times through friends) (according to themselves)
• Home address
• Work address x 2
• Workplace x 2
• Job description
• Working hours x 2
• Education, interests, preferences
• -> more qualified offer for parents
• -> maybe prevent bad offers for more parents by correlating data
• Accessibility - security problems
• But what about:
• user micro-managing of the information connected to the authentication travelling with the user (NemID)
• co-existing authentication systems or integration with private companies
• How should a digital identity solution incorporate digital life in general?
Q's for you
• Should a digital identity solution (e.g. NemID) incorporate digital life in general?
• What kinds of demands for privacy do you consider in your solutions and strategies?
• How do you deal with accessibility? (platform, server, device, solution)
• How do you/we ensure that consent is in fact informed when we know from statistics that people do not read the text/permissions? (privacy paradox)
• How do you make solutions sensitive to the needs and everyday life of the user?
THANK YOU.
Anja Bechmann, Assoc. Professor
Head of Digital Footprints Research Group, AU
+45 5133 email@example.com // @anjabechmann
Digitalfootprints.dk // @digifeet
(Visiting researcher from 1.8.12-1.8.13, DIKU Copenhagen)