Automatically identifying lifestyle behaviours from SenseCam images (invited talk at the London School of Economics)

Invited talk given at London School of Economics on Friday 10th May 2013. This was part of a seminar series on "First Person Perspective Digital Ethnography".

New methods of digital data capture create new problems for analysing data. How should the sheer volume of data be stored, searched and analysed? How can multiple types of first person perspective data (e.g., video, audio, location, movement, eye-tracking, biosensor) be integrated and analysed? What software platforms currently support such a diversity of data and perspectives?

Presentation Transcript

  • When using wearable cameras in ethnography, we should analyse the data in 2 stages: 1) Identify episodes/activities 2) Categorise their behavioural type & context. British Heart Foundation Health Promotion Research Group. Aiden Doherty's talk in 1 slide...
  • Microsoft. Early career in computer science...
  • SenseCam (Hodges, Ubicomp conf., 2006). My interest is now in public health... Something from department? Maybe Charlie Cochrane reviews? Maybe Paul WHO document? Maybe HeartStats?
  • This talk reflects the work of many...
  • The Lancet 2013, 380(9859), pp. 2095-2128. Global burden of disease... Use recent Lancet website
  • The Lancet 2013, 380(9859), pp. 2095-2128. Main diseases are lifestyle related... Highlight that the main diseases are lifestyle related
  • Sallis 2000, Ann. Behav. Med. 22(4):294-298. Behavioural epidemiology framework... Sallis & Owen, highlight measurement
  • Troiano, Med Sci Sport Exer 2008, 40(1):181-8; Craig, NCSR 2008 – Health Survey for England. There is a big measurement problem... Percentage of adults meeting physical activity recommendations: self-report 50% vs. accelerometer 5%; self-report 38% vs. accelerometer 5%
  • Kelly, IJBNPA 2011, 8:44; Armstrong, Sports Med 2006, 36:1067–1086. Self-report has limitations... Recall error... Comprehension problems... Social desirability error...
  • SenseCam (Hodges, Ubicomp conf., 2006). Current objective measures' limitations... Show acc != behaviour slide... also GPS/GIS slides too
  • SenseCam (Hodges, Ubicomp conf., 2006). Wearable cameras identify behaviours... Video of wearable camera data
  • Mann, Computer 1997, 30:25-32; Tano, ICME conf. 2006, 649-652. Past efforts focused on hardware storage and miniaturisation...
  • Vicon Autographer. New devices are now smaller...
  • Memoto. New devices are now smaller...
  • Google Glass. Big companies like Google are now producing wearable cameras...
  • Gurrin 2013, Am J Prev Med 44(3):308-313. Smartphones too...
  • SenseCam (Hodges, Ubicomp conf., 2006). Microsoft's SenseCam is the most popular in health research...
  • Active travel; sedentary behaviour; nutrition; environment exposures; contextualising other sensors. Doherty 2013, Am J Prev Med 43(5):320–323. Wearable cameras in health...
  • Kelly, Intl J Behav Nutr Phys Act 2011, 8:44. Active travel – U.K. NTS diary...
  • Kelly, Intl J Behav Nutr Phys Act 2011, 8:44. Journey time = 20 minutes. Active travel – self-report entry...
  • Kelly, Intl J Behav Nutr Phys Act 2011, 8:44. Journey time = 12 min 38 sec. Active travel – wearable camera entry...
  • All journeys +2 min 34 sec (S.E. 32 sec); car +2 min 08 sec (S.E. 60 sec); walk +1 min 41 sec (S.E. 45 sec); bike +4 min 33 sec (S.E. 64 sec). Kelly, Intl J Behav Nutr Phys Act 2011, 8:44. Active travel – adults' self-report error... (a sketch of this paired-difference calculation appears after the transcript)
  • test. Then show Paul Journey2School... Kelly, 2012, Am J Prev Med 43(5):546–550. Self-report error in adolescents' travel...
  • test. Then show Paul Journey2School... Kerr, 2013, Am J Prev Med 44(3):290–296. Sedentary behaviour...
  • Kerr, 2013, Am J Prev Med 44(3):290–296. Sedentary behaviour... Table 2: Minutes of coded sedentary posture from Microsoft's SenseCam by activity category (cpm = accelerometer counts per minute).
    Image code: Minutes | % time in accelerometer cpm <100 | Interquartile range of cpm | Mean cpm
    Sports: 0 | — | 1525–2900 | 2330
    Self care: 85 | 60 | 2–361 | 284
    Manual labor: 202 | 35 | 41–669 | 432
    Conditioning exercise: 230 | 21 | 85–1405 | 1262
    Household activity: 244 | 58 | 10–305 | 260
    Riding in other vehicle: 409 | 82 | 0–50 | 90
    Leisure: 428 | 81 | 0–73 | 91
    Riding in car: 4,653 | 74 | 8–103 | 101
    Eating: 5,250 | 92 | 0–4 | 54
    TV watching: 5,407 | 89 | 0–11 | 46
    Administrative activity: 9,546 | 92 | 0–10 | 55
    Other screen use: 22,881 | 93 | 0–0 | 34
    The accelerometer 100 cpm cut-point correctly identified "sitting" 90% of the time; however, when the SenseCam image indicated "standing without movement" or "standing with movement", the accelerometer recorded <100 cpm 72% and 35% of the time, respectively. Eleven percent of SenseCam bicycling had <100 cpm.
  • Arab, Eur J Clin Nutr 2011, 65(10):1156–62; O'Loughlin 2013, Am J Prev Med 44(3):297–301. Nutrition...
  • O'Loughlin 2013, Am J Prev Med 44(3):297–301. Nutrition... Figure 2: Comparison of energy intake (kcal) using two assessment methods, diary alone vs. diary with Microsoft SenseCam, for trainee jockeys, Gaelic footballers and university students (*p≤0.01, **p≤0.001)
  • O'Loughlin 2013, Am J Prev Med 44(3):297–301. Nutrition... Table 1: Mean difference in energy intake between the two dietary assessment methods (M±SD): trainee jockeys 282±164 kcal (10.7%)**; Gaelic footballers 591±304 kcal (17.7%)**; university students 250±216 kcal (10.1%)*. *p≤0.01, **p≤0.001
  • Arab, Eur J Clin Nutr 2011, 65(10):1156–62; O'Loughlin 2013, Am J Prev Med 44(3):297–301. Assessment of environment features that individuals may encounter during walking or cycling... Figure 1: Sample images and exemplar coding of features present (e.g. cars driving, pedestrians, pedestrian crossing, rain, road good condition, trees; cycle lane, dark, other lights; trees, walkway; congested traffic, footpath, grass verge, residential, retail buildings). Data were collected in Auckland, New Zealand, in June 2011.
  • Oliver, 2013, Int J Health Geographics 12(20). Assessment of environment features... Table 1: Description of environmental features present in walking and cycling journeys (n = 599 cycling images, 1,150 walking images, 1,749 total; data collected in Auckland, New Zealand, in June 2011). Coded features include bus stops, cars driving and parked, commercial and retail buildings, congested traffic, cycle lanes, cyclists, darkness, dogs, footpaths and their condition, graffiti, grass verges, litter, lights, pedestrian crossings, pedestrians, obstructions to cycling or walking, rain, residential buildings, road condition, street lighting, trees and walkways; *p < 0.05, **p < 0.01 for differences between walking and cycling journeys. Example features: bus stop, cycle lanes, graffiti, grass verge, rain, street lighting, trees
  • 49 participants wore SenseCam for 3 days each; 311 accelerometer bouts were randomly selected. 79% of episodes could be coded (n=311); 14% had no associated image data (n=57), 3% unsure of activity from images (n=10), 2% images too dark to code (n=7). Doherty, Intl J Behav Nutr Phys Act 2013, 10(22). Wearable cameras contextualising accelerometer data
  • Doherty, Intl J Behav Nutr Phys Act 2013, 10(22). Wearable cameras with accelerometers
  • 311 episodes coded: 12 PA Compendium categories and 114 PA Compendium subcategories identified. 59% outdoors, 39% indoors. 33% leisure time, 33% transportation, 18% domestic, 15% occupational. 45% of episodes non-social, 33% direct social, 22% social/not-engaged. Doherty, Intl J Behav Nutr Phys Act 2013, 10(22). Wearable cameras contextualising accelerometer data
  • Doherty, Intl J Behav Nutr Phys Act 2013, 10(22). Activity type...
  • Doherty, Intl J Behav Nutr Phys Act 2013, 10(22). Activity context...
  • "… Manual coding of the images is also time-consuming and coding errors can occur …" 150 participants, 7 days of living, 2,000 images per day = 2.1 million images !!! Kerr 2013, Am J Prev Med 44(3):290-296. How to manage all these images?...
  • Framework based on organising data into events. Human memory: "… segmenting ongoing activity into distinct events is important for later memory of those activities …" (Zacks 2006, Psychology and Aging 21:466-482). Video retrieval: "… automatic shot boundary detection (SBD) is an enabling function for almost all automatic structuring of video …" (Smeaton 2010, Computer Vision and Image Understanding 114(4):411-418). Wearable accelerometers: "… for comparison with physical activity recommendations, 10-min activity bouts were defined as 10 or more consecutive minutes above the relevant threshold …" (Troiano 2008, Med Sci Sport Exer 40(1):181-8). Early lifelogging: "… continuous recordings need to be segmented into manageable units so that they can be efficiently browsed and indexed …" (Lin 2006, Proc. SPIE MM Content Analysis, Management, and Retrieval). British Heart Foundation Health Promotion Research Group. Wearable camera data management?... (see the event-segmentation sketch after the transcript)
  • Doherty 2013, IEEE Pervasive 12(1):44-47. Processing wearable camera data...
  • Doherty 2013, IEEE Pervasive 12(1):44-47. Processing wearable camera data...
  • Doherty 2013, IEEE Pervasive 12(1):44-47. Processing wearable camera data...
  • Doherty 2011, Memory 19(7):785-795. Event segmentation overview...
  • Take WIAMIS / Yahoo! Research slides showing the nuts 'n bolts of this… Doherty 2011, Memory 19(7):785-795. Event segmentation overview...
  • Data divided into training and test sets, with thousands of different combinations evaluated. From the groundtruth we noticed an average of 22 events groundtruthed per day. Approach recommended: quick segmentation (sensor values only). Performance: F1 score of 60% against users' semantic boundaries. Doherty 2011, Memory 19(7):785-795. Event segmentation – how good is it?... (see the boundary F1 sketch after the transcript)
  • Take Mediamill SVM slides showing the nuts 'n bolts of this… highlight groundtruth construction, etc. Staudenmayer 2012, MSSE 44(1):S61-67. Event identification overview...
  • Take Mediamill SVM slides showing the nuts 'n bolts of this… highlight groundtruth construction, etc. Staudenmayer 2012, MSSE 44(1):S61-67. We could use accelerometer signals...
  • Take Mediamill SVM slides showing the nuts 'n bolts of this… highlight groundtruth construction, etc. Staudenmayer 2012, MSSE 44(1):S61-67. We could use accelerometer signals...
  • Video of image classification using a support vector machine… Snoek 2009, Found Trends Inf Retr 2(4):215-322. Could also use images too...
  • Take Mediamill SVM slides showing the nuts 'n bolts of this… highlight groundtruth construction, etc. Doherty 2011, Comput Hum Behav 27(5):1948-1958. Event identification overview... (see the SVM classification sketch after the transcript)
  • Doherty, Am J Prev Med 2013, 43(5):320–323. http://ajpmonline.wordpress.com/2013/04/15/using-wearable-cameras-in-your-research/ When using wearable cameras in ethnography, we should: 1) Identify episodes/activities & 2) Categorise their behavioural type & context
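
The adults' self-report error slide reports paired mean differences with standard errors (e.g. all journeys +2 min 34 sec, S.E. 32 sec). A minimal Python sketch of that calculation, using made-up journey durations rather than the Kelly et al. data (all numbers below are illustrative only):

    import math

    # Hypothetical paired journey durations in minutes:
    # self-reported vs. derived from wearable-camera images.
    self_report = [20.0, 15.0, 30.0, 10.0, 25.0]
    camera      = [17.4, 13.1, 26.5,  9.0, 22.8]

    differences = [s - c for s, c in zip(self_report, camera)]
    n = len(differences)

    mean_diff = sum(differences) / n
    # Sample standard deviation (n - 1 denominator) and standard error of the mean.
    sd = math.sqrt(sum((d - mean_diff) ** 2 for d in differences) / (n - 1))
    se = sd / math.sqrt(n)

    print(f"Mean over-report: {mean_diff:.2f} min (S.E. {se:.2f} min)")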
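
The event-segmentation slides describe splitting a continuous day of wearable-camera data into episodes using the onboard sensor values ("quick segmentation"). The published approach is more involved; the following is only a minimal change-point sketch, assuming each image has an associated sensor vector (e.g. accelerometer and light readings). The threshold, minimum event length and toy data are assumptions, not the parameters from Doherty 2011.

    import math
    from typing import List, Sequence

    def dissimilarity(a: Sequence[float], b: Sequence[float]) -> float:
        """Euclidean distance between the sensor vectors of two adjacent images."""
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def segment_events(sensor_stream: List[Sequence[float]],
                       threshold: float = 1.5,
                       min_event_len: int = 10) -> List[int]:
        """Return image indices where a new event is judged to start.

        A boundary is placed wherever consecutive sensor vectors differ by more
        than `threshold`, provided the previous event already spans at least
        `min_event_len` images (so a day is not shattered into tiny fragments).
        """
        boundaries = [0]
        for i in range(1, len(sensor_stream)):
            if (dissimilarity(sensor_stream[i - 1], sensor_stream[i]) > threshold
                    and i - boundaries[-1] >= min_event_len):
                boundaries.append(i)
        return boundaries

    # Toy stream: 30 "still" readings followed by 30 "moving" readings.
    stream = [(0.1, 0.0)] * 30 + [(2.0, 1.5)] * 30
    print(segment_events(stream))   # -> [0, 30]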
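
The "how good is it?" slide reports an F1 score of 60% against users' semantic boundaries. A sketch of how such a boundary-level F1 can be computed, matching each detected boundary to at most one ground-truth boundary within a tolerance window (the tolerance value and toy boundaries here are assumptions, not those used in the paper):

    def boundary_f1(detected, groundtruth, tolerance=5):
        """Precision, recall and F1 for event boundaries.

        A detected boundary counts as a true positive if it lies within
        `tolerance` images of a not-yet-matched ground-truth boundary.
        """
        unmatched = list(groundtruth)
        true_pos = 0
        for b in detected:
            hit = next((g for g in unmatched if abs(g - b) <= tolerance), None)
            if hit is not None:
                unmatched.remove(hit)
                true_pos += 1
        precision = true_pos / len(detected) if detected else 0.0
        recall = true_pos / len(groundtruth) if groundtruth else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        return precision, recall, f1

    # Toy example (a real day averaged around 22 ground-truthed events).
    print(boundary_f1(detected=[0, 28, 61, 95], groundtruth=[0, 30, 60, 90, 120]))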
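
For event identification, the slides point to Mediamill-style support-vector-machine classification of image and sensor features. Below is a minimal scikit-learn sketch that trains an SVM on per-event feature vectors against manually coded activity labels; the feature layout, labels and parameters are placeholders rather than the pipelines of Staudenmayer 2012 or Doherty 2011, and it assumes numpy and scikit-learn are installed.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Hypothetical per-event feature vectors, e.g. [mean accelerometer counts,
    # count variance, mean image brightness, edge density]; the labels stand in
    # for the manually coded groundtruth described on the slides.
    X = np.array([
        [ 50, 10, 0.20, 0.10],   # sitting at a screen
        [ 40,  8, 0.30, 0.10],
        [600, 90, 0.70, 0.40],   # walking outdoors
        [650, 85, 0.80, 0.50],
        [120, 30, 0.40, 0.20],   # riding in a car
        [110, 25, 0.40, 0.30],
    ])
    y = ["screen use", "screen use", "walking", "walking", "car", "car"]

    # Scale features, then fit an RBF-kernel support vector machine.
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X, y)

    # Classify a new, unlabelled event.
    new_event = np.array([[580, 80, 0.75, 0.45]])
    print(clf.predict(new_event))   # expected: ['walking']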