Technology and Health Promotion: Implications for Clinical Psychology Practice

In this talk, given to clinical psychologists, I discuss trends in technology and how they will likely impact clinical practice.

Speaker notes
  • Discuss behavioral scientists' limited understanding of how to work with big data, and the opportunity to set up "in the wild" studies that could later be harnessed for A/B testing: a melding of behavioral science's knowledge of randomized controlled trials with HCI's knowledge of how to automate such systems in the real world.
  • Did a bit of ‘spreading’
  • To demonstrate this, let me first describe two models for developing interventions. On one side is the "top-down" approach, where content is generated by experts and experts evaluate the information. This is the model I have been working under and, I believe, the one most scientists have in mind when developing interventions. The strength of this method is that its techniques are rigorous, but it is also slow. In contrast, another model of intervention development could be called "crowd-sourcing." In a crowd-sourced solution, content is generated by the crowd and evaluated by the crowd. This of course creates a very fast system, but it is difficult to know whether the information is accurate or the intervention useful. This model, however, is increasingly gaining favor among technologists, and thus it is something we scientists must be mindful of. For example, who uses Encyclopedia Britannica's website? OK, who uses Wikipedia? Wikipedia is powerful because it is crowd-sourced, and we must be mindful of that if we want to find ways to get technology to use behavioral science.
  • Science has been a very thoughtful and deliberative process. We move slowly to be "certain" we know something. We are moving so slowly, however, that we are making ourselves obsolete. Take, for example, the pace of science. Here is a typical timeline for a large NIH-funded grant (the gold standard for health researchers). Compare this to the pace at which technology companies move. We need to do better; our old ways of thinking about behavior change, including our old theories, are frankly not up to the challenges and opportunities that mHealth technologies allow us.
  • In particular, when I was at Stanford, I pitched an idea for developing theoretically meaningful smartphone apps to increase physical activity and decrease sedentary behavior: the phone's accelerometer would passively track both behaviors and provide feedback to support behavior change. My mentor, Abby King, liked the idea, so I organized a team, and we ended up submitting and receiving an ARRA Stimulus grant to build this out.
  • Beyond the common elements, there are also unique elements for the three active applications, as identified here. The key idea of the study was to parse apart different ways to frame the information about physical activity and sedentary behavior. Rather than labor through this chart, I'm going to show you images from each of the applications to help you get a sense of how they are similar and different.
  • We have been exploring this in the MILES project, which stands for Mobile interventions for lifestyle exercise at Stanford. As stated earlier, the study is an NIH-funded challenge grant. We are currently finalizing development of our three applications. We plan to start our pilot study in January 2011.
  • Following this discussion, we started to think more deeply about the entire development model and, after several iterations, landed on nine core questions that can be chunked into three broad domains: defining the problem, designing the technology, and determining if it works. While this might seem obvious, it is interesting that, from a disciplinary-silo approach, many of these points are often not properly considered. To flesh this out further, we've identified these questions for defining the problem. As part of this, we've also articulated, in a paper currently under review, a variety of methods from a variety of disciplines to help answer these questions. For designing the technology, we believe these are the important questions, and finally, for determining if it works, we are interested in these questions.
  • First, here are the three "glance-able" displays for the applications. Although the information gathered is identical, minutes engaged in sedentary behavior and in MVPA, the way we display it is quite different in each app. For the cognitive app, we wanted to frame the information relative to goals, as this model assumes that behavior change occurs through active goal-setting and problem-solving, an active "cognitive" process. For the "affect" app, we are using a bird "avatar" as the method of tracking your activity. In this app, as you are more active, the bird flies faster, is happier, and becomes more playful. The idea is that a person will map the bird's mood, particularly as it becomes happier, onto their own mood, and thus form a link between being more active and feeling better. Finally, for the social app, you will notice that there are multiple stick figures on the home screen. The idea here is that a person will be motivated to be more active by the activity levels of other participants in the study, via social-norm motivations. These glance-able displays set up the differences between the three apps, but now I'm going to show you some more specific elements in each.
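The notes above describe melding behavioral science's randomized designs with HCI-style automation for "in the wild" A/B testing. As a minimal sketch only (every name, function, and the salt below are hypothetical illustrations, not a system from the talk), a deployed app could deterministically randomize each user to a variant and then compare outcomes between arms:

```python
import hashlib
import statistics

def assign_condition(user_id, conditions=("A", "B"), salt="pilot-1"):
    """Deterministically assign a user to a study arm by hashing the
    user id: the same user always lands in the same arm, with no
    assignment table to store."""
    digest = hashlib.sha256(f"{salt}:{user_id}".encode()).hexdigest()
    return conditions[int(digest, 16) % len(conditions)]

def ab_effect(outcomes_a, outcomes_b):
    """Difference in mean outcome between arms (B minus A), plus a
    rough pooled standard error for a z-style comparison."""
    mean_a = statistics.mean(outcomes_a)
    mean_b = statistics.mean(outcomes_b)
    se = (statistics.variance(outcomes_a) / len(outcomes_a)
          + statistics.variance(outcomes_b) / len(outcomes_b)) ** 0.5
    return mean_b - mean_a, se
```

Hash-based assignment keeps a user in the same arm across sessions; the difference in means with its standard error is only the simplest possible readout before a proper randomization-aware analysis.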

    1. New Technologies for Health Promotion: Implications for Clinical Practice. Dr. Eric Hekler, Assistant Prof, Nutrition & Health Promotion, Arizona State University. ehekler@asu.edu @ehekler www.designinghealth.org Presentation for University of Arizona "Psychology and Technology: Engaging your Clients" Workshop. Flickr-RansomTech
    2. Health is complex… Individual characteristics Over the lifespan
    3. Digital technologies are… Pervasive Interconnected Driven by algorithms Flickr – Stuck in Customs
    4. Outline: Digital Technologies – What has changed? What does this mean for evidence-based practice? Where are these technologies going?
    5. Outline: Digital Technologies – What has changed? What does this mean for evidence-based practice? Where are these technologies going?
    6. Cell phones are pervasive, personal, and powerful Flickr – lestaylorphoto
    7. The digital "cloud"/internet of things has come!
    8. The digital "cloud"/internet of things has come!
    9. The digital "cloud"/internet of things has come!
    10. Passive Sensing: Activity/Sleep Validation (Matt Buman, ASU; Max Utter, Jawbone); Geotagging (David Mohr, Northwestern)
    11. We are digitally connected Flickr – tychay
    12. Improving Online Support Groups: David McDonald, Erika Poole
    13. The process of creation is changing. Expert-sourced: content generated by "experts", information evaluated by experts, rigorous but slow (www.britannica.com). Crowd-sourced: content generated by "crowd", information evaluated by crowd, fast but inaccurate? (www.wikipedia.org). Flickr – miyagisan
    14. Quantified Self
    15. A DIY Self-Experimentation Toolkit for Behavior Change: Eric Hekler, Winslow "Win" Burleson, Jisoo Lee (Arizona State University); Bob Evans (Google)
    16. Algorithm = useful data-driven insight Flickr – Aray Chen
    17. "2024" "2014"
    18. What you need to know… • How our interventions work is changing – Other ways to provide interventions – Multi-level/multi-component interventions • Psychologist's role is going to change – Stepped care approach? – Sense-maker of the data deluge? • "Evidence-based" definition is changing – New methods for knowing what "works" – New focus on algorithms
    19. Outline: Digital Technologies – What has changed? What does this mean for evidence-based practice? Where are these technologies going?
    20. Timeline of a typical NIH-funded study, 2005-2012: conceive of a study, gather pilot data, submit grant, receive funding, conduct the study, submit publications for review; in the same period, the 500,000th app was accepted on the App Store. Flickr – Metrix X
    21. MILES Study • Develop theoretically meaningful smartphone apps for midlife & older adults • Physical activity & sedentary behavior • Passively assess PA & SB • Feedback for behavior change. Abby King
    22. Components by study arm (mConnect, mTrack, mSmiles, Calorific). Shared by all four arms: push component, pull component, "glance-able" display, passive activity assessment, real-time feedback, self-monitoring, "help" tab. In two arms: goal-setting, feedback about goals, problem-solving, variable reinforcement schedule. In three arms: reinforcement. In one arm each: attachment, "play", "jack pot" random reinforcement, social norm comparison. Hekler et al. 2011, Personal Informatics Workshop at CHI (design paper); King, Hekler, et al. 2013 PLoS One; King, Hekler, et al., manuscript in preparation
    23. Physical Activity - 8wk Results (chart): minutes per week of brisk walking and MVPA at study completion, by app (Analytic, Social, Affect; y-axis 0-300 min/week). Paired t[60] = 5.3, p < 0.0001. King, Hekler, et al. 2013 PLoS One
    24. Δ Food consumption (servings per day, chart): Physical Activity Apps vs. Control (diet-tracking intervention app) across processed foods, sweets, fatty meats, fatty foods, dairy, vegetables, and fruits; † and ** mark significant differences. Hekler, et al., manuscript in preparation
    25. Lessons learned… • Activity monitoring by phone alone is difficult • Each intervention had merit but was not potent • "Right" intervention at the "right" time and place? • RCT experimental design is not enough…
    26. Nine Questions for Intervention Development. DEFINE THE PROBLEM: 1) What is the problem you are targeting? 2) What influences the problem? 3) How can we help tackle the problem? DESIGN THE INTERVENTION: 4) How can technology support? 5) How should individual features work? 6) How is the intervention used and experienced? DETERMINE IF IT WORKS: 7) How are components working to impact the problem? 8) Is the intervention overall producing the desired outcome? 9) How can the intervention be made self-sustainable? Klasnja, Hekler, Froehlich, & Buman, manuscript submitted for publication
    27. Beyond the RCT Flickr – kerolic
    28. MOST. Linda Collins, methodology.psu.edu
    29. S.M.A.R.T. Trial. Slide from U. of Michigan, https://community.isr.umich.edu/public/jitai/Workshop.aspx
    30. Single Case Experimental Design. J. Dallery et al. 2013 Journal of Medical Internet Research, 15(2):e22
    31. Current smartphone apps…
    32. C.E.E.B.I.T. Mohr et al. 2013 Am J Prev Med, 45(4):517-523.
    33. Outline: Digital Technologies – What has changed? What does this mean for evidence-based practice? Where are these technologies going?
    34. Just in Time Adaptive mHealth Interventions. Co-PI: Daniel Rivera, ASU. Other collaborators: Matthew Buman, Marc Adams, & Predrag Klasnja
    35. Social Cognitive Theory Dynamical Model. Martin, Riley, Rivera, Hekler, et al., in press
    36. Dynamical Model Simulations. Riley, Martin, Rivera, Hekler, et al., manuscript submitted for publication
    37. Secondary Analysis – System Identification. Hekler, Buman, Rivera, et al., 2013, Health Education & Behavior.
    38. Microrandomization studies (chart): steps per day over roughly 250 days, with weekly averages and the timing of Interventions 1-4 and measurement periods.
    39. Just in Time Adaptive mHealth Intervention. Co-PI: Daniel Rivera, ASU. Other collaborators: Matthew Buman, Marc Adams, & Predrag Klasnja
    40. Other Adaptive Interventions: SMS Adaptive Intervention (Marc Adams); Reinforcement Learning (CS version) JITAI (Susan Murphy, Ambuj Tewari, Predrag "Pedja" Klasnja). Image and concept from U. of Michigan (Murphy, Tewari, et al.), presented at https://community.isr.umich.edu/public/jitai/Workshop.aspx
    41. IBM's Deep Blue and IBM's Watson. http://www.diabetespreventionsource.com/ Timothy Bickmore, Northeastern
    42. Digital tech is becoming increasingly powerful. It will change your practice… be ready. Flickr-RansomTech
    43. Open Questions… • What will the role of a psychologist be in the future? – Stepped care approach? – Sense-maker of the data deluge? • What will interventions look like in the future? • What can we do to ensure this technology does not become evil?
    44. Thank you! Flickr – veo_ For these slides visit: www.designinghealth.org ehekler@asu.edu @ehekler
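The dynamical models of Social Cognitive Theory mentioned on slides 35-36 can be illustrated with a toy sketch. This is not the published Martin/Riley/Rivera model; the gain and decay parameters below are made up for illustration. A first-order "inventory" simulation conveys the control-systems flavor: a construct such as self-efficacy fills in proportion to daily behavioral success and drains at a constant fractional rate.

```python
def simulate_self_efficacy(days, gain=0.3, decay=0.1, success=None):
    """Simulate a first-order inventory model: each day the level rises
    by `gain` times that day's success signal and loses a `decay`
    fraction of itself, settling toward gain/decay for constant input."""
    if success is None:
        success = [1.0] * days  # constant daily success signal
    level = 0.0
    trajectory = []
    for t in range(days):
        level += gain * success[t] - decay * level
        trajectory.append(level)
    return trajectory
```

With the default constant input, the trajectory climbs from 0 toward the steady state gain/decay = 3.0, which is the kind of qualitative behavior these models are fit to capture with real parameters and real inputs.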
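The microrandomization design on slide 38 can likewise be sketched. In a micro-randomized trial, each decision point (here, each day) gets its own randomization of whether to deliver the intervention, and the proximal effect is estimated by contrasting outcomes at treated versus untreated decision points. The simulator and step model below are hypothetical, and the difference-in-means estimator is only the simplest unadjusted readout.

```python
import random

def run_microrandomized_trial(n_days, prompt_prob, step_model, seed=0):
    """At each daily decision point, randomize whether to deliver a
    prompt (probability `prompt_prob`), then record that day's step
    count from `step_model(prompted)`. Returns the per-day log."""
    rng = random.Random(seed)
    log = []
    for day in range(n_days):
        prompted = rng.random() < prompt_prob
        log.append((day, prompted, step_model(prompted)))
    return log

def proximal_effect(log):
    """Unadjusted proximal-effect estimate: mean steps on prompted
    days minus mean steps on unprompted days."""
    prompted = [s for _, p, s in log if p]
    unprompted = [s for _, p, s in log if not p]
    return sum(prompted) / len(prompted) - sum(unprompted) / len(unprompted)
```

For example, a step model of `lambda p: 6000 + (1500 if p else 0)` over 250 days recovers a proximal effect of 1500 steps, mirroring the roughly 250-day steps-per-day chart on the slide.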
