
Agile Science

Summary talk of the underlying philosophy, guiding principles, targeted behavior change products, and process of agile science for creating, optimizing, repurposing, and curating tools and evidence.

Published in: Science

  1. @ehekler, ehekler@gmail.com · www.agilescience.org · Keynote @ #ISRII2017 · Eric Hekler, PhD, Associate Professor, Arizona State University; Associate Professor, University of California, San Diego (Dec 2017 onward)
  2. Thank you! @ehekler • Predrag (Pedja) Klasnja, John Harlow, Elizabeth Korinek, Sayali Phatak, Bill Riley, Daniel Rivera, Mathew Buman, Kevin Patrick, Bob Evans, Cesar Martin, Jennifer Huberty, Marios Hadijamichael • Linda Collins & MOST • The Robert Wood Johnson Foundation • DISCLAIMER: I am a scientific advisor to eEcoSphere, Proof Pilot, Omada Health, HopeLab, & Sage Bionetworks
  3. @ehekler · People are different. Context matters. Things change.
  4. Summary @ehekler
     • Goal: knowledge accumulation to support behavior change
     • Problems: evidence created vs. needed; complex causal problem vs. simple causal philosophy
     • “Building blocks”: modules, computational models, decision policies, tools for personalization
     • Activities in the process: creating, optimizing, repurposing, curating
  5. Crisis of methods
  6. Crisis of methods · “Most scientists are not trained today on the basics of epistemology or logic… We need to go back to work on the basics.” -Dr. Arturo Casadevall, Johns Hopkins Bloomberg School of Public Health
  7. Supply & demand problem @ehekler · McNie, Parris, Sarewitz, 2016, Research Policy
     |                | Demanded: Yes | Demanded: No |
     | Supplied: Yes  | Success       | Problem      |
     | Supplied: No   | Need          | Success      |
  8. Patients: What do I do now? https://pixabay.com/p-690128/?no_redirect
  9. Practitioners: What do I do now?
  10. Policy-makers: What do we do now? Harlow, Hekler, Johnston, Yeh, under review @ehekler
  11. What should I (or my client) do now in this context to produce the desired outcome(s)? @ehekler
  12. https://www.guideline.gov/summaries/summary/39432/diagnosis-and-treatment-of-depression-in-adults-2012-clinical-practice-guideline
  13. Usable evidence @ehekler · “From evidence-based decision-making to decision-based evidence-making.” Margaret Laws, HopeLab
  14. (Plausibly) meaningful variability @ehekler · Universal · Sub-group · Idiosyncratic · Context-bound
  15. User’s needed evidence (based on the question): Universal, Sub-group, Context-bound, Idiosyncratic · vs. evidence generated (based on study designs used): Universal, Sub-group, Context-bound*, Idiosyncratic* (*largely considered “noise”) @ehekler
  16. What is causality? How do we infer it? · Cause preceded effect · Cause related to effect · No alternative explanations
  17. Can a cause/effect occur without a human? https://commons.wikimedia.org/wiki/File%3AIf_a_tree_falls_in_the_forest.jpg
  18. Can a cause/effect occur one time only?
  19. Can a cause/effect occur only sometimes?
  20. INUS condition: [Preconditions] Insufficient but Necessary parts of a condition which is itself [Mechanism of action] Unnecessary but Sufficient. Mackie, J. L., 1965. “Causes and Conditions”, American Philosophical Quarterly, 2: 245–264.
  21. Pre-conditions: when, where, for whom, and in what state will a given intervention produce the desired outcome? @ehekler · Hekler et al. 2016, AJPM
  22. Fundamental mechanism of action @ehekler
  23. www.agilescience.org @ehekler
  24. Guiding principles @ehekler · Efficiency · Continuous optimization · Usability · Triangulation
  25. Agile Science tools @ehekler · Modules · Computational models · Decision policies · Tools of personalization
  26. Complex interventions vs. modules: from perfect “packages” (Flickr - Paul Swansen) to repurposable pieces (Flickr - Benjamin Esham) @ehekler www.agilescience.org
  27. Complex interventions @ehekler
  28. Modules @ehekler · Inputs → Process → Output
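The Inputs → Process → Output schematic on this slide can be sketched in code. This is a hypothetical example: the module name, inputs, and threshold are illustrative, not taken from the talk.

```python
from dataclasses import dataclass

# A behavior-change module as an input -> process -> output unit,
# mirroring the slide's schematic. Names and threshold are invented
# for illustration.

@dataclass
class WalkingReminderModule:
    """Outputs a walk prompt when the user is inactive and available."""
    inactivity_threshold_min: int = 60

    def process(self, minutes_inactive: int, is_available: bool) -> str:
        # Inputs: sensed state. Output: an action the system can execute.
        if is_available and minutes_inactive >= self.inactivity_threshold_min:
            return "prompt_walk"
        return "do_nothing"

module = WalkingReminderModule()
print(module.process(minutes_inactive=90, is_available=True))  # prompt_walk
```

Framing a module this way makes its proximal outcome (walking after a prompt) testable on its own, independent of any larger intervention package.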
  29. Modularizing health interventions @ehekler · Pain Reduction Tool · Goal-setting Tool · Walking Reminder Tool · Social Support Tool · Glucose Monitoring Tool · Insulin Dosage Tool
  30. Proximal outcomes of the module: the shortest timescale for measuring a meaningful effect @ehekler · Prompt to Walk → Walk within 30 min of prompt → Steps/Day → National Guidelines (PA/wk) → Cardiovascular Fitness (vO2) → CVD · Proximal outcomes are often skipped/ignored · www.agilescience.org
  31. Computational models: linking interventions, individuals, context, & outcomes. Riley, Martin, Rivera, Hekler, et al. 2016; Martin, Riley, Rivera, Hekler, et al. 2014 @ehekler
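A computational model of this kind can be as simple as a first-order dynamical system. The sketch below is in the spirit of the control-systems papers cited on this slide, but the structure, coefficients, and baseline are my assumptions for illustration, not the published model.

```python
# Toy first-order model: behavior is pulled toward baseline, with an
# intervention input adding prompted steps. All parameters are
# illustrative assumptions.

def simulate(days, intervention, gain=0.3, decay=0.8, baseline=5000.0):
    """steps[t] = baseline + decay*(steps[t-1] - baseline) + gain*intervention[t]"""
    steps = [baseline]
    for t in range(1, days):
        s = baseline + decay * (steps[-1] - baseline) + gain * intervention[t]
        steps.append(s)
    return steps

# One week of an intervention worth ~2,000 prompted steps/day, then withdrawal:
u = [0] * 3 + [2000] * 7 + [0] * 20
traj = simulate(len(u), u)
# Behavior rises toward a new steady state while the intervention is on,
# then decays back toward baseline once it is withdrawn.
```

Even a toy model like this makes predictions (rise, steady state, decay) that can be compared against a person's data and refined.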
  32. Decision policies: matchmaking interventions with individual & contextual differences. www.netflix.com @ehekler
  33. Decision policies · Martin, Rivera, & Hekler, Am. Control Conference (2015) @ehekler
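A minimal decision policy can be written as a rule set mapping the current person/context to an intervention module. The rules and module names below are made up for illustration; real policies could equally be learned from data.

```python
# Hypothetical rule-based decision policy: given the current context,
# pick which intervention module to deliver. All rules are illustrative.

def decide(context: dict) -> str:
    if context.get("steps_today", 0) >= context.get("goal", 10_000):
        return "send_praise"            # goal met: reinforce
    if context.get("weather") == "rain":
        return "suggest_indoor_walk"    # tailor to context
    if context.get("minutes_inactive", 0) > 60:
        return "prompt_walk"
    return "do_nothing"

print(decide({"steps_today": 2500, "goal": 10_000, "weather": "rain"}))
# suggest_indoor_walk
```

The point of the slide's Netflix analogy is exactly this matchmaking step: the policy, not the module, carries the knowledge of who gets what, when.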
  34. Tools of personalization: “learning” adjustments when previous evidence does not match @ehekler · Engineering (adaptive control) · CS (e.g., reinforcement learning)
  35. Tools of personalization: “learning” adjustments when previous evidence does not match @ehekler · Self-experimentation loop: Goal + Plan → Implement for 1 week → Measure success towards goal → Results
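The reinforcement-learning flavor of personalization mentioned above can be sketched as an epsilon-greedy bandit: try intervention variants, keep a running estimate of each one's success for this person, and increasingly favor what works. The variant names and simulated response rates are illustrative assumptions, not data from the talk.

```python
import random

def epsilon_greedy(variants, reward_fn, rounds=2000, epsilon=0.1, seed=0):
    """Pick variants, mostly exploiting the best estimate so far."""
    rng = random.Random(seed)
    counts = {v: 0 for v in variants}
    values = {v: 0.0 for v in variants}
    for _ in range(rounds):
        if rng.random() < epsilon:
            v = rng.choice(variants)                     # explore
        else:
            v = max(variants, key=lambda x: values[x])   # exploit
        r = 1.0 if reward_fn(v, rng) else 0.0
        counts[v] += 1
        values[v] += (r - values[v]) / counts[v]         # incremental mean
    return max(variants, key=lambda x: values[x])

# Simulated person who responds best to morning prompts:
true_rates = {"morning_prompt": 0.6, "evening_prompt": 0.4, "no_prompt": 0.2}
best = epsilon_greedy(list(true_rates),
                      lambda v, rng: rng.random() < true_rates[v])
print(best)  # converges on the best-responding variant for this person
```

The self-experimentation loop on the next slide is the human-in-the-loop version of the same idea: plan, try for a week, measure, and adjust.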
  36. User’s needed evidence @ehekler · Universal · Sub-group · Context-bound · Idiosyncratic · What should I (or my client) do now in this context to produce the desired outcome(s)?
  37. Agile Science process: using evolution as a model for the scientific process @ehekler www.agilescience.org
  38. Modeling evolution @ehekler · Variability generation = Create · Natural selection = Optimize · Niche expansion = Repurpose
  39. @ehekler
  40. Create: design specific solutions for specific problems @ehekler
  44. Training guide available now! www.agilescience.org/resources.html
  45. Optimize: engineer until success (i.e., the optimization criteria) is met @ehekler
  46. Linda M. Collins, The Methodology Center, Penn State, methodology.psu.edu @ehekler
  47. Optimization trials · Screening experiment · SMART · Micro-randomization trial · Control systems optimization trial
  48. Daily step goal & rewards · Hekler (PI), Rivera (Co-PI), NSF IIS-1449751 @ehekler · [Chart: recommended goal and actual daily steps (0–15,000) plotted alongside average change in self-efficacy (-15 to +20)]
  49. Optimization criteria @ehekler · Hekler et al., under review · Initiation “set-point”: 10,000 steps/day, on average per week, for 22 of 26 weeks, OR +3,000 steps/day on average per week relative to baseline for 22 of 26 weeks · Maintenance set-point: same steps set-point, with 0 interactions with the participant except use of a wearable device
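The initiation set-point on this slide is concrete enough to encode directly. A minimal checker, assuming weekly step averages are already computed:

```python
# Encodes the slide's initiation set-point: 10,000 steps/day on average
# per week (or +3,000/day over baseline) for 22 of 26 weeks. The
# function name and signature are my own.

def meets_initiation_setpoint(weekly_avg_steps, baseline=None,
                              absolute=10_000, relative=3_000,
                              weeks_required=22, weeks_total=26):
    weeks = weekly_avg_steps[:weeks_total]
    hit_abs = sum(avg >= absolute for avg in weeks)
    if hit_abs >= weeks_required:
        return True
    if baseline is not None:
        hit_rel = sum(avg >= baseline + relative for avg in weeks)
        return hit_rel >= weeks_required
    return False

# 24 good weeks out of 26 clears the 22-week bar:
weekly = [10_500] * 24 + [6_000] * 2
print(meets_initiation_setpoint(weekly))  # True
```

Making the criterion executable is part of the point of optimization criteria: "success" stops being a judgment call and becomes a test the system either passes or fails.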
  50. Model-predictive controller · Martin, Rivera, & Hekler, Am. Control Conference (2015; 2016) @ehekler
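The closed-loop idea behind the controller can be illustrated with a much simpler rule: set tomorrow's recommended step goal from recent actual steps, nudging toward the set-point. This is a toy proportional rule of my own, not the published model-predictive controller.

```python
# Toy closed-loop goal adjustment: recommend a goal slightly above
# current behavior, capped at the set-point and floored at a minimum.
# All parameters are illustrative assumptions.

def next_goal(recent_steps, setpoint=10_000, step_up=500, floor=2_000):
    """Recommend tomorrow's step goal from recent actual daily steps."""
    avg = sum(recent_steps) / len(recent_steps)
    if avg >= setpoint:
        return setpoint                  # at/above set-point: hold
    goal = min(avg + step_up, setpoint)  # ambitious but doable
    return max(int(goal), floor)

print(next_goal([6_200, 5_800, 6_500]))  # 6666
```

An MPC controller does the same thing in a principled way: it uses a model of the person to predict several days ahead and picks the goal sequence that best tracks the set-point.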
  51. Control engineering optimization trial · Open loop vs. closed loop · Maintenance @ehekler
  52. Repurpose: determine (generalize) who, when, and where else the tool might be useful @ehekler
  53. INUS condition: [Preconditions] Insufficient but Necessary parts of a condition which is itself [Mechanism of action] Unnecessary but Sufficient. Mackie, J. L., 1965. “Causes and Conditions”, American Philosophical Quarterly, 2: 245–264.
  54. Modularizing 1) Cutting out pre-conditions: when, where, for whom, and in what state will a given intervention produce the desired outcome? @ehekler · Hekler et al. 2016, AJPM
  55. Modularizing 2) Distill the mechanism of action from variations @ehekler
  56. Modularizing 2) Distill the mechanism of action from variations
  57. Science of matching/generalization · Does it remain true across variations among other people, places, times, treatments? · Is it predictive of the future for that same person/unit of study? · Shadish, Cook, & Campbell, 2002
  58. Science of matching · Intervention constructs & operations · Optimization criteria · Niche definition
  59. Complexity map
  60. Science of matching · Meaningful variations in hormone replacement therapy · Meaningful definitions of success · Meaningful clusters of people, places, times (i.e., niches)
  61. Pragmatic clinical trials? · Implementation science · Scaling up and scaling out · Connection? ACTS? Others? LOVE TO HEAR YOUR THOUGHTS!
  62. Curate: organize information to make it accessible for decision-making @ehekler
  63. The Human Behaviour-Change Project · A Collaborative Award · @HBCProject www.humanbehaviourchange.org · Adapted from Susan Michie’s slides: http://www.ucl.ac.uk/human-behaviour-change
  64. Human Behaviour-Change Project: computer science + information science + behavioural science · An ontology of behaviour change interventions (How can we organise the evidence?) · Extracting and interpreting the evidence (What does the evidence show?) · Making the evidence accessible at scale in real time (How can we make the evidence usable?) · Adapted from Susan Michie’s slides: http://www.ucl.ac.uk/human-behaviour-change
  65. Extracting · Image courtesy of Kai Larsen
  66. Organizing · Larsen, Michie, Hekler et al. 2017
  67. Using · “The big question”: What works, compared with what, how well, with what degree of exposure, for whom, in what settings, with what behaviours, and why? · Adapted from slides from Robert West: http://www.ucl.ac.uk/human-behaviour-change
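The "big question" above is, in effect, a query over a structured evidence base. A minimal sketch, in which the record fields, values, and effect sizes are all invented for illustration:

```python
# Hypothetical structured evidence records; every field and value here
# is made up to show the query shape, not real data.

evidence = [
    {"technique": "goal_setting", "population": "adults",
     "setting": "workplace", "behavior": "walking", "effect": 0.35},
    {"technique": "goal_setting", "population": "older_adults",
     "setting": "home", "behavior": "walking", "effect": 0.10},
    {"technique": "self_monitoring", "population": "adults",
     "setting": "workplace", "behavior": "walking", "effect": 0.22},
]

def what_works(records, **criteria):
    """Answer 'what works, for whom, in what setting' over the records."""
    matches = [r for r in records
               if all(r.get(k) == v for k, v in criteria.items())]
    return sorted(matches, key=lambda r: r["effect"], reverse=True)

top = what_works(evidence, population="adults", setting="workplace")
print(top[0]["technique"])  # goal_setting
```

The ontology work described on the previous slides is what would make such fields and values consistent enough across studies for a query like this to be meaningful.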
  68. Summary @ehekler
     • Goal: knowledge accumulation to support behavior change
     • Problems: evidence created vs. needed; complex causal problem vs. simple causal philosophy
     • “Building blocks”: modules, computational models, decision policies, tools for personalization
     • Activities in the process: creating, optimizing, repurposing, curating
  69. Open questions @ehekler
     • What does science look like when people are different, context matters, and things change?
     • What about citizen-led science?
     • What does a 21st-century scientist do? Science of matching; empower citizens/practitioners
     • How might funding look different?
  70. @ehekler, ehekler@gmail.com · www.agilescience.org · Eric Hekler, PhD, Associate Professor, Arizona State University; Associate Professor, University of California, San Diego (Dec 2017 onward)
