
RecSys 2016 Talk: Feature Selection For Human Recommenders


Slides from the talk I gave at the Recommender Systems 2016 Conference in Boston.


  1.
  2.
  3.
  4. Human Computation At Stitch Fix
  5. ■ Heavy and repetitive computation ■ Large-scale working memory ■ Large-scale long-term memory ■ Context sensitivity/nuance ■ Aesthetic judgements ■ Relationship building ■ Novel inferences ■ Unstructured data
  6. ■ Processes information to make recommendations ■ Can specify internal mechanisms ■ Can specify the data being used ■ Recommendations improve with better features (data) ■ Needs to be trained and tuned ■ Comes with internal mechanisms ■ Can consider the entire world
  7. ■ Processes information to make recommendations ■ Can specify internal mechanisms ■ Can specify the data being used ■ Recommendations improve with better features (data) ■ Needs to be trained and tuned ■ Comes with internal mechanisms ■ Can consider the entire world
  8. ■ Determine what they’re processing ■ Determine what they should be processing ■ Change/shape what they’re processing
  9. ■ Determine what they’re processing ■ Determine what they should be processing ■ Change/shape what they’re processing ■ Make more recommendations ■ Deliver those recommendations ■ Receive feedback
  10. 1: Determining what they’re processing
  11. Figure out what your human workers are attending to while they make their recommendations; if they aren’t attending to a feature, then they’re not making recommendations off of it. If someone isn’t attending to something but you’re showing it anyway, you might: ■ Make your worker less efficient (slower) ■ Fatigue them (unnecessary filtering) ■ Lose opportunities to include something more useful
  12. Exploration
  13. ‘Online’ Observations. What you get: ○ Ability to reduce the hypothesis space ○ Higher-granularity observations ○ Time-dependent observations (when is something considered)
  14. Mouse Tracking: a cheap measure of attention; non-invasive; easy widespread deployment
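A minimal sketch of how logged mouse data could be turned into an attention measure. The event log and its columns (worker_id, feature_region, hover_start, hover_end) are assumptions for illustration, not the talk's actual pipeline.

    # Aggregate hypothetical mouse-hover events into dwell time per feature region.
    library(dplyr)

    hover_log <- read.csv("hover_events.csv")   # assumed export of UI hover events

    dwell <- hover_log %>%
      mutate(dwell_sec = hover_end - hover_start) %>%
      group_by(worker_id, feature_region) %>%
      summarise(total_dwell_sec = sum(dwell_sec),
                n_hovers        = n(),
                .groups = "drop")

    # Regions with near-zero dwell are candidates for features workers never attend to.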
  15. Eye Tracking
  16. “[Visual] search patterns lie somewhere between random and systematic… humans will attempt a more systematic search, but will still suffer from imperfect memory.” (Nickles et al., 2003)
  17. Eye Tracking: resistant to strategy; deterministic; higher accuracy
  18. AREA OF INTEREST (AOI). Eye Tracking: resistant to strategy; deterministic; higher accuracy
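One way the AOI idea could be operationalized, sketched with an assumed fixation table (trial_id, worker_id, aoi, start_ms, duration_ms); dedicated packages such as eyetrackingR (cited on slide 32) offer richer tooling.

    # Summarise fixations into (a) share of trial time on each AOI and
    # (b) when each AOI is first fixated - all column names are assumptions.
    library(dplyr)

    fixations <- read.csv("fixations.csv")

    aoi_summary <- fixations %>%
      group_by(trial_id, worker_id, aoi) %>%
      summarise(total_fix_ms = sum(duration_ms),
                first_fix_ms = min(start_ms),
                .groups = "drop") %>%
      group_by(trial_id, worker_id) %>%
      mutate(prop_of_trial = total_fix_ms / sum(total_fix_ms)) %>%
      ungroup()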
  19.
  20. Features You Want To Select!
  21. 2: Determining what they should be processing
  22. Given the features that they’re using, which ones produce the best recommendations? You’re interested in overall performance and can optimize for whatever is most important to you: ■ True hits, false positives, false negatives ■ Processing time
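As a rough illustration of "optimize for whatever is most important to you", the sketch below tallies hits, false positives, false negatives, and processing time per experimental cell; the table and column names (cell, recommended, good_outcome, rt_sec) are assumed, not from the talk.

    # Per-cell performance summary for a hypothetical recommendations table.
    library(dplyr)

    trials <- read.csv("trials.csv")

    trials %>%
      group_by(cell) %>%
      summarise(true_hits       = sum(recommended == 1 & good_outcome == 1),
                false_positives = sum(recommended == 1 & good_outcome == 0),
                false_negatives = sum(recommended == 0 & good_outcome == 1),
                median_rt_sec   = median(rt_sec),
                .groups = "drop")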
  23. The Logic: ○ Workers may vary in what features they use ○ Look for correlations between attention to features and positive metrics. Allows you to learn the optimal features among your current candidates
  24. Feature Drop-Out Studies
  25. Feature Drop-Out Studies: A/B
  26. Feature Drop-Out Studies, Logic: Show a feature to one cell and remove it for another. If a positive difference in performance is observed, then that feature promotes better outcomes
  27. Feature Drop-Out Studies, Optimal Conditions: A highly controlled “offline” environment ○ Allows for true participant randomization ○ Allows for repeated measures ○ Allows for high “internal validity”
  28. ○ Task-relevant background information (optional) ○ Ability to provide a response - track accuracy, RT, confidence, etc. ○ Trial-specific stimuli - use historic data with known outcomes
  29. Correct ~ Condition + (1|participant_id). Condition differences → the feature promotes better recommendations. No condition differences → the feature either isn’t considered or makes no difference to recommendations if it is
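The slide's formula is written in lme4 style; a minimal way to fit it for a binary "correct" outcome might look like the sketch below. The data frame and column contents are assumed, and the logistic link is one reasonable choice the slides don't specify.

    # Mixed-effects model: condition effect with a random intercept per participant.
    library(lme4)

    dropout_data <- read.csv("dropout_study.csv")   # correct (0/1), condition, participant_id

    fit <- glmer(correct ~ condition + (1 | participant_id),
                 data   = dropout_data,
                 family = binomial)

    summary(fit)   # a reliable condition effect suggests the dropped feature matters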
  30. Further Use Of ‘Online’ Observations. What you get: ○ Ability to determine whether there are certain times at which certain features are beneficial ○ Ability to figure out how information is searched for
  31. Start with a study to determine correlations. Example items: Status: loved, Department: top, Color: purple; Status: loved, Department: dress, Color: green; Status: hated, Department: pants, Color: orange; Status: ..., Department: ..., Color: ...
  32. Multiple metrics possible: ■ Overall trajectories (http://www.eyetracking-r.com/) ■ Saccade patterns ■ Fixation times and locations. Correlate with success
  33. correct ~ fixated_on_loves + fixated_on_color_matches + … + (1|participant_id). Factors predict success → attention to those features may promote better recommendations. Factors don’t predict success → attention to those features may make no difference to recommendations
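The correlational model on this slide could be fit the same way, with per-trial fixation indicators as predictors. fixated_on_loves and fixated_on_color_matches come from the slide's formula; the data frame and everything else here is assumed.

    # Do trial-level attention measures predict correct recommendations?
    library(lme4)

    gaze_data <- read.csv("gaze_study.csv")

    fit_gaze <- glmer(correct ~ fixated_on_loves + fixated_on_color_matches +
                        (1 | participant_id),
                      data   = gaze_data,
                      family = binomial)

    summary(fit_gaze)   # predictive factors become candidates for the causal
                        # follow-up experiment on slide 34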
  34. correct ~ condition + … + (1|participant_id). Follow up with a full experiment to determine whether the behavior actually causes better recommendations. Manipulation congruent with ‘positive’ behaviors
  35. 3: Shaping What They’re Processing
  36. Controlled Lab Study; Full A/B Test
  37. Stitch Fix’s “Styling Lab”; Full A/B Test in the live styling environment
  38. Behavior Shaping : Humans :: Tuning : Computers/Algorithms. Can be “in the moment”: ● UX Changes ● Directed Attention. Can be more sustained: ● Training
  39. Change how the information is displayed - exploit human perception (consult UX)
  40. Experimental Approach! Testing: ● Create questions relevant to what you want to train ● Have participants complete them ● Use IRT to determine question difficulty. Training: ● Order questions by difficulty ● Have those being trained complete them in that order ● Give feedback on performance along the way ● Reinforce key concepts
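For the testing step, question difficulty could be estimated with a Rasch (1PL) IRT model; the ltm package is one possible tool (the slides don't name one), and the response file layout is assumed.

    # Estimate item difficulty and order training questions from easiest to hardest.
    library(ltm)

    # responses: one row per participant, one 0/1 column per question (assumed layout)
    responses <- read.csv("question_responses.csv")

    irt_fit    <- rasch(responses)
    difficulty <- coef(irt_fit)[, "Dffclt"]      # per-question difficulty estimates

    training_order <- names(sort(difficulty))    # easiest first, per the training recipe
    training_order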
  41. This approach is grounded in cognitive research! Progressive Alignment prescribes giving people tasks that they’re more likely to succeed at, then progressively making those tasks harder
  42. ■ Processes information to make recommendations ■ Can specify internal mechanisms ■ Can specify the data being used ■ Recommendations improve with better features (data) ■ Needs to be trained and tuned ■ Comes with internal mechanisms ■ Can consider the entire world
  43. Questions?
