Using the Crowd to Understand and Adapt User Interfaces



Keynote given at the Engineering Interactive Computing Systems (EICS) conference. June 25, 2013

Abstract: Engineering user interfaces has long implied careful design carried out using formal methods applied by human experts and automated systems. While these methods have advantages, especially for creating interfaces that have the flexibility to adapt to users and situations, they can also be time consuming, expensive, and there are relatively few experts able to apply them effectively. In particular, many engineering methods require the construction of one or more models, each of which can only be created through many hours of work by an expert. In this keynote, I will explore how social and human computation methods can be applied to reduce the barriers to achieving user interface flexibility and ultimately to using engineering methods. In a first example, I will illustrate how groups of users can work together to modify and improve user interfaces through end-user programming examples from the CoScripter and Highlight projects. I will then discuss some initial work on using a crowd of novice workers to create models of existing user interfaces. I hope this keynote will inspire the engineering community to consider alternate approaches that creatively combine formal methods with the power of crowds.

Published in: Technology, Education


Slide notes:
  • Very happy to be here. My thesis research and much of the work that I’ve done since my Ph.D. has been in this area, more or less. Followed this conference since the beginning; had the honor of being one of the papers chairs in 2010.
  • I recognize that not all engineering of interactive systems requires models, but from looking through the proceedings of EICS, I think that’s most of what this community does.
  • Charles Babbage was involved with the British Nautical Almanac. The Mathematical Tables Project (a WPA project), begun in 1938, calculated tables of mathematical functions and employed 450 human computers. This is the origin of the term “computer.” What is new is the easy and affordable access to on-demand labor for tasks like these.
  • Thousands, if not millions, of people collecting and categorizing knowledge – could a similar approach be useful for
  • Launched in July 2007. 1M galaxies imaged; 50M classifications in the first year, from 150,000 visitors. Focuses on classification as opposed to manipulation. Has discussion and other features to help new classifiers get up to speed and improve accuracy.
  • FoldIt: a protein-folding game. Beyond classification, here crowd workers are playing with and manipulating models to solve real problems. Amateur scientists have found protein configurations that eluded scientists for years.
  • Motivation can certainly be a problem with crowdsourcing tasks. One solution is to simply pay people.
  • JAC: I moved the “Interact” bullet up. I think the original ordering is the “systems ordering” of how things happen, but this new ordering is more natural for understanding what we’re doing. The most important fact, that it executes tasks, is on top, and then the clarification follows: on which task, etc. I felt it was a bit awkward to point to the clarification before stating what Coco would need clarification for. TL: That’s great. My first version had it at the top too.
  • Given a mostly complete understanding of what was possible in the existing interface, we should be able to create a very different new interface that does the same thing, and translate it to the old interface.
  • New functionality – Soylent. Adapt – CoScripter, Highlight, CoCo. Understand – CoScripter, intern project.

    1. Using the Crowd to Understand and Adapt User Interfaces
        Jeffrey Nichols, IBM Research –
    2. Engineering!
    3. Benefits of Model-Based Engineering
        • Develop for multiple devices simultaneously (e.g., web & mobile)
        • Create personalized interfaces: accessibility (physical, cognitive, etc.), previous experience
        • Compose interfaces from different components never before used together
        • And many more…
    4. “... the main shortcoming of the model-based approach is that building models is difficult. A rich model describing all the features of an interface is a complex object, and hence non-trivial to specify.” Puerta and Szekely, CHI 1994; Vanderdonckt, i-com 2011
    5. Why is building models difficult?
        Requires humans (mostly)
        • If no interface exists, must investigate existing practice to determine tasks, etc.
        • Interpreting and understanding human behavior is hard for computers
        • Automation may be possible if an interface already exists, e.g., [Gimblett & Thimbleby, EICS 2010]
        Requires experts
        • Knowledge of abstract concepts and formalisms
        • Knowledge of specific languages and conventions
        • Tools may decrease the amount of expertise needed
    6. Questions: If we use a crowd…
        • Do we need models?
        • Do we need experts to create models?
    7. Outline
        • Brief overview of relevant crowdsourcing research
        • Examples using the crowd to adapt & understand interfaces: CoScripter, Highlight & CoCo, Crowd-Created Models
        • Conclusions
    8. Crowdsourcing: A Brief Overview (adapted from materials by Michael Bernstein and others)
    9. Origin of the term: “Taking [...] a function once performed by employees and outsourcing it to an undefined (and generally large) network of people in the form of an open call.” Jeff Howe, WIRED, 2006
    10. Is this concept new?
        • In 1760, the British Nautical Almanac distributed the work of creating navigational charts through postal mail.
        • Work was computed by two independent workers and verified by a third.
        • The process has been repeated since for other large calculations, e.g., the Mathematical Tables Project.
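The Almanac's two-independent-workers-plus-a-verifier scheme maps directly onto modern redundant crowdsourcing. A minimal sketch in Python; the worker functions here are hypothetical stand-ins for human computers (or paid crowd workers):

```python
# Sketch of the Nautical Almanac's redundancy scheme: two workers compute
# the same value independently; a third adjudicates only on disagreement.
# All "workers" below are hypothetical stand-ins for human computers.

def verify(task, worker_a, worker_b, verifier):
    """Return a result only after independent agreement or adjudication."""
    a, b = worker_a(task), worker_b(task)
    if a == b:
        return a                     # independent agreement: accept
    return verifier(task, a, b)      # disagreement: third worker decides

# Usage: computing a (trivial) table entry.
result = verify(12,
                worker_a=lambda n: n * n,        # careful worker
                worker_b=lambda n: n * n + 1,    # worker who slipped
                verifier=lambda n, a, b: n * n)  # verifier recomputes
# result == 144
```

The same agree-or-escalate structure appears in Mechanical Turk quality-control pipelines today.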
    11. Recent Examples…
    12. Paid Crowdsourcing: pay small amounts of money for short tasks
        Example site: Amazon Mechanical Turk
        • Roughly five million tasks completed per year at 1-5¢ each [Ipeirotis 2010]
        • Population: 40% U.S., 40% India, 20% elsewhere
        • Gender, education, and income are close mirrors of overall population distributions [Ross 2010]
    13. Major Topics of Research
        • Crowd algorithms [Little et al., HCOMP 2009]
        • Incentives and quality [Mason and Watts, HCOMP 2009; Dow et al., CSCW 2012]
        • Crowd-powered systems [Bernstein et al., UIST 2010; Bigham et al., UIST 2010]
        • AI for HCOMP [Dai, Mausam & Weld, AAAI 2010]
        • Complex work [Kittur et al., UIST 2011]
    15. Crowdsourcing Algorithms: essentially a workflow where each step may be performed by a different worker
        • Iterative algorithms [Little et al. 2009]
        • Digital assembly line (CrowdFlower)
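The iterative pattern from Little et al. can be sketched as a loop: one worker improves the current artifact, then several workers vote on whether the new version is better. This is a simplified sketch, not the TurKit implementation; `improve` and `vote` are simulated workers, where a real system would post each call as a separate task:

```python
import random

def iterative_improve(artifact, improve, vote, rounds=3, voters=3):
    """Improve-then-vote crowd workflow. `improve` and `vote` stand in
    for crowd workers; each call could go to a different person."""
    for _ in range(rounds):
        candidate = improve(artifact)
        # Majority vote: keep the candidate only if most voters prefer it.
        ballots = [vote(artifact, candidate) for _ in range(voters)]
        if sum(ballots) * 2 > voters:
            artifact = candidate
    return artifact

# Simulated task: iteratively correct a garbled phrase.
target = "engineering interactive systems"
improve = lambda text: target if random.random() < 0.5 else text
vote = lambda old, new: new == target  # voters prefer the correct phrase

random.seed(0)  # seed so this demo run is repeatable
print(iterative_improve("enginering intractive systems", improve, vote, rounds=10))
# → engineering interactive systems
```

The key design point is that no single worker needs to produce a good result; the vote step filters out bad edits.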
    16. Crowd-Powered Systems: embed crowd intelligence inside of the user interfaces and applications we use today (diagram labels: User Interface, Wizard of Oz)
    17. Crowd-Powered Systems: embed crowd intelligence inside of the user interfaces and applications we use today (diagram label: Wizard of Crowd)
    19. Real-time Crowdsourcing: using recent techniques, it is now possible to harness crowd workers to solve tasks in near real-time [Bernstein et al. UIST ’11; Lasecki et al. UIST ’11 and UIST ’12]. Example: real-time captioning using shotgun gene sequencing techniques [Lasecki et al. UIST ’12]
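The shotgun-sequencing analogy works roughly like this: several workers each type fragments of the audio, and the system stitches fragments together by finding word-level overlaps, much as overlapping reads are assembled into a genome. A toy sketch of the stitching step (a deliberate simplification of the alignment used in the real system):

```python
def merge_fragments(a, b):
    """Stitch two caption fragments together at their longest
    suffix/prefix word overlap; concatenate if none is found."""
    wa, wb = a.split(), b.split()
    for k in range(min(len(wa), len(wb)), 0, -1):
        if wa[-k:] == wb[:k]:        # suffix of a matches prefix of b
            return " ".join(wa + wb[k:])
    return " ".join(wa + wb)         # no overlap: just concatenate

# Two workers caption overlapping stretches of the same audio.
part1 = "using recent techniques it is now"
part2 = "it is now possible to harness crowd workers"
print(merge_fragments(part1, part2))
# → using recent techniques it is now possible to harness crowd workers
```

The real system must also tolerate typos and disagreements between workers, which makes the alignment step considerably harder than this exact-match version.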
    20. Legion [Lasecki et al. UIST ’11]
    21. Crowdsourcing wrap-up: There is a lot of power available in the crowd… How can we harness it to help engineer new or improved interfaces?
    22. CoScripter: a Wikipedia-style approach to crowdsourcing task traces
    23. CoScripter: Capture & Reuse Web Tasks
        • Employees in large enterprises need to share “how-to” knowledge
        • This knowledge is typically kept by a few knowledge hubs
        • The CoScripter approach: capture web tasks by watching people do them; automate repetitive tasks to save time; use a natural-language scripting language for understandability
        [Leshed et al., CoScripter: Automating & Sharing How-To Knowledge in the Enterprise, CHI 2008]
    25. CoScripter Key Features
        • Browser extension for recording and playback
        • Wiki for storing, sharing, and collaboratively improving scripts
        • Personal database allows use of sensitive information within scripts without sharing that information
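CoScripter's scripts read like plain instructions ("enter X into the Y textbox", "click the Z button"). To make the idea concrete, here is a toy interpreter over a fake page; the two-command grammar and dictionary page model are illustrative simplifications, not CoScripter's actual parser:

```python
import re

# Toy page state: field names -> entered values, plus a log of clicks.
page = {"fields": {}, "clicks": []}

def run_step(step, page):
    """Interpret one natural-language-style command against a fake page.
    The grammar below is a simplification of CoScripter's sloppy parser."""
    m = re.match(r'enter "(.+)" into the "(.+)" textbox', step)
    if m:
        page["fields"][m.group(2)] = m.group(1)
        return
    m = re.match(r'click the "(.+)" button', step)
    if m:
        page["clicks"].append(m.group(1))
        return
    raise ValueError(f"cannot interpret step: {step}")

script = [
    'enter "144" into the "flight number" textbox',
    'click the "Search" button',
]
for step in script:
    run_step(step, page)

print(page["fields"])  # → {'flight number': '144'}
print(page["clicks"])  # → ['Search']
```

Because scripts stay human-readable, the same text doubles as shareable how-to documentation on the wiki.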
    26. CoScripter Adoption
        • Deployed inside IBM since Oct 2006: ~4,200 users, ~3,500 scripts
        • Deployed on the public internet 2007-2012: ~13,300 users, ~16,000 scripts
        • Interviews and analysis show it addresses pain points: IBMers use it to automate repetitive tasks and share process knowledge with each other
        [Chart: number of scripts over time]
    27. Conclusions
        • CoScripter provides an inexpensive and lightweight method to adapt existing interfaces
        • Allows a crowd of users to produce a knowledge base of important interaction traces
        • The resulting knowledge base could be used for many purposes: re-authoring user interfaces for different user groups; generating models?
    28. Highlight & CoCo: adapting existing user interfaces to different contexts
    29. Example: Flight Tracking
    30. Flight Tracking: Mobilized (Highlight, Nichols et al. IUI 2008, UIST 2008)
    31. Flight Tracking: Speechified
        User: “What is the status for my American Airlines flight?”
        System: “What is the flight number?”
        User: “144”
        System: “Flight Status – Arrived”
        (CoCo, Lau et al. UIST 2010)
    32. Highlight
    33. Goals
        • Allow end users to create their own mobile “applications” for particular tasks: no programming required; possible for any existing site; all design decisions made by users
        • Allow programmers to extend the capabilities of mobile applications
    34. Highlight Architecture (diagram: mobile user, proxy server, proxy browser, web server; user, mobile app designer (browser extension), web server)
    35. How do end users create applications? The Highlight Designer
        • Built using the Firefox web browser
        • Allows the user to demonstrate a “trace” of interaction
        • Direct manipulation tools
        • Generalization allows creation of mobile apps with complex structure
        (Nichols & Lau, IUI 2008)
    37. How does it work? The Remote Control Metaphor
    38. 144
    39. 144
    40. Remote Control Metaphor Discussion
        Benefits
        • No need to understand underlying code or describe the application with complex models
        • Working at the interactive level lets authors work with what they can “see”
        • Possible for end users, extensible by programmers
        • If easy enough, allows users to create user interfaces that reflect their own needs and abilities
        Drawbacks
        • Always running the original interface in the background
        • Constrained by the original design
        • How to communicate those constraints to the author?
    41. CoCo
    42. The CoCo research vision: explore the use of conversational user interfaces for web tasks
        Design and build intelligent agents that:
        • Interact with the web on a user’s behalf
        • Converse with the user to clarify meaning
        • Learn new knowledge over time
        • Are personalized for a user’s needs
        Goal: improve user productivity and increase access to information technology through simpler interfaces
    43. Not shown in the talk, but instructive (ACM DL access required):
    44. Alice: punch out 17 30
        CoCo: Extracted this script from your logs: “Go to, enter your password into the textbox, click Go...” Run it?
        Alice: yes
        CoCo: I don’t know what “password” to use
        Alice: punch out 17 30 using alice00 as password
        CoCo: I will run your script using params password=alice00
        CoCo: 17:30 Exit
    45. (diagram: CoScripter scripts, CRH (browser logs), and Highlight clips feed task knowledge and web automation)
    46. Two paths to determining process:
        Automatic
        • System finds an existing script in the database or infers a script from web history
        • Content is clipped based on heuristics matching the original command
        Manual (“re-authoring”)
        • User creates a script in CoScripter
        • Specifies parameters as “personal database” values
        • Specifies “clip” commands to return information
    47. Conclusions
        • Relatively simple understanding is used to facilitate substantial changes to the UI
        • CoCo leverages crowd-generated scripts; Highlight could have leveraged the crowd similarly to CoScripter
        • Models are used to facilitate re-authoring, but not using a typical approach
        • More robust underlying models could lead to more robust results
    48. Crowd-Created Models: an initial exploration of using novice crowd workers to produce abstract models
    49. Background
        • Our initial re-authoring work relied on interactions being the same on each platform (Highlight), or a priori known transformations (CoCo)
        • To make deeper changes, we knew that we would need deeper models
        • Could the crowd help us build the models we need?
        Disclaimer: This work is initial and incomplete. If you would like to continue it, please let me know!
    50. Process
        1. Build a domain model: What are the objects that are viewed and manipulated? What functions and parameters do they have?
        2. Build a “task” model linked to the domain model, primarily based on object functions
        3. Collect traces for carrying out tasks: integrated the CoScripter variant PlayByPlay [Wiltse et al. CHI 2010] into MTurk
    51. Building a Domain Model
        Task
        • Give a link to the web UI
        • Ask a question
        • Possibly use taboo words as in the ESP game
        Goal
        • Questions designed to elicit nouns or verbs that would correspond to object or function names
        • Previously collected terms are used in questions
        • Iterate between noun & verb questions
        Examples:
        Q: What can you search for? (taboo: restaurants, hairdressers)
        Q: What things can you manipulate? (taboo: restaurants, hairdressers)
        Q: What can you do with a restaurant? (taboo: make reservation, see reviews)
        Q: What can you make a reservation for? (taboo: restaurants, hairdressers)
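The alternating noun/verb elicitation above can be sketched as a loop: each collected term becomes a taboo word for later questions, and nouns seed verb questions (and vice versa). The `ask_crowd` function is a hypothetical stand-in for posting a question to workers, and the knowledge base is fabricated for the demo:

```python
def elicit_domain_terms(seed_nouns, ask_crowd, rounds=2):
    """Alternate between verb and noun questions; previously collected
    terms become taboo words, as in the ESP game. `ask_crowd` is a
    hypothetical stand-in for posting a question to crowd workers."""
    nouns, verbs = set(seed_nouns), set()
    for _ in range(rounds):
        for noun in list(nouns):
            verbs |= set(ask_crowd(f"What can you do with a {noun}?", taboo=verbs))
        for verb in list(verbs):
            nouns |= set(ask_crowd(f"What can you {verb}?", taboo=nouns))
    return nouns, verbs

# Simulated workers for a restaurant-review site (toy knowledge base).
knowledge = {
    "restaurant": ["make a reservation", "see reviews"],
    "hairdresser": ["make a reservation"],
    "make a reservation": ["restaurant"],
    "see reviews": ["restaurant", "hairdresser"],
}

def fake_crowd(question, taboo):
    key = next(k for k in knowledge if k in question)
    return [t for t in knowledge[key] if t not in taboo]

nouns, verbs = elicit_domain_terms(["restaurant"], fake_crowd)
print(sorted(nouns))  # → ['hairdresser', 'restaurant']
print(sorted(verbs))  # → ['make a reservation', 'see reviews']
```

The sketch ignores the hard part the next slide reports on: real workers answer with messy phrasing, so responses need normalization before they can feed a model.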
    52. Results
        • Crowd workers don’t know abstract terms, e.g., What is an object? What are things you can manipulate?
        • Phrasing in terms of concrete activities helps: What can you search for? What can you do with…?
        • Lack of a clear widget model on the web makes interpreting demonstrations hard: when is a user using a custom component vs. a standard one?
    53. Take-aways
        • With some cleverness, it should be possible to use novice crowd workers to construct useful models
        • It will be a significant undertaking, probably at the level of a Ph.D. thesis
        • Interns do not have time to complete a second thesis during the summer (even good ones!)
    54. Conclusions…
    55. Harnessing the crowd offers tremendous potential for engineering interactive systems…
        …to provide new functionality
        …to adapt UIs to specific use cases or platforms
        …to understand UIs
    56. Crowd Papers at EICS 2013
        • CrowdStudy: Extensible Toolkit for Crowdsourced Evaluation of Web Interfaces. Michael Nebeling, Maximilian Speicher, Moira Norrie
        • CrowdAdapt: Enabling Crowdsourced Adaptation of Web Sites for Individual Viewing Conditions and Preferences. Michael Nebeling, Maximilian Speicher, Moira Norrie
        • Echo: The Editor’s Wisdom with the Elegance of a Magazine. Joshua Hailpern, Bernardo Huberman
        • Crowdsourcing User Interface Adaptations for Minimizing the Bloat in Enterprise Applications (poster). Pierre Akiki, Arosha Bandara, Yijun Yu
    57. How can you use the crowd in your research?
    58. Thanks! For more information, contact: Jeffrey
    59. Crowd Resources
        • WWW 2011 tutorial by Panos Ipeirotis & Praveen Paritosh
        • Michael Bernstein’s CS 276 lecture
        • Crowd Research blog