
User Interfaces that Design Themselves: Talk given at Data-Driven Design Day in 2017, Helsinki. Antti Oulasvirta / Aalto University


"User Interfaces that Design Themselves": A talk given at Data-Driven Design Day in 2017, Helsinki. Antti Oulasvirta / Aalto University.

Published in: Design


  1. User Interfaces that Design Themselves. Antti Oulasvirta, with many others. Aalto University School of Electrical Engineering. Invited talk given at the Data-Driven Design Day 2017, Helsinki, Finland.
  2. About the speaker: a cognitive scientist leading the User Interfaces group at Aalto University (userinterfaces.aalto.fi). Modeling the joint performance of human-computer interaction in order to improve user interfaces, and developing new principles of design and intelligent support.
  3. Recent book
  4. AI and automation: Expectations. Saves human effort and increases quality. Scalability: can be replicated en masse. Should be controllable, appreciate user needs, and adapt sensibly.
  5. Fears
  6. “In future, data-driven design will possibly reach out to decision making and generative design. But we’re not there yet.” Lassi Liikkanen, blog post, June 2017
  7. Machine learning does not design. Data does not design. So what does?
  8. In this talk… • Data-driven design: A view • Computational approaches • Model-driven design • Generative design • Self-designing interfaces • A vision. 15+ examples of recent research. Computational design thinking.
  9. Did it not fail already? Microsoft ‘Clippy’. Grid.io (AI for web design).
  10. Data-driven design: A computationalist’s view
  11. Design as problem-solving. John Carroll; Nigel Cross; Stuart Card, Tom Moran, Allen Newell. Generate, Evaluate, Define.
  12. Usability engineering. Sets measurable objectives for design and evaluation. Acquires valid and realistic feedback from controlled studies. Low cost-efficiency. Designers easily get fixated on one solution (Greenberg & Buxton 2008).
  13. Rapid sketching. Accelerates the rate of idea generation. Depends on experience and a-ha moments. Less contact with data.
  14. Rapid prototyping. Decreases the idea-to-evaluation cost. (Axure)
  15. Online testing (A/B and MVT). Decreases the per-subject cost of evaluation. Limited to partial features and may ‘contaminate’ users. (Amazon)
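The comparison behind an A/B test on two UI variants usually comes down to a simple hypothesis test on conversion or task-success rates. Below is a minimal, illustrative two-proportion z-test in Python; the counts are invented for illustration and are not from the talk.

```python
# Minimal sketch of scoring an A/B test between two UI variants.
# The conversion counts are made-up illustration values.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided p-value
    return z, p_value

z, p = two_proportion_z(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")
```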
  16. User analytics. Reduces the time and improves the quality of evaluation and definition. Predefined views cause myopia. (Google Analytics)
  17. Data-driven design. A combination of analytics, rapid ideation, and online testing improves the cost-efficiency of a design project. Increases reliance on data, not opinions.
  18. Data-driven design. http://www.uxforthemasses.com/data-informed-design/
  19. Brain-driven design. Interpretations, definitions, and evaluations do not follow from the data. Reliance on human effort makes design costly, susceptible to biases, and dependent on creative insight and pure luck. Generate, Evaluate, Define.
  20. Model-driven design and the psychologist’s fallacy
  21. Model-driven design. Use models from the behavioral sciences to automate Evaluation. Generate, Evaluate, Define.
  22. An idea invented multiple times… August Dvorak, Herbert Simon, Stuart Card.
  23. Project Ernestine
  24. Model-based evaluation. Predicts task performance with no empirical data. Tedious to adapt. Humans still generate the designs. (GLEAN/EPIC)
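To make the idea concrete, here is a toy keystroke-level-model (KLM) style estimate of task time. It belongs to the same family of model-based evaluation as the tools named on the slide, but GLEAN and EPIC are far richer; only the standard KLM operator times are assumed here.

```python
# Toy KLM-style prediction of task time from a sequence of operators,
# using the standard Card/Moran/Newell operator estimates (seconds).
KLM_SECONDS = {
    "K": 0.28,  # press a key (average skilled typist)
    "P": 1.10,  # point at a target with the mouse
    "B": 0.10,  # press or release a mouse button
    "H": 0.40,  # move a hand between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predict_task_time(operators):
    """Sum operator times for a sequence such as 'M P B B' (point and click)."""
    return sum(KLM_SECONDS[op] for op in operators.split())

# Example: select a menu item, then type a three-letter code.
print(predict_task_time("M P B B H M K K K"))
```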
  25. Visual importance prediction. Deep learning predicts the visual importance of elements on a page. Bylinskii et al., UIST 2017.
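As a shape-level illustration only: importance prediction is typically a fully convolutional network that maps a screenshot to a per-pixel importance map. The untrained toy below shows the input/output structure; the Bylinskii et al. model is much larger and trained on annotated graphic designs.

```python
# Minimal, untrained sketch of a fully convolutional importance predictor.
import torch
import torch.nn as nn

class TinyImportanceNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, kernel_size=1),   # one importance score per pixel
            nn.Sigmoid(),                      # squash to [0, 1]
        )

    def forward(self, x):          # x: (batch, 3, H, W) screenshot
        return self.body(x)        # (batch, 1, H, W) importance map

model = TinyImportanceNet()
screenshot = torch.rand(1, 3, 256, 256)        # stand-in for a real UI screenshot
print(model(screenshot).shape)                 # torch.Size([1, 1, 256, 256])
```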
  26. Bounded agent simulators. Predict the emergence of interactive behavior as a response to design (e.g., via reinforcement learning). Jokinen et al., CHI 2017.
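A minimal sketch of the simulation idea: a tabular learner discovers where a target item sits in a menu, so its search behavior emerges from the design it is exposed to. This is a toy bandit-style learner with an invented menu, not the Jokinen et al. model.

```python
# Toy bounded-agent simulation: behavior adapts to a particular menu layout.
import random

MENU = ["File", "Edit", "View", "Help"]   # hypothetical design under evaluation
TARGET = "View"
ACTIONS = range(len(MENU))                # action i = fixate menu slot i

q = [0.0 for _ in ACTIONS]                # learned value of fixating each slot
alpha, epsilon = 0.1, 0.2                 # learning rate, exploration rate

for episode in range(2000):
    if random.random() < epsilon:
        a = random.choice(list(ACTIONS))              # explore
    else:
        a = max(ACTIONS, key=lambda i: q[i])          # exploit
    reward = 1.0 if MENU[a] == TARGET else -0.1       # cost of a wasted fixation
    q[a] += alpha * (reward - q[a])                   # one-step value update

print({MENU[i]: round(q[i], 2) for i in ACTIONS})
# After learning, the simulated user fixates the target slot directly,
# i.e., its behavior has adapted to this particular layout.
```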
  27. Bounded agent simulators: Result. Predict the emergence of interactive behavior as a response to design (e.g., via reinforcement learning). Jokinen et al., CHI 2017.
  28. Models integrated in prototyping tools. Integrating models into design tools lowers the cost of modeling. (CogTool)
  29. Multi-model evaluation (VIDEO). Aalto Computational Metrics Service (in progress).
  30. Assessment: Model-informed design. Reduces evaluation costs and improves designs. Increases understanding of the design. Models don’t design! Modeling is tedious and requires special expertise.
  31. Computational design
  32. Computational design = applying computational thinking - abstraction, automation, and analysis - to explain and enhance interaction. It is underpinned by modelling, which admits formal reasoning and involves: 1. a way of updating a model with data; 2. synthesis or adaptation of a design. Oulasvirta, Kristensson, Howes, Bi (Oxford University Press, 2018).
  33. Many algorithmic approaches: rule-based systems, state machines, logic/theorem proving, combinatorial optimization, continuous optimization, agent-based modeling, reinforcement learning, Bayesian inference, neural networks.
  34. The computational design problem. Find the design (d) out of a candidate set (D) that maximizes goodness (g) for given conditions (θ). Computational design rests on 1. search (of d), 2. inference (of θ), and 3. prediction (of g); see the formulation below.
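Written out explicitly, the objective stated on the slide is a single maximization (notation follows the slide):

```latex
% The computational design problem: search for the design d in the candidate
% set D that maximizes predicted goodness g under conditions \theta.
% Inference supplies \theta, prediction supplies g, search finds the arg max.
\[
  d^{*} = \arg\max_{d \in D} \; g(d \mid \theta)
\]
```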
  35. Implications. Design is hard because the math is hard: regular design problems have exponential design spaces and many contradictory objectives. Only computational approaches offer 1) a high probability of finding the best design and 2) a design rationale.
  36. UI description languages. Port a design from one terminal to another. Heavy; overly case-specific. Eisenstein et al., IUI 2001.
  37. Probabilistic design generation. A statistical model of features is formed and used to generate designs conforming to the domain. Poor transferability. [Figure: generated and original versions of a research poster, “Image Parsing via Stochastic Scene Grammar” (Zhao & Zhu, UCLA).] Qiang et al., arXiv 2017.
  38. Heuristic design exploration. Alternative layouts can be proposed to the designer. Lots of low-quality suggestions. (DesignScape)
  39. Model-based UI optimization. Models allow the computer to predict ‘good design’. Costly to set up optimization systems. MenuOptimizer, Bailly et al., UIST ’13.
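The core mechanism can be sketched in a few lines: a predictive model scores candidate layouts and a search routine proposes them. Below is a greedy local search over menu orderings with a deliberately simple cost model (frequent commands should sit near the top); the command frequencies and constants are invented, and MenuOptimizer itself uses much richer models and search.

```python
# Minimal sketch of model-based menu optimization: local search over orderings,
# scored by a frequency-weighted, position-linear selection-time model.
import random

COMMANDS = {"Open": 0.40, "Save": 0.30, "Print": 0.15, "Export": 0.10, "About": 0.05}

def predicted_cost(order):
    """Expected selection time: frequency-weighted, linear in menu position."""
    return sum(COMMANDS[c] * (0.2 + 0.1 * i) for i, c in enumerate(order))

def optimize(iterations=5000):
    best = list(COMMANDS)
    random.shuffle(best)
    for _ in range(iterations):
        cand = best[:]
        i, j = random.sample(range(len(cand)), 2)   # propose a swap of two items
        cand[i], cand[j] = cand[j], cand[i]
        if predicted_cost(cand) < predicted_cost(best):
            best = cand                             # accept if the model approves
    return best

best = optimize()
print(best, round(predicted_cost(best), 3))
```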
  40. Interactive optimization. Algorithms can nudge designers toward better designs and more diverse thinking.
  41. Assessment: Algorithms that design. Can generate good designs and show viable alternatives. Interactive steering of the optimizer. Only for initializing a design, not for updates/redesigns. Limited scope (yet). Expensive to set up optimization systems. No contact with user data.
  42. Self-designing interfaces
  43. Self-designing interfaces. Interfaces that initialize and update themselves as more data comes in. Rests on three pillars: 1. Inference (from data to models), 2. Search (from models to designs), 3. Prediction (from designs to people). Generate, Evaluate, Define. A schematic sketch of the loop follows below.
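A schematic sketch of that loop, with all three pillars visible: infer parameters from logs, let the model predict goodness, search for the best design, deploy, and repeat as data accumulates. Every function and value here is an invented stand-in, not a real system.

```python
# Schematic self-designing loop: inference -> prediction -> search -> deploy.
import random

DESIGN_SPACE = ["layout-A", "layout-B", "layout-C"]      # hypothetical designs

def collect_logs(design):
    """Stand-in for logging: noisy 'task time' observations for a design."""
    base = {"layout-A": 2.0, "layout-B": 1.5, "layout-C": 1.8}[design]
    return [base + random.gauss(0, 0.2) for _ in range(50)]

def infer_parameters(logs):
    """Inference (data -> model): here simply the mean observed task time."""
    return sum(logs) / len(logs)

def predicted_goodness(design, theta_by_design):
    """Prediction (design -> people): lower predicted task time is better."""
    return -theta_by_design.get(design, float("inf"))

deployed = random.choice(DESIGN_SPACE)
theta_by_design = {}
for round_ in range(10):                                  # update as data comes in
    theta_by_design[deployed] = infer_parameters(collect_logs(deployed))
    if random.random() < 0.3:                             # explore occasionally
        deployed = random.choice(DESIGN_SPACE)
    else:                                                 # exploit the model (search)
        deployed = max(DESIGN_SPACE,
                       key=lambda d: predicted_goodness(d, theta_by_design))

print("converged on:", deployed, theta_by_design)
```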
  44. Recommendation engines. Recommend content based on inferred interest or preference. Only changes an isolated slot in the UI.
  45. ‘Coadaptive user interfaces’: interfaces that can fully reconfigure themselves, taking into account how users will react to changes. Requirements: 1. Models able to predict the effects of the system’s actions, i.e., what happens when the design changes. 2. Algorithms that operate at run-time. 3. Fast inference of model parameters from empirical data.
  46. Ability-based optimization. When model parameters can be obtained for a user group, an optimizer can differentiate designs. How to obtain the parameters? Jokinen et al., IEEE Pervasive Computing.
  47. “Inverse modeling”. Model parameters (θ) can be inferred from log data alone, using Bayesian optimization. Costly to run. [Figure: observed behavioral data, such as times and targets of menu selections, at the bottom; simulated interaction data from a cognitive model at the top; the model’s parameters are inferred by matching the two.] Kangasrääsiö et al., Proc. CHI 2017.
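A crude sketch of the inverse-modeling idea: propose parameter values, simulate behavior with them, and keep the values whose simulated data resembles the observed logs. Kangasrääsiö et al. do this far more efficiently with likelihood-free inference driven by Bayesian optimization; the simulator, prior range, and observed summary below are invented for illustration.

```python
# Rejection-ABC sketch: infer a cognitive-model parameter from a log summary.
import random

def simulate_selection_times(fixation_duration_ms, n=200):
    """Toy simulator: menu selection time grows with fixation duration."""
    return [3 * fixation_duration_ms + random.gauss(0, 50) for _ in range(n)]

OBSERVED_MEAN = 750.0          # stand-in summary statistic from real logs (ms)

accepted = []
for _ in range(5000):
    theta = random.uniform(100, 400)                   # prior over the parameter
    sim = simulate_selection_times(theta)
    if abs(sum(sim) / len(sim) - OBSERVED_MEAN) < 20:  # accept if summaries match
        accepted.append(theta)

if accepted:
    print("posterior mean fixation duration ~", round(sum(accepted) / len(accepted)))
```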
  48. Co-adaptation. Model-based optimizers learn parameters from data and use models to predict the effects of changes to the UI.
  49. Browser-side ‘familiarization’: 1. Most-Encountered, 2. Serial Position Curve, 3. Visual Statistical Learning, 4. Generative Model of Positional Learning.
  50. Server-side assortment UI optimization. Objectives: visual search time, exploratory behavior. Design space: swaps, replaces. Contact with data: attention/click data.
  51. Demo
  52. Assessment: Self-designing UIs. Initialization AND updating. Graceful (conservative) evolution with data. Still opaque systems: designers have little control. (No evidence yet that this works.)
  53. Conclusion
  54. Why computational design? 1. Increase the efficiency, enjoyability, and robustness of interaction. 2. Proofs and guarantees for designs. 3. Boost user-centered design. 4. Reduce the design time of interfaces. 5. Free up designers to be creative. 6. Harness new technologies more quickly.
  55. Computational Design Stool: Search, Inference, Prediction.
  56. Predictions that I like: 1. Resources emerge that lower the entry barrier. 2. Organizations that use computational design win the competition. 3. Collaborative AI helps novice designers to design better. 4. Users start demanding algorithmic UIs. 5. Some commonly used UIs become coadaptive. 6. Interaction design as a field embraces computation.
  57. Hire top students from our classes. SC5 workshop on ‘Designing with Models and Algorithms’. Check out our book, Computational Interaction (Oxford University Press, 2018). Email me for pointers: antti.oulasvirta@aalto.fi. Send research topics to our course. Join our CHI course or summer school.
