This document summarizes key points from talks at The UX Conference in September 2018. It discusses the challenges of designing conversational AI interfaces, including unpredictability arising from probabilistic behaviour, users attributing human characteristics to the system, and bias in training data. It also covers best practices for UX research and design, such as distinguishing user needs from wants, stakeholder engagement, multidisciplinary teams, and embedding research in the design process. Finally, it presents rapid design-iteration methods from the BBC and DICE, including frequent user testing of live products and ongoing testing of games from paper prototypes through post-release.
2. The UX Conference Sept 2018
“Bringing Designers and UX Researchers Together”
● http://theuxconf.com/
● Run by The School of UX Design (https://schoolofux.com/)
● Some videos from this and previous conferences available at:
https://www.youtube.com/channel/UCBJ13f36Crnj5rWEyXCCp4A
3. The Talks
● AI & Ethics: Expanding the Role of UX Research (Microsoft)
● Presenting Research Findings (Dropbox)
● The Process is the Work (Facebook)
● Eye Tracking (tobii)
● One Researcher, Eleven Designers (BBC)
● Co-Design Using Games in Healthcare (NHS)
● Going Beyond Empathy (McLaren)
● Empathic Design for Conversational Interfaces (Google)
● Designing for Global Travellers (hotels.com)
● UX Research in Gaming (DICE)
4. AI Interfaces (1)
Focus on “conversational” interfaces:
● Google Home
● Alexa
● Siri
● ...
Different from conventional UX:
● Probabilistic rather than deterministic - interaction is unpredictable
● They “talk” - users attribute human-like characteristics to them
○ intelligence, emotion, hidden agenda ...
5. AI Interfaces (2)
Unpredictability as a result of probabilistic behaviour. E.g.:
● Train the system on Twitter data
● Single question → multiple candidate answers
● System chooses the answer with the highest probability of being “correct”
Challenges:
● Bias
● Unhelpful
● Blandness
● Susceptible to small data changes
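The selection step described above can be sketched in a few lines of Python. This is a minimal illustration, not any vendor’s actual pipeline; the candidate answers and their probabilities are invented for the example.

```python
def choose_answer(candidates):
    """Return the candidate answer with the highest assigned probability.

    `candidates` is a list of (answer_text, probability) pairs, as a
    trained model might produce for a single user question.
    """
    return max(candidates, key=lambda c: c[1])[0]


# Hypothetical output of a model asked "Will it rain today?"
candidates = [
    ("It might rain later.", 0.61),
    ("Rain is certain.", 0.27),
    ("No idea, sorry.", 0.12),
]

print(choose_answer(candidates))  # -> It might rain later.
```

The challenges listed above follow directly from this design: the “best” answer is only the most probable one in the training data, so biased or bland data yields biased or bland answers, and small data changes can reorder the probabilities and flip the response.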
6. AI Interfaces (3)
Attribution of human characteristics
● I can converse with it, so it must be intelligent / understand me
● Users assume capabilities which don’t exist
● Trust in the system is destroyed when it fails
● Need to manage this, e.g. Google Home and Alexa publish “questions you can ask”
7. AI Interfaces (4)
Other challenges:
● Privacy
● Unconventional interaction paradigms
● Unexpected or misunderstood commands
8. AI Interfaces (5)
Empathy in design:
● Understand how the user will experience the product
● Design to meet real user needs
● Handle what the user tries to do
And …
● Train with unbiased data
● Understand individual user context
● Throw in the occasional curveball
● Guide usage towards what it can do
10. UX in Practice (1)
Want vs Need
● Users will say they “want” something
● UX research needs to determine what they “need” (their goals / motives)
● E.g. what are Netflix’s UX requirements?
● For digital products, research needs to be balanced (beware the Ivory Tower)
11. UX in Practice (2)
Stakeholder engagement
● Everyone should understand knowledge behind key decisions
Methods:
● Workshops
● Pop-up exhibits
● Collateral
● Brown bags
● Stakeholders do research
● Multi-disciplinary teams
12. UX in Practice (3)
Best practices for UX component in project teams:
● Multi-disciplinary teams
● Share knowledge throughout team
● Avoid “them and us”
Team composition:
● A design/UX-to-developer ratio of 1:10 seems common
● But depends heavily on the products being built
13. UX in Practice (4)
One-liners:
● Where there is design convention, follow it
○ People follow their expectations (fruit juice packaging, menus, etc.)
● “You Are Not the User” (Jakob Nielsen)
● There’s no point doing UX research to validate a product which won’t be changed
● Embed research into the design process
● Without UXR we’d release a great game no one understands how to play
15. Rapid Iteration (1)
Methods to rapidly iterate a product design.
Two approaches showcased:
● BBC (News and Weather)
● DICE (Battlefield V)
Highly resourced organisations:
● Large design / development teams
● Large user base for testing
16. Rapid Iteration (2)
BBC ran rapid research trials over a six-month period.
Twelve UX research sessions:
● Rehearsed session with specified research objective
● 5-6 participants (recruited through agency)
● ~1 hour sessions (too long)
● < 2-day turnaround of analysis and results
● Repeat fortnightly (too frequent)
● Feed outcome to designers for immediate inclusion in product
Allowed rapid fine-tuning of live News and Weather products.
17. Rapid Iteration (3)
DICE - UX research applied to Battlefield V (still ongoing)
Levels of UX research:
● Small groups: test general UX
● Focused groups: test game concepts
● Community servers: test game design / balance
Testing occurs at all stages:
● Paper and prototype testing
● Does not end at release - e.g. Battlefront II ;)