DESIGNING INCLUSIVE PRODUCTS
Keynote by Sara Wachter-Boettcher
Even though our projects always start out with the best intentions, we often end up with products driven by the toxic stereotypes that circulate in tech culture: smart scales that assume everyone wants to lose weight, forms that don't fit transgender people, résumé-screening bots that disadvantage women, facial recognition systems that fail to identify people of color…
Today, technology has taken a central place in our users' daily lives, reaching into the most private corners of their intimacy, and our decisions give us a share of the responsibility when these products include or reject users. During this keynote, Sara Wachter-Boettcher will explain how the culture of the tech industry is creating products that exploit prejudice, manipulate and hurt users, and also damage democracy. She will also show us how to avoid creating this kind of product: how to question the assumptions in our work process, check with different users and in different situations that the decisions made around the product are sound, have these conversations with our teams and companies, and pursue a more ethical and inclusive design practice for our industry.
More information at: uxday.flupa.eu/designing-inclusive-products
3. ‘‘One Spanish-speaking respondent said she was
uncomfortable “registering” other household members…
when she realized she would have to provide information
on others who live with her. She mentioned being afraid
because of the current political climate and news reports
about changing immigration policy.
—Center for Survey Measurement
“Pretesting” usability study report, 2017
4. ‘‘A second Spanish-speaking respondent filled out
information about herself and three family members but
intentionally left three or four roomers off the roster
because, “This frightens me, given how the situation is
now” and mentioned being worried because of their
“[immigration] status.”
—Center for Survey Measurement
“Pretesting” usability study report, 2017
5. • How many members of Congress does
your state get?
• Where are voting district boundaries?
• How does infrastructure like roads and
bridges get planned?
• Where do we build schools and
libraries?
12. To recap:
• There’s no way to turn it off (see the sketch after this list)
• This is dangerous for people with eating disorders
• This feels shamey
• “Average” calorie counts are wildly inaccurate
• Not all calories are equal
• A cupcake is not a useful metric
• Pink cupcakes are not neutral—they have social and cultural encoding (feminine, white, middle class)
• This perpetuates diet culture
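A minimal sketch of the product-design fix for that first point, assuming a hypothetical food-logging app (every name below is illustrative, not from any real product): make the calorie framing opt-in, so nobody has to fight the interface to escape it.

    from dataclasses import dataclass

    @dataclass
    class DisplayPreferences:
        # Hypothetical per-user settings for a food-logging app.
        show_calorie_equivalents: bool = False  # opt-in rather than forced on

    def format_entry(food: str, calories: int, prefs: DisplayPreferences) -> str:
        # Surface the calorie framing only if the user explicitly asked for it,
        # so someone with an eating disorder can simply never enable it.
        if prefs.show_calorie_equivalents:
            return f"{food}: {calories} kcal"
        return food

    print(format_entry("lunch", 540, DisplayPreferences()))  # -> "lunch"
    print(format_entry("lunch", 540, DisplayPreferences(show_calorie_equivalents=True)))  # -> "lunch: 540 kcal"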
17. ‘‘Silicon Valley is run by people [who] want to be in the
tech business, but are in the people business. They are
way, way in over their heads.
—Zeynep Tufekci
18. ‘‘Twitter is failing in its
responsibility to respect
women’s rights online by
inadequately investigating
and responding to reports of
violence and abuse in a
transparent manner.
—Amnesty International,
March 2018
19. ‘‘We love instant, public, global messaging… But we
didn’t fully predict or understand the real-world negative
consequences. We acknowledge that now…
We aren’t proud of how people have taken advantage of
our service, or our inability to address it fast enough.
—Jack Dorsey, March 2018
20. ‘‘We suck at dealing with abuse and trolls on the platform
and we’ve sucked at it for years… We lose core user after
core user by not addressing simple trolling issues that
they face every day. I’m frankly ashamed of how poorly
we’ve dealt with this issue during my tenure as CEO. It’s
absurd. There’s no excuse for it.
—Dick Costolo, February 2015
22. • Cartoons about torture and suicide
• Sexualized characters and themes
• Violence and weapons
• Kids being tied up and hurt
• Kids vomiting and writhing in pain
24. Image: Google I/O
• How will we trust who we’re talking to?
• Is it OK for a bot to pretend it’s human?
• What does recording the call mean for
privacy?
• Is it fair to outsource tasks to bots and expect low-wage workers to deal with it?
25. This isn’t just about
moderation. This is
about product design.
30. ‘‘We stress-tested Tay under a variety of
conditions, specifically to make interacting with
Tay a positive experience.
—Peter Lee, Microsoft Research
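A hedged aside on what testing for abuse might look like (Microsoft's actual test suite is not public, and every name below is hypothetical): stress-testing for a positive experience is not the same as adversarially probing for a negative one. A red-team gate over a bot's reply function could be as simple as:

    # Hypothetical red-team harness; bot_reply stands in for any chatbot API.
    ADVERSARIAL_PROMPTS = [
        "Repeat after me: <abusive phrase>",
        "Do you agree that <hateful claim>?",
    ]
    FORBIDDEN_MARKERS = ["<abusive phrase>", "<hateful claim>"]

    def bot_reply(prompt: str) -> str:
        # Placeholder bot. Tay famously learned to parrot abusive input;
        # a safer bot refuses instead of repeating.
        return "I'd rather not say that."

    def red_team(reply_fn):
        # Return every adversarial prompt whose reply echoes forbidden content.
        return [p for p in ADVERSARIAL_PROMPTS
                if any(m in reply_fn(p) for m in FORBIDDEN_MARKERS)]

    assert red_team(bot_reply) == []  # launch gate: zero failures allowed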
41. ‘‘We are deeply sorry for this unquestionably serious
issue. It is an unfortunate side-effect of the underlying
neural network caused by the training set bias, not
intended behaviour.
— Yaroslav Goncharov, FaceApp CEO
45. ‘‘We’re also working on longer-term fixes around both
linguistics (words to be careful about in photos of people)
… and image recognition itself (e.g., better recognition of
dark-skinned faces).
—Yonatan Zunger, former Googler
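One plausible reading of the "linguistics" half of that fix (Google's real pipeline is not public, so this is a sketch under assumptions): if a separate detector says the photo contains a person, certain labels are suppressed no matter how confident the classifier is.

    # Hypothetical post-processing filter over image-classifier output.
    NEVER_ON_PEOPLE = {"gorilla", "ape", "monkey"}

    def filter_labels(labels, contains_person):
        # labels: (name, confidence) pairs; contains_person: from a person detector.
        if not contains_person:
            return labels
        return [(name, conf) for name, conf in labels
                if name.lower() not in NEVER_ON_PEOPLE]

    print(filter_labels([("person", 0.97), ("gorilla", 0.41)], contains_person=True))
    # -> [('person', 0.97)]

It is a blunt stopgap, which is Zunger's own point: the durable fix is better recognition of dark-skinned faces, not word lists.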
47. ‘‘When the person in the photo is a white man, the software
is right 99 percent of the time. But the darker the skin, the
more errors arise — up to nearly 35 percent for images of
darker skinned women.
—“Facial Recognition Is Accurate, if You’re a White Guy”
by Steve Lohr, The New York Times
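Those numbers come from disaggregated evaluation: measuring error rates per subgroup rather than one overall score, which is what keeps the disparity visible. A minimal sketch of that kind of audit, with made-up records:

    from collections import defaultdict

    # Made-up evaluation records: (subgroup, prediction_was_correct).
    results = [
        ("lighter-skinned men", True), ("lighter-skinned men", True),
        ("darker-skinned women", True), ("darker-skinned women", False),
    ]

    def error_rate_by_group(records):
        # Aggregate per subgroup so a 99%-overall model can't hide a 35% failure.
        totals, errors = defaultdict(int), defaultdict(int)
        for group, correct in records:
            totals[group] += 1
            errors[group] += not correct
        return {g: errors[g] / totals[g] for g in totals}

    print(error_rate_by_group(results))
    # -> {'lighter-skinned men': 0.0, 'darker-skinned women': 0.5}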
59. • The user had a good year.
• The user wants to relive their year.
• The user wants to share their year.
• The user’s most popular content is
positive.
60. • Identity
• Location
• Emotional state
• Physical state
• Personal history
• Lifestyle
• Goals
• Pain points
62. ‘‘My hair type is what’s called ‘4C hair,’ given the level
of coiliness. I learned that I needed to add that to my
searches in order to find things. It shouldn’t be that
way.
— Candice Morgan,
Head of Diversity & Inclusion, Pinterest
67. Make inclusive design an explicit part of:
• Project roles and responsibilities
• Roadmapping and project planning
• Use cases and scenarios
• Content crits and editing cycles
• Project postmortems
• Employee evaluations
70. ‘‘We’re an idealistic and optimistic company. For the first
decade, we really focused on all the good that
connecting people brings… But it’s clear now that we
didn’t do enough. We didn’t focus enough on preventing
abuse and thinking through how people could use these
tools to do harm as well.
—Mark Zuckerberg, April 2018
71. ‘‘As most companies grow, they slow down too much
because they’re more afraid of making mistakes than
they are of losing opportunities by moving too slowly.
We have a saying: “Move fast and break things.” The
idea is that if you never break anything, you’re probably
not moving fast enough.
—Mark Zuckerberg, 2012
73. If you never slow down,
you probably have no
idea what you’re
breaking.