Diversity-friendly software
Shireen Mitchell - @digitalsista
Jon Pincus - @jdp23
SXSW 2017
References: http://bit.ly/div5y-links
Agenda
Anti-patterns
Best practices
Emerging techniques
Challenges
A path forward
Three aspects of diversity in software
The people and teams that create software
The processes we use to create software
Supporting diversity in the software itself
Software today works best for people like its creators
Software reinforces power dynamics
People and communities create software; software in turn empowers people and communities.
Agenda
Anti-patterns
Best practices
Emerging techniques
Challenges
A path forward
Best practices
Diverse representation, inclusive culture, equitable policies
Setting intention - intersectionally
Accessibility
Flexible self-identification of gender, race, and pronouns
Questions: diversity, inclusion, equity
How diverse is your team and community?
How inclusive is your culture?
How equitable is your organization?
What are you doing to increase diversity, inclusion, and equity?
How committed is the senior leadership - and the entire organization?
Who's responsible for it - and do they have the time and resources to make an impact?
Setting intention - intersectionally
Make diversity a priority
Educate your team and community
Avoid problematic language in products and code
Intention: Questions to ask yourself
Is diversity a goal and a priority?
If so, how are you communicating that internally and externally?
Do you have a Diversity Statement and a Code of Conduct?
Do people in the organization understand microaggressions?
How are you educating your team and community?
Are you encouraging a "call-in" culture?
What standards, tools, and processes do you have about problematic language?
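One way to back that last question with tooling is a lightweight scan of the codebase for terms your team has decided to move away from. The sketch below is a minimal, hypothetical TypeScript example (run with Node/ts-node); the term list and suggested replacements are illustrative, not a standard.

```typescript
// Minimal sketch of a lint-style scan for terms a team has decided to avoid.
// Run with ts-node, e.g.: ts-node scan-terms.ts ./src
import * as fs from "fs";
import * as path from "path";

// Illustrative term list with suggested replacements; adapt to your own standards.
const PROBLEMATIC_TERMS: Record<string, string> = {
  whitelist: "allowlist",
  blacklist: "denylist",
  "master/slave": "primary/replica",
  "sanity check": "confidence check",
};

function scanFile(filePath: string): void {
  const lines = fs.readFileSync(filePath, "utf8").split("\n");
  lines.forEach((line, i) => {
    for (const [term, suggestion] of Object.entries(PROBLEMATIC_TERMS)) {
      if (line.toLowerCase().includes(term)) {
        console.log(`${filePath}:${i + 1}: "${term}" found; consider "${suggestion}"`);
      }
    }
  });
}

function scanDir(dir: string): void {
  for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
    const full = path.join(dir, entry.name);
    if (entry.isDirectory() && entry.name !== "node_modules") {
      scanDir(full);
    } else if (entry.isFile() && /\.(ts|js|md)$/.test(entry.name)) {
      scanFile(full);
    }
  }
}

scanDir(process.argv[2] ?? ".");
```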
Accessibility
Design for people with disabilities from the beginning, including
Supporting screen readers and other assistive technologies
Keyboard-only navigation
Color vision impairment
Hard of hearing or deaf
Seizure disorders and cognitive disabilities
Accessibility: Questions to ask yourself
Are you designing for accessibility from the beginning?
Do you have accessibility expertise throughout the team?
Do you include accessibility in your testing?
Does everybody on the team use the software in accessibility modes?
What automated accessibility tools do you use?
What are you doing to increase awareness and understanding?
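As a complement to manual testing, some accessibility rules can be checked automatically. The function below is a minimal TypeScript sketch of that idea using standard DOM APIs; real projects typically rely on a full engine such as axe-core, and the three rules here are only illustrative.

```typescript
// Minimal sketch of an automated accessibility spot-check over a DOM.
// This only illustrates the kind of rules that dedicated tools automate.

interface A11yIssue {
  element: string;
  problem: string;
}

export function basicA11yChecks(doc: Document): A11yIssue[] {
  const issues: A11yIssue[] = [];

  // Images need alternative text for screen readers.
  doc.querySelectorAll("img:not([alt])").forEach((img) => {
    issues.push({ element: img.outerHTML.slice(0, 80), problem: "missing alt text" });
  });

  // Form controls need an accessible name (label, aria-label, or aria-labelledby).
  doc.querySelectorAll("input, select, textarea").forEach((el) => {
    const id = el.getAttribute("id");
    const hasLabel =
      (id && doc.querySelector(`label[for="${id}"]`)) ||
      el.getAttribute("aria-label") ||
      el.getAttribute("aria-labelledby");
    if (!hasLabel) {
      issues.push({ element: el.outerHTML.slice(0, 80), problem: "no accessible label" });
    }
  });

  // Click handlers on non-interactive elements break keyboard-only navigation.
  doc.querySelectorAll("div[onclick], span[onclick]").forEach((el) => {
    if (!el.hasAttribute("tabindex")) {
      issues.push({ element: el.outerHTML.slice(0, 80), problem: "clickable but not keyboard-focusable" });
    }
  });

  return issues;
}
```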
Flexible, optional self-identification
Allow flexibility (not just a fixed list)
Allow multiple choices; e.g., somebody who’s multiracial may be black and Latinx
Let people decline to answer
Avoid the term “other”
Let people choose the pronouns they prefer
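A data model that follows this guidance might look like the TypeScript sketch below. Field names and example values are hypothetical; the point is multi-select plus free text, everything optional, and pronouns as free-form input rather than a fixed list.

```typescript
// Minimal sketch of a profile model that follows the guidance above:
// multi-select plus free text, always optional, with self-chosen pronouns.
// Field and option names are illustrative, not a recommended taxonomy.

interface SelfIdentification {
  // Multiple selections allowed (e.g., someone multiracial can pick several),
  // plus a free-text field so people aren't limited to a fixed list.
  raceEthnicity?: string[];
  raceEthnicityFreeText?: string;

  gender?: string[];           // free-form values, not a male/female/"other" enum
  pronouns?: string;           // e.g., "she/her", "they/them", or anything else

  // Explicit "prefer not to say" is distinct from simply leaving fields blank.
  declinedToAnswer?: boolean;
}

// Example: every field is optional, and nothing forces a choice from a fixed list.
const profile: SelfIdentification = {
  raceEthnicity: ["Black", "Latinx"],
  gender: ["genderfluid"],
  pronouns: "they/them",
};
```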
Self-identification: Questions to ask yourself
Do people have a flexible way to specify their race, gender, orientation, ...?
Does it support multiracial, asexual, gender-fluid, and other frequently marginalized people?
Is it optional?
Can people choose their preferred pronouns?
Have you listened to feedback from a diverse group of people?
Agenda
Anti-patterns
Best practices
Emerging techniques
Challenges
A path forward
Emerging Techniques
Gender HCI
Threat modeling for harassment
Avoiding algorithmic bias
Gender HCI
Gender differences in human-computer interaction
Gender HCI: five key facets
Self-efficacy: how confident are people in their abilities?
Information-processing style: start by gathering fairly complete information, or try something promising and backtrack if necessary?
Risk aversion: how comfortable are people with risk?
Tinkering: how much do people playfully experiment with the software?
Motivation: interest in technology for its own sake, or in aid of accomplishing something?
GenderMag
A gender-specialized cognitive walkthrough and a set of four personas, for finding gender HCI problems in software
http://gendermag.org/
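One way to make the five facets concrete in your own process is to capture them as a persona record that reviewers walk the UI with. This is a loose TypeScript sketch of that idea; the facet encodings and the persona itself are illustrative placeholders, not the validated GenderMag personas from gendermag.org.

```typescript
// Sketch of how the five facets can be captured as a persona record
// for use in a cognitive walkthrough. Values here are placeholders.

interface FacetProfile {
  selfEfficacy: "low" | "medium" | "high";
  infoProcessing: "comprehensive" | "selective";   // gather broadly vs. try-and-backtrack
  riskAversion: "low" | "high";
  tinkering: "rare" | "frequent";
  motivation: "task-focused" | "tech-for-its-own-sake";
}

interface Persona {
  name: string;
  facets: FacetProfile;
}

// Hypothetical walkthrough persona; during a GenderMag-style review, each UI
// step is evaluated as "will this persona know what to do here, and why?"
const walkthroughPersona: Persona = {
  name: "Example persona A",
  facets: {
    selfEfficacy: "low",
    infoProcessing: "comprehensive",
    riskAversion: "high",
    tinkering: "rare",
    motivation: "task-focused",
  },
};
```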
Threat modeling for harassment
Threat modeling is a standard security technique
An adversary attacks the system
Defenses prevent attacks
Mitigations reduce the effect
Harassers = adversaries
Work with experts (aka targets) as well as studying patterns of harassment techniques
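Treating harassers as adversaries can be made routine by recording threats in the same structure a security threat model uses: adversary, attack, defenses, mitigations. The TypeScript sketch below shows one possible shape; the example entries are illustrative, not a complete catalog.

```typescript
// Sketch of a harassment-focused threat-model entry, treating harassers as
// adversaries the way a security threat model treats attackers.

interface HarassmentThreat {
  adversary: string;        // who attacks (e.g., coordinated dogpile, single stalker)
  attack: string;           // how the feature can be weaponized
  defenses: string[];       // prevent the attack outright
  mitigations: string[];    // reduce the harm when prevention fails
}

const threatModel: HarassmentThreat[] = [
  {
    adversary: "coordinated group of harassers",
    attack: "mass-reporting a target's account to trigger automated suspension",
    defenses: ["rate-limit reports per reporter", "require human review before suspension"],
    mitigations: ["fast appeal path", "restore content and notify the target"],
  },
  {
    adversary: "individual stalker",
    attack: "using public location or 'real name' fields to find a target offline",
    defenses: ["make identity and location fields optional and private by default"],
    mitigations: ["let targets hide or change identifying details quickly"],
  },
];
```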
Algorithmic biases
Analyze algorithms for “fairness”
What does “fairness” mean in your context?
Work with social scientists as well as techies
Be wary of biases in training sets and historical data
Biases can be race, gender, class, cultural, urban/rural, age, ...
Make sure algorithms are transparent enough that you can analyze them for fairness
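As one concrete probe, you can compare how often each group receives the favorable outcome (demographic parity). The TypeScript sketch below shows that single check with made-up data; which fairness definition actually applies depends on your context, so a gap here is a flag for deeper review, not a verdict.

```typescript
// Sketch of one narrow fairness check: comparing positive-outcome rates across
// groups (demographic parity). Data and group labels are made up.

interface Decision {
  group: string;      // self-identified or audited group label
  positive: boolean;  // did the algorithm grant the favorable outcome?
}

function positiveRateByGroup(decisions: Decision[]): Map<string, number> {
  const totals = new Map<string, { positive: number; total: number }>();
  for (const d of decisions) {
    const t = totals.get(d.group) ?? { positive: 0, total: 0 };
    t.total += 1;
    if (d.positive) t.positive += 1;
    totals.set(d.group, t);
  }
  const rates = new Map<string, number>();
  for (const [group, t] of totals) {
    rates.set(group, t.positive / t.total);
  }
  return rates;
}

// Example: a large gap between groups is a flag for deeper review.
const rates = positiveRateByGroup([
  { group: "A", positive: true },
  { group: "A", positive: true },
  { group: "B", positive: false },
  { group: "B", positive: true },
]);
console.log(rates); // Map { "A" => 1, "B" => 0.5 }
```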
Agenda
Anti-patterns
Best practices
Emerging techniques
Challenges
A path forward
Challenges - and responses
Diversity is typically seen as low priority and divisive
Choose to prioritize - in an inclusive way
See diversity as an asset to product development
Look for ways diversity can give a strategic advantage
Diversity failures can have huge financial, PR, and strategic consequences
Inability to pivot, expand the audience, exit, ...
Challenges - and responses
Lack of knowledge about diversity
Treat it just like you would any other key skill your organization doesn’t have enough of
Budget time and money for training, education, consultants
Challenges - and responses
Investment patterns favor cis straight white male-focused products
Look for non-traditional investments (crowdfunding, etc.)
We’re seeing more forward-looking decision-makers who get it
We need a few breakthrough successes as proof points!
Agenda
Anti-patterns
Best practices
Emerging techniques
Challenges
A path forward
Start with the highly marginalized
Women of color
Gender-diverse people
People with disabilities - including “invisible disabilities”
...
It seems easier to start designing for the usual suspects (cis straight white able-bodied techie guys)
But that leaves you with a diversity debt
Start with the highly marginalized
There are great designers and developers from marginalized communities out there
Get them on your team - as leaders
Bring them in as consultants, early users, beta testers, advisors
Prioritize their needs
The industry is primed for change
Let’s create a virtuous cycle!
Diverse, inclusive people and communities create software that embeds diversity; that software in turn empowers diverse, inclusive people and communities.
Diversity-friendly software
Shireen Mitchell - @digitalsista
Jon Pincus - @jdp23
SXSW 2017
References: http://bit.ly/div5y-links


Editor's Notes

  • #3 There’s lots of very valuable and important work going on to make participation more diverse (the first bullet). In this presentation, we’re focusing on the last two bullets.
  • #5 There’s lots of very valuable and important work going on to make participation more diverse (the first bullet). In this presentation, we’re focusing on the last two bullets.
  • #6 Given industry demographics: able-bodied, cis, straight, white and Asian guys. There are techniques for making software that embraces differences - but they are not yet widely adopted.
  • #7 Digital redlining after Trump: Real names and fake news on Facebook https://medium.com/@tressiemcphd/digital-redlining-after-trump-real-names-fake-news-on-facebook-af63bf00bf9e#.k9g7ema70 Fake news software issue: algorithms favor “fake news”; mandatory automated race and gender identification (as opposed to optional self-identification) allows affinity targeting to penalize people identified by the algorithms as “black” and “woman”. “If you are a black woman, like me, that can mean my ability to promote my new book (that was advertising), communicate with my college students, share legitimate information or sources with those who cannot access the academy, and shape the preferences of people similarly marked as “black” and “woman” in Facebook’s affinity algorithms to skew away from class-based assumptions is severely undermined.” [Real names software issue: designing a reporting system without threat modeling how it can be abused. Real names policy issue: the burden falls disproportionately on marginalized people. See http://geekfeminism.wikia.com/wiki/Who_is_harmed_by_a_%22Real_Names%22_policy%3F for more. But we’re covering this on the next slide. Process issue: prioritizing reports of “real name” violations over reports of racism and sexism] ProPublica on excluding users by race https://www.propublica.org/article/facebook-lets-advertisers-exclude-users-by-race Software issue: mandatory automated race and gender identification (as opposed to optional self-identification) allows affinity targeting to penalize people identified by the algorithms as “black” or “Latino”
  • #8 http://blackyouthproject.com/leslie-macs-facebook-ban-is-the-latest-development-in-racially-biased-censorship/ Software issue: designing a reporting system without threat modeling how it can be abused.
  • #9 Software issues: functionality favors harassers; lack of threat modeling, unintended consequences. Also http://nymag.com/thecut/2016/08/a-timeline-of-leslie-joness-horrific-online-abuse.html and http://fusion.net/story/327103/leslie-jones-twitter-racism/
  • #10 Software issues: functionality favors harassers; lack of threat modeling, unintended consequences; attempts to improve the situation haven’t worked. Software process issue: not working with the people who are being affected. Process issues: waited till after the election to enforce ToS; gave voice to alt-righters while allowing the silencing of women of color (especially black women). (Here I have the data and the WoC who were hacked while they were engaging folks to get out the vote. Twitter did not respond or fix their accounts, while those that broke ToS weren’t removed until after the election.) https://www.buzzfeed.com/charliewarzel/a-honeypot-for-assholes-inside-twitters-10-year-failure-to-s?utm_term=.ohyn7BZyz#.idBwk0BKo
  • #11 A company culture of sexism and tolerated sexual misconduct overflowed into the product. Drivers were also sexually assaulting the customer base with little to no action from Uber. It was clear this was an internal company practice, so it was implemented in the software.
  • #12 http://www.huffingtonpost.com/eric-holder/airbnbs-work-to-fight-bias-and-discrimination_b_11910438.html https://www.technologyreview.com/s/602355/airbnb-isnt-really-confronting-its-racism-problem/ Ben Edelman’s “Preventing Discrimination at Airbnb” http://www.benedelman.org/news/062316-1.html The solution: Limit the distribution of irrelevant information that facilitates discrimination - hide names and photos. “Names and photos typically indicate the races of Airbnb guests and hosts. But names and photos are not necessary for guests and hosts to do business. Hosts and guests can amply assess one anothers' trustworthiness using the significant other information Airbnb already collects and presents. For these reasons, I contend that the Airbnb site should not reveal sensitive race information until a transaction is confirmed. If guests and hosts don't see names and photos in advance, they simply won't be able to discriminate on that basis.” Allow testing
  • #13 https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing Software issue: algorithmic bias
  • #14 Examples: Twitter not providing tools for people to protect themselves against harassment and abuse Lack of accessibility support in software means that a lot of people can’t even use it Sites that limit gender to male/female, or male/female/other Google photos tagging black people as gorillas: http://www.usatoday.com/story/tech/2015/07/01/google-apologizes-after-photos-identify-black-people-as-gorillas/29567465/ Google’s autocomplete for “women should” … stay at home, be slaves, be in the kitchen http://dhpoco.org/blog/2013/11/19/googles-autocompletion-algorithms-stereotypes-and-accountability/ Google “three black teenagers” as opposed to “three white teenagers” http://www.usatoday.com/story/tech/news/2016/06/09/google-image-search-three-black-teenagers-three-white-teenagers/85648838/ Racial biases in software used to predict future criminal behavior - used in sentencing https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
  • #18 Dreamwidth and Django diversity statements
  • #26 Work to date focuses on males and females, problem-solving software
  • #28 Work to date focuses on males and females, problem-solving software
  • #29 AKA “what Twitter doesn’t do”
  • #38 Software that truly embraces differences works better for everybody. Despite the tech industry’s huge diversity problems, there are a lot of talented women, blacks, Latinxs, Native Americans, QUILTBAGs, seniors, people with disabilities … and we want software that solves our own problems!