
Jamila Smith-Loud - Understanding Human Impact: Social and Equity Assessments for AI Technologies


Social and Equity Impact Assessments have broad applications. They can be a useful tool for exploring and mitigating machine learning fairness issues, and they can be applied to product-specific questions to generate insights and learnings about users, as well as about the broader impacts on society that result from the deployment of new and emerging technologies.

In this presentation, my goal is to advocate for and highlight the need for community and external stakeholder engagement in order to develop a new knowledge base and understanding of the human and social consequences of algorithmic decision making, and to introduce principles, methods and processes for these types of impact assessments.


  1. 1. Understanding Human Impact: Social and Equity Assessments for AI Technologies. Jamila Smith-Loud, Google Trust and Safety Research
  2. 2. What are Social + Equity Impact Assessments? Social and Equity Impact Assessments assess the anticipated or contemporary socio-economic change, both long-term and short-term, for a target population (defined by membership in a specific sensitive or legally protected sub-group, or by the geographic location of the population) as a result of direct or indirect engagement with a product.
  3. 3. Impacts + People. Social Impacts: ● Living Conditions (change in income, inequality, poverty concentration) ● Governance/Rights (human rights and civil rights) ● Social/Cultural (representational harm, change in community cohesion) ● Employment ● Health ● Social and Physical Infrastructure ● Environment ● Satisfaction (uncertainty about social change). Marginalized Populations: ● Aboriginal/Indigenous peoples ● Age-related groups ● Disability ● Historically oppressed ethnic/racial communities ● Non-binary gender identity ● Homeless/Underhoused ● Inner-urban communities ● Rural communities ● LGBTQ ● Women and Girls ● Immigrants, Refugees and Migrants ● Other: any other group that has experienced systematic marginalization
  4. 4. Approach. Center: center the voices and experiences of the communities who often bear the burden of negative impacts. Anticipate: anticipate potential negative and unintended consequences. Engage: openly address issues of racism, social class, sexism, xenophobia, homophobia and all forms of cultural prejudice and intolerance.
  5. 5. Engage in Hard Questions. 1. What is the historical and current context that is shaping this issue? 2. What other forms of inequity are intersecting with this issue (gender, race, ability status, class/income, sexual orientation)? 3. How does power influence outcomes and the feasibility of interventions?
  6. 6. Mitigations: translating findings of risk and potential impacts into product mitigations and improvements. Anticipatory process: Risk (likelihood & severity of risk), Impact Assessments (social + equity related impact assessments), and Context (product & domain specific context application). A simple risk-prioritization sketch appears after the slide list.
  7. 7. Conceptualizing Impacted Users. Vulnerability: increased risk of impact as a result of external social, economic or cultural conditions. Susceptibility: increased risk of impact related to endogenous factors such as individual income, employment status or education levels. Marginalization: increased risk of impact due to systemic or institutional exclusion.
  8. 8. Identifying Impacted Users
  9. 9. Disaggregation of key indicators, including relevant social and economic data, to scope impacts as a potential consequence of the machine learning technology (see the disaggregation sketch after the slide list). ➔ Can the indicator of interest be disaggregated, for example by race, gender, geography or income? ➔ The further you can drill down within a category, the more precise your understanding can become. ➔ Data more detailed than national averages is key to identifying and understanding potential user impacts. ➔ Collecting data that allows disaggregation may require alternate sampling and data collection approaches. ➔ Work to provide a research-based explanation for data that show inequities; otherwise your audience will supply their own explanation, and this is where stereotypes too often fill the gap.
  10. 10. Food Access + Income Disparity
  11. 11. Toward a Logical Process: Anticipating + Centering. Purpose: to reduce negative unintended consequences in areas where access to quality food is an issue. Context: little to no access to healthy food options, increasing gentrification, displacement, community distrust of govt. & tech.
  12. 12. Identify Resources: multidisciplinary research teams, external stakeholders, cross-functional (Xfn) relationships. Acknowledge Constraints: time, gaps in knowledge and comfort.
  13. 13. Activities & Inputs: case study, survey research, focus groups. Create Tangible Outputs: product interventions & community-based mitigations.
  14. 14. Towards operationalizing fair & equitable technologies: aspire toward positive effects in principle and practice.
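
Slide 6's likelihood-and-severity framing is often operationalized as a simple risk matrix. The sketch below is only one illustrative way to rank identified impacts for mitigation; the 1–5 scales and the example impact descriptions are hypothetical assumptions, not content from the presentation.

```python
from dataclasses import dataclass

@dataclass
class Impact:
    """One finding from a social + equity impact assessment (hypothetical)."""
    description: str
    likelihood: int  # 1 (rare) to 5 (almost certain); illustrative scale
    severity: int    # 1 (minor) to 5 (critical); illustrative scale

    @property
    def risk_score(self) -> int:
        # Classic risk-matrix product; the assessment itself supplies the
        # qualitative judgment behind both numbers.
        return self.likelihood * self.severity

# Hypothetical findings, used only to show the prioritization step.
impacts = [
    Impact("Representational harm in ranked results", likelihood=4, severity=3),
    Impact("Reduced access for low-income users", likelihood=2, severity=5),
    Impact("Erosion of community trust after deployment", likelihood=3, severity=4),
]

# Highest-risk impacts get translated into product mitigations first.
for impact in sorted(impacts, key=lambda i: i.risk_score, reverse=True):
    print(f"{impact.risk_score:>2}  {impact.description}")
```

The score only orders the findings; the qualitative context from the assessment still determines which mitigations are appropriate.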
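
As a minimal sketch of the disaggregation step on slide 9, assuming a per-user evaluation table with subgroup columns and an error flag (all column names and values below are hypothetical), a pandas groupby can surface subgroup-level gaps against the overall rate:

```python
import pandas as pd

# Hypothetical evaluation data: one row per user, with subgroup attributes
# and a flag marking whether the model made an error for that user.
df = pd.DataFrame({
    "race":           ["A", "A", "B", "B", "B", "C", "C", "A"],
    "gender":         ["F", "M", "F", "M", "F", "M", "F", "M"],
    "income_bracket": ["low", "high", "low", "low", "mid", "high", "mid", "low"],
    "model_error":    [0, 0, 1, 1, 0, 0, 1, 0],
})

overall_rate = df["model_error"].mean()

def disaggregate(data: pd.DataFrame, by: list[str]) -> pd.DataFrame:
    """Error rate and sample size per subgroup, with the gap vs. the overall rate."""
    grouped = (
        data.groupby(by)["model_error"]
            .agg(error_rate="mean", n="size")
            .reset_index()
    )
    grouped["gap_vs_overall"] = grouped["error_rate"] - overall_rate
    return grouped.sort_values("gap_vs_overall", ascending=False)

# Drill down progressively: the more precise the cut, the smaller the sample,
# so report n alongside every rate before drawing conclusions.
print(disaggregate(df, ["race"]))
print(disaggregate(df, ["race", "gender"]))
```

Smaller subgroups give noisier estimates, which is why the sketch reports n next to each rate and why the slide's point about alternate sampling and data collection approaches matters.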
