PAGE 1 | GRACE HOPPER CELEBRATION FOR WOMEN IN COMPUTING 2017
PRESENTED BY THE ANITA BORG INSTITUTE AND THE ASSOCIATION FOR COMPUTING MACHINERY #GHC17
AI581: Presentations: AI for Social Good
Bias In Artificial Intelligence
Neelima Kumar | @Neelima_jadhav
HUMAN BIAS
Picture a Nurse
Is AI Biased?
Machine Learning: Learning from Data
AI impacts lives
Transportation
Speech Recognition
Banking
Recruitment
Advertising
Predictive Policing
Health and Medicine
Word Embeddings
"You shall know a word by the company it keeps."
- Firth, J.R. (1957:11)
Associations Generated by Word2Vec
Man : Boy :: Woman : x (x = Girl)
Stereotypes in word embeddings
Father : Doctor :: Mother : Nurse
Man : Programmer :: Woman : Homemaker
He : Realist :: She : Feminist
She : Pregnancy :: He : Kidney Stone
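These analogies come from simple vector arithmetic in the embedding space: to solve a : b :: c : x, compute x ≈ b − a + c and take the nearest word by cosine similarity. The sketch below demonstrates the mechanics with tiny hand-picked vectors; they are a toy stand-in for real Word2Vec weights, which have hundreds of dimensions learned from a large corpus.

```python
import numpy as np

# Toy 3-d embeddings, hand-picked for illustration only.
vectors = {
    "man":   np.array([1.0, 0.0, 0.5]),
    "woman": np.array([0.0, 1.0, 0.5]),
    "boy":   np.array([1.0, 0.0, 0.0]),
    "girl":  np.array([0.0, 1.0, 0.0]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def analogy(a, b, c):
    """Solve a : b :: c : x via vector arithmetic: x ~ b - a + c."""
    target = vectors[b] - vectors[a] + vectors[c]
    candidates = [w for w in vectors if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(vectors[w], target))

print(analogy("man", "boy", "woman"))  # -> girl
```

The same query against embeddings trained on web text is what surfaces the stereotyped completions above: the arithmetic is neutral, but the geometry it operates on encodes the biases of the training corpus.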
Stereotypes in Google Translate
Cultural Bias
Racial bias
Class Discrimination (Who Uses AI Matters)
How is bias introduced in AI?
Training data is collected and annotated → Model is trained → Output
Margaret Mitchell, 2017
How is bias introduced in AI?
Training data is collected and annotated (Bias) → Model is trained (Bias) → Output (Bias)
Biased data created by this process becomes new training data
Margaret Mitchell, 2017
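The feedback loop in this diagram can be simulated. Assume a crude model that resolves every ambiguous example to the majority class in its training data, and whose predictions flow back into the training pool; the seed counts, batch size, and decision rule below are invented for illustration.

```python
# Each round the "model" is retrained: training here just means measuring
# the majority rate, and prediction means assigning every ambiguous
# example the majority label. Predictions re-enter the training pool.
def run_feedback_loop(initial_male, initial_female, batch, rounds):
    male, female = initial_male, initial_female
    history = []
    for _ in range(rounds):
        p_male = male / (male + female)
        history.append(round(p_male, 3))
        if p_male >= 0.5:
            male += batch      # every ambiguous example labeled with majority class
        else:
            female += batch
    return history

# A 60/40 skew in the seed data drifts steadily toward the majority class.
print(run_feedback_loop(60, 40, 50, 5))  # [0.6, 0.733, 0.8, 0.84, 0.867]
```

A modest initial imbalance is amplified round after round, which is exactly why biased output that becomes new training data is more dangerous than a one-off biased prediction.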
Hard things are hard
• Hard to get clean data
• Decisions are not clearly understood
• Lack of diversity
• Impact on accuracy
Awareness and Inclusion
• Awareness of possible biases
• Design for inclusion and diversity
• Work with communities affected most
• More women and minority developers
Explainability and Accountability
• Explanation of individual decisions
• Characterize strengths & weaknesses
• Predict future behavior
• Transparency of Data used for training
• Record decisions so that they can be audited
• Validation and Testing
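As a sketch of what "record decisions so that they can be audited" might look like in practice, here is a minimal append-only audit log. The schema and field names are hypothetical, not from the talk; the point is capturing inputs, model version, output, and an explanation together so an auditor can reconstruct any individual decision.

```python
import datetime
import json

class DecisionAuditLog:
    """Minimal append-only log of model decisions (hypothetical schema)."""

    def __init__(self):
        self._records = []

    def record(self, model_version, inputs, output, explanation):
        entry = {
            "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "model_version": model_version,   # ties the decision to a specific model
            "inputs": inputs,                 # what the model saw
            "output": output,                 # what it decided
            "explanation": explanation,       # e.g. top features or rule fired
        }
        self._records.append(entry)
        return entry

    def export(self):
        # JSON Lines output is easy for auditors to filter and diff.
        return "\n".join(json.dumps(r, sort_keys=True) for r in self._records)

log = DecisionAuditLog()
log.record("credit-model-v1.2", {"income": 52000, "age": 34},
           "approved", {"top_feature": "income"})
```

Logging the model version alongside each decision is what makes "characterize strengths and weaknesses" and "predict future behavior" tractable later: without it, audited decisions cannot be attributed to the model that made them.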
Thank you

Editor's Notes

  1. Good afternoon, everybody. The topic of my talk today is "Bias in AI." Let's begin with a quick exercise: close your eyes and...
  2. Picture a nurse. Did anyone picture someone like this? And how about this? No one? We may not even know why, but each of us picked one image over the other. We are all affected by our unconscious bias, and while our prejudices vary, we are all alike in having them. What about AI?
  3. Is artificial intelligence biased? People tend to think of AI systems as rational mathematical models immune to bias.
  4. The AI I am referring to here covers machine learning, natural language processing, neural networks, and beyond. These techniques learn about the world from the huge amounts of data they are trained on. The output of a system depends on its input, and if the input is biased, so will the output be. People tend to forget this; the assumption is that the sheer volume of data will overwhelm any human biases, but on the contrary, AI systems generate output with all those skews and biases intact. This has the potential to harm real people in the real world. What worries me is not the term itself, but the fact that AI is used so much in our daily lives, affecting lives and livelihoods.
  5. AI is transforming the transportation industry. It's in our homes. Banks and financial institutions use it to decide who gets credit and how large a loan is offered. HR departments use it to decide whom to hire or fire. Advertising companies use it to decide what ads and recommendations to show you. The justice system uses it to determine who goes to jail and for how long. The health industry uses it to determine what medications you should take and when someone should be hospitalized. AI is affecting our lives at the core. To set the stage, let me walk you through a few examples.
  6. Cultural bias shows up even in a simple Google search. I was shocked by the results of an everyday query: a woman searching for professional hairstyles got the results on the left-hand side, but searching for unprofessional hairstyles for women returned the results on the right-hand side.
  7. Do you notice anything strange in this picture? It is output from the Google Photos app.
  8. This is Joy, a student at MIT studying computer vision. The face-recognition software she worked on could recognize her face better when she wore a white mask. She gave a great TED talk explaining how she is fighting algorithmic bias: she explains why who codes matters, how they code, and what they code.
  9. Bias can be introduced by how the data is collected and by who uses the system. The City of Boston used AI to predict where potholes were most likely to occur, analyzing data collected through the Street Bump project, an app that allowed users to report potholes. Surprisingly, the predictions showed significantly more potholes in upper-middle-income neighborhoods. Yet a closer look at the data revealed a different picture: the streets in those neighborhoods didn't really have more potholes; the residents just reported them more often, because they used smartphones more frequently. When AI systems have only a portion of the information needed to make correct inferences, bias is implicitly added to the results.
  10. This is a simple picture of a machine learning model. Data is collected, annotated, and fed as training data to the model. Once the model is trained, it can make predictions on any new input data it receives.
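The collect, annotate, train, predict pipeline above can be sketched in a few lines. This toy example is purely illustrative (the data and the nearest-centroid model are made up for this sketch, not taken from the talk):

```python
# Toy version of the collect -> annotate -> train -> predict pipeline.

def train(samples):
    """Compute one centroid (mean feature vector) per label."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {label: [x / counts[label] for x in acc]
            for label, acc in sums.items()}

def predict(model, features):
    """Assign the label of the nearest centroid (squared distance)."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(features, centroid))
    return min(model, key=lambda label: dist(model[label]))

# "Collected and annotated" training data: (features, label) pairs.
training_data = [([1.0, 1.0], "A"), ([1.2, 0.8], "A"),
                 ([5.0, 5.0], "B"), ([4.8, 5.2], "B")]

model = train(training_data)
print(predict(model, [1.1, 0.9]))  # -> A
print(predict(model, [5.1, 4.9]))  # -> B
```

Whatever patterns, or skews, are present in `training_data` are all the model ever sees, which is why the next slides matter.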
  11. Bias can be introduced at every stage of this pipeline. The training data can be explicitly biased in what it represents and whom it omits. The collection and annotation process can introduce sampling errors, reporting bias, selection bias, confirmation bias, and so on. Implicit bias can creep in when the model is not developed by a diverse community of developers, and it can propagate further into how the output is predicted and used. Biased output from this process can become new training data and amplify the effect. So an AI system is capable of amplifying human biases. What can we do to address this?
  12. Fixing bias in AI is a hard problem. It is hard to get clean data free of human bias: an AI system that learns there are indeed more female nurses than male nurses will always predict that a nurse is female. The decisions made by machine learning and deep learning systems are not clearly understood. The University of Washington developed a system to distinguish huskies from wolves; it achieved about 90% accuracy, but on further analysis they found the model had learned to tell the animals apart by the snow surrounding the wolves, not by their individual characteristics. There is a lack of diversity in the AI community, causing biases to go undetected; there are very few women and people of color. If someone like Joy had been working in the photo-applications group, she would have identified and caught the biases much earlier. And correcting for bias might introduce new bias and impact the accuracy of the system: a predictive-policing application could use family history along with criminal background, and if we adjust the system for fairness by removing family history, that might affect its prediction accuracy.
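The nurse example can be made concrete. In this sketch (the 90/10 counts are invented for illustration), a model that simply learns label frequencies from skewed data will answer "female" for every query, faithfully reproducing the skew in its training set:

```python
from collections import Counter

# Made-up, deliberately skewed training labels for the occupation "nurse".
observed_genders = ["female"] * 90 + ["male"] * 10

counts = Counter(observed_genders)

def predict_gender():
    """A frequency-based model: always output the most common training label."""
    return counts.most_common(1)[0][0]

print(predict_gender())  # -> "female", for every single query
```

The model is statistically "correct" about its training data, yet it erases the 10% minority entirely, which is exactly the harm the slide describes.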
  13. While it is hard to fix bias, we must take on these challenges. 1. First and foremost, we need to create awareness in the community: as designers, developers, and users of AI systems, we should be aware of the possible biases and their potential to harm individuals and society. 2. We need to design for inclusion and diversity by providing access to the resources necessary for AI development, such as datasets, computing resources, education, and training. 3. We need to work with representatives of the minority communities likely to be most affected, so that they can participate in the design of such systems. 4. We need to include opportunities for women to participate in the development of AI.
  14. Our models need to be explainable and accountable. Models must be capable of explaining the rationale behind an individual decision, as in the huskies-versus-wolves model, where the snow, not the animals' individual features, was what the model actually used. We must understand a model's strengths and weaknesses and be able to determine how it will behave in the future, so we can analyze who will be impacted most by any bias in the system. We need to be transparent about how the training data was collected and annotated, to uncover sampling errors or confirmation biases like the one the City of Boston observed; they solved their problem by putting sensors underneath garbage trucks to collect the data directly. Models, algorithms, and decisions must be recorded so that they can be audited whenever unfairness is suspected. We should make available an API and the training data, allowing third parties to query the algorithmic system and assess its responses. Validation and testing: we should use rigorous testing methods to validate our models and document the results, checking for any bias. This involves running the model on trial data that varies the input variables across permutations and combinations, and observing the output for bias. Women in AI and AI4ALL are a few organizations trying to tackle these challenges; DARPA and Optimizing Mind are trying to make AI more explainable.
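One simple form of the testing described above is a counterfactual check: hold every input fixed, vary only a sensitive attribute, and compare the outputs. In this sketch the `loan_score` function is a hypothetical stand-in for a real trained model, with a bias planted on purpose so the audit has something to find:

```python
# Hypothetical model under test (NOT a real system): it hides a gender penalty.
def loan_score(income, gender):
    base = income / 1000.0
    return base - (5.0 if gender == "female" else 0.0)

def audit(model, incomes, groups, tolerance=1e-6):
    """Flag inputs where changing only the sensitive attribute changes the score."""
    findings = []
    for income in incomes:
        scores = {g: model(income, g) for g in groups}
        if max(scores.values()) - min(scores.values()) > tolerance:
            findings.append((income, scores))
    return findings

issues = audit(loan_score, [30_000, 60_000], ["female", "male"])
for income, scores in issues:
    print(income, scores)  # every tested income shows a gender-dependent score
```

Real fairness audits are far richer than a single score gap, but even this minimal permutation test would surface the planted bias before deployment.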
  15. As the AI ecosystem is still taking shape, this is an opportunity for all of us women to play an active role in the development and positive impact of AI, and to make the world a better place.