5. VSD – Value Sensitive Design
Value Sensitive Design was developed by Batya Friedman and Peter Kahn at the University of Washington, starting in the late 1980s and early 1990s.
9. Engineering approach: goals
• Human Rights: Ensure autonomous and intelligent systems do not infringe on internationally recognized human rights
• Well-being: Prioritize metrics of well-being in their design and use
• Accountability: Ensure that their designers and operators are responsible and accountable
• Transparency: Ensure they operate in a transparent manner
• Awareness of misuse: Minimize the risks of their misuse
https://standards.ieee.org/develop/indconn/ec/ead_brochure_v2.pdf
10. Future Technology Concerns
• Reframing Autonomous Weapons
• Safety and Beneficence of Artificial General
Intelligence (AGI) and Artificial Superintelligence (ASI)
• Affective Computing
• Mixed Reality
11. Affective Computing
• Systems Across Cultures
Should affective systems interact using the norms appropriate for verbal and nonverbal communication, consistent with the societal norms of the place where they are located?
https://rowdywriters.wordpress.com/2013/11/09/understanding-cultural-differences-in-non-verbal-communications/
12. Affective Computing
• When Systems Become Intimate
Are moral and ethical boundaries crossed when the design of affective systems allows them to develop intimate relationships with their users?
https://static.independent.co.uk/s3fs-public/thumbnails/image/2017/09/14/16/rexfeatures-9058289j.jpg
13. Affective Computing
• System Manipulation/Nudging/Deception
Should affective systems be designed to nudge people for the user’s personal benefit and/or for the benefit of someone else?
14. Affective Computing
• Systems Supporting Human Potential (Flourishing)
Extensive use of artificial intelligence in society may make our organizations more brittle by reducing human autonomy within organizations, and by replacing the creative, affective, and empathetic components of management chains.
http://modernpsychologist.ca/free-will-as-illusion-its-your-choice/
15. Affective Computing
• Systems with Their Own Emotions
Synthetic emotions may increase the accessibility of AI, but they may also deceive humans into falsely identifying with AI, leading to an overinvestment of time, money, trust, and human emotion.
16. Rosa M. Gil Iranzo
Thank you very much
https://www.linkedin.com/in/rosa-gil197206/
https://www.researchgate.net/profile/Rosa_Gil