2. WHAT IS COMPUTER
SCIENCE
Computer science is the study of computers
and computational systems. It is a broad
field that includes everything from the
algorithms that make up software to how
software interacts with hardware and how
well software is designed and developed.
4. By exploring examples of ethical issues in
computer science, it is possible to gain a
deeper appreciation of the significant role that
ethics plays in this evolving field. The
examination of these examples can provide
valuable insights and learning opportunities,
helping you perceive these issues from multiple
perspectives and understand the potential
outcomes of ethical and unethical practices in
computer science.
5. THE FACEBOOK-
CAMBRIDGE ANALYTICA
SCANDAL
One of the most prominent examples of ethical
breaches in recent years is the Facebook-
Cambridge Analytica scandal. In this case,
Cambridge Analytica, a political consulting firm,
collected data from approximately 87 million
Facebook profiles without the users' explicit
consent. The data was used to target voters with
personalised political advertisements in the 2016
US presidential election, thereby crossing ethical
boundaries regarding privacy and informed
consent.
6. ARTIFICIAL
INTELLIGENCE
Google's Project Maven and AI Ethics: The Pentagon's
Project Maven involved using artificial intelligence to analyse
drone footage. Google's involvement in the project drew
internal and public criticism, leading to several employee
resignations. The key ethical concern revolved around the use
of AI in military applications. The outcry resulted in Google
not renewing the contract and later releasing a set of
guiding principles for its AI projects, emphasising the
avoidance of uses that could harm or deceive people.
7. DEEPFAKE TECHNOLOGY
Unethical Use of Deepfake Technology: Deepfake
technology uses AI algorithms to manipulate or fabricate
video content, making it appear real. While deepfakes
showcase AI's prowess, they present severe ethical issues
around the dissemination of misinformation and deceptive
content. For example, in 2018 a manipulated video of former
US President Barack Obama was released as a public
demonstration of the technology's dubious potential. The
ethical dilemmas here involve digital consent and the harmful
consequences of such deceptive content.
8. LEARNING FROM PAST
ETHICAL MISTAKES IN
COMPUTER SCIENCE
By understanding past ethical mistakes in computer
science, we can prevent similar infractions in the
future. As the saying goes, "Those who do not learn
from history are doomed to repeat it": reflecting on
past mistakes equips us with the wisdom to foresee
and manage upcoming challenges. Let's consider
some learning points from the ethical missteps of
the past:
9. NOT PRIORITISING USER
CONSENT
• Not Prioritising User Consent: The
Facebook-Cambridge Analytica scandal
highlighted the crucial importance of gaining
user consent before collecting, using, or
sharing their data. It's a poignant reminder to
always place user consent at the forefront
and adopt transparency in data handling
practices.
10. OVERLOOKING SOCIAL
IMPACT
• Overlooking Social Impact: Google's
participation in Project Maven served as a
wake-up call about considering the social
implications of unleashing advanced AI in
sensitive domains like military and
surveillance. Today, Google's AI Principles
underscore the importance of incorporating
societal benefit as a key factor in its projects.
11. UNDERESTIMATING
MISUSE OF TECHNOLOGY
• Underestimating Misuse of Technology: The case
of deepfakes underscores that while technological
advancements can be impressive, their potential
misuse can lead to significant harm. It reminds us to
consider and plan for potential misuse when
developing new technologies.
12. SUMMARY
Ethical Mistake                        Learning Point
Not Prioritising User Consent          Always gain user consent and practise
                                       transparency in all data handling practices
Overlooking Social Impact              Consider the broader social implications of
                                       developing and deploying new technologies
Underestimating Misuse of Technology   Predict and plan for potential misuse when
                                       developing new technologies