
Should We Design This?


Our online morgenbooster: Should We Design This?

Taking a perspective on ethical design, we will take a look at human implications of design.



  1. Should we design this? Online Morgenbooster 1508.dk/morgenbooster We start at 9:00 #111
  2. Should we design this? Speakers Mikkel Jespersen CEO & Co-Founder 1508 Christian Bason CEO Danish Design Centre
  3. Why are Ethics important – and why now? Ethical design in a digital context The Digital Ethics Compass Future perspectives Our focus today

Editor's Notes

  • Let’s start with a light ethical-dilemma warm-up.

    Railroad switch

    Imagine you are the driver of a train that is about to enter a tunnel. Inside the tunnel stand five track workers. Suddenly the train’s brakes fail, and the train is moving so fast that the five workers cannot get out of the way in time. They will certainly be killed by the train.

    At the last moment you notice a remote-controlled track switch that you can operate with a button in the train. If you press the button, the train runs into a different tunnel. In that tunnel stands only one worker, who will certainly be killed.

    You know nothing about the workers.

    Do you press the button?
  • On a global scale, the Cambridge Analytica scandal put an emphasis on the dangerous tools that can be created by combining large amounts of data, ingenious algorithms and behavioural design.

    Suddenly we realised that tools like that can influence referendums and elections – or any other democratic process.

    Cambridge Analytica was not a case of a side effect no one had thought of.
  • But there are numerous examples of exactly that: a side effect no one had thought of.

    Strava is a popular fitness tracker and an online community used by millions of people around the world.

    The Strava app takes data from fitbits and smartphones, uploading the information to a "heatmap" for users to track their workouts and share with friends.


    Here are some of the biggest reveals from a fitness-tracker data map that may have compromised top-secret US military bases around the world.


    Here's Camp Lemonnier, a US military base in Djibouti, along with a possible CIA site nearby.
  • Ethical dilemmas are not just for Big Tech or advanced scale-ups from Silicon Valley.

    If you have been following Danish media for some time, you will probably have noticed that Nemlig.com has gotten into trouble. For years Nemlig.com has been celebrated as the best-functioning digital service within grocery delivery – and with a very flexible and competitive value proposition.

    It is now clear to most people that it can come at a price if you build a service and a business model on data and metrics that can be tracked and optimised to an extent where employees and partners suffer.
  • In some cases we have heard the argument ‘Well, it was the AI or the algorithm that is to blame’. But algorithms are made by people. And in the end, whether you are a business owner, a developer or a designer, the responsibility is yours. That is the reason why we want to address design ethics.
  • Today, technology companies can manipulate billions of people's lives, moods, friendships and wallets through the design of their products or services. New products can benefit humanity, mislead and manipulate consumers, make people addicted to digital gadgets and services, and influence political attitudes.

    Examples:
    Cambridge Analytica Scandal
    IBM facial recognition software that couldn’t identify Black women with the same accuracy

    I don’t think anyone doesn’t want to be ethical. It’s part of our Danish/Scandinavian design DNA. But we seem to be struggling with including ethics as a natural and integrated part of the design process. We lack the proper language, tools and processes for correcting a misleading path – something that can help designers calculate the consequences of ignoring ethical issues in the early stages of developing a service or a product.

    Everyone has an ethical responsibility, but businesses can make it easier or more difficult. We want to give businesses a language to discuss ethics within their industry. We believe this is a crucial first step in the process of creating more ethical digital design. Furthermore, we believe businesses can actually strengthen their competitiveness by taking a proactive approach to ethics and finding ways to integrate ethical thinking and discussions into the design process.

    And basically, we want to ensure that digital products and services continue enriching our lives. Because we love digital design. We just have to put the human back at the centre.
  • It’s safe to say that in this landscape, trust has been challenged and even deteriorated. And a lack of trust can be poisonous for businesses - as the aforementioned cases also show. So there’s a need to build a new level of trust towards users and humans. Trust will drive a healthy market and society.
    There’s an opportunity space to be more ethical and make ethical digital design a competitive advantage. Danish companies can fill this gap and gain that competitive advantage through ethical design, and activating the Danish Design DNA.
    There’s a need to develop a language, processes and tools for ethical design. We can have good and ethical intentions but if we don’t have the tools to practice ethical design we won’t get anywhere.
  • Some Danish companies have gradually begun to understand that behaving ethically and adequately is a business-critical necessity to survive as a company in a new reality. It is increasingly a consumer requirement that products and services create security and transparency for users.

    But what does that look like? In the Digital Ethics Compass project we identified a lack of ethics as an integral part of the product and service development process: a way to consider and act upon the implications of one’s digital product or service on humans. We identified a need for teams in tech companies to train their ethical muscles and to get a language for speaking about it.
  • When we talk about ethical design in a digital context, we talk about it within three themes; automation, behavioural design and data.

    Automation revolves around the transparency and explainability of algorithms. Algorithms are increasingly being used to make decisions with consequences for human lives – a loan application rejected by an algorithm, or a self-driving car that makes the wrong manoeuvre. In the worst case, poor automation ethics can create a world where people are exposed to unjust, inhuman or incomprehensible decisions that cannot subsequently be explained. From a societal perspective, poor automation ethics can further shift the balance of power between businesses and consumers to the benefit of companies, creating inefficient markets, monopolies and dissatisfied consumers.

    Behavioural design is about digital designers and developers getting ever better at using design methods to influence people’s brains. By combining knowledge of behaviour with algorithms and substantial data sets, you can become so good at influencing and manipulating people that it becomes hyper-manipulation, where you can essentially remote-control people to perform specific actions and think certain thoughts. Behavioural design and nudging are often highlighted as methods that help people make better, healthier and more intelligent choices. In the worst case, poor behavioural design can lead to digital addiction and manipulation of users, which in turn can lead to passivity or digital rebellion, with people opting out of the digital world altogether.

    Within data we talk about behaving correctly (and legally) when collecting and using personal data. Acting ethically within data means protecting people's need for privacy in a digital world where data is becoming more and more valuable. In the worst case, poor data ethics leads to people feeling monitored and deprived of control over their own lives. And from the companies' perspective, the long-term consequences of poor data ethics are that customers lose confidence in companies and digital services in general.
  • We have further defined some core principles for digital ethical design that can be applied by any company within any of the three aforementioned themes.
  • For us, the core of ethical (digital) design is about putting the human in the centre. This is not just the user, but anyone who in one way or another is affected by the digital solution. It sounds banal, but often the human being, whom it’s all for, can slip into the background when designing digital solutions.
  • Avoid manipulating: Often digital solutions are designed to help people by making decisions on their behalf. But this can undermine people’s very basic urge to decide over their own life.

    Make your technology understandable: Digital solutions are often complex and difficult for ordinary people to understand. That is precisely why designers have a responsibility to make digital solutions understandable and transparent, letting users comprehend how the solutions work and affect their lives.

    Avoid creating inequality: Digital design without thought for ethics can often end up perpetuating and reinforcing existing inequalities in society.

    Give users control: Digital solutions may well help people and make their lives easier, but they must not leave people with a sense of losing control. Solutions should be designed to give people more and not less control.


  • These ethical principles are not simple guidelines that one can just follow. They all contain a built-in conflict to which there are no unambiguous answers. For example, “give users control.” A lot of digital design is all about automating and streamlining processes, which means that you take control from the users and move the control over to the digital solution. But when does the loss of control become an ethical problem, where one no longer acts in the users’ best interest but instead makes them helpless and alienated? And with avoid manipulating, when is manipulation a good thing and when does it become problematic?

    These are difficult questions that should be considered in the digital product/service development.
  • As mentioned, Danish companies have gradually begun to understand that behaving ethically and adequately is a business-critical necessity to survive as a company in a new reality.

    Let’s take a look at examples within automation, behavioural design and data.
  • [A good example of ethics within automation] Corti is a Danish company that has developed a machine-learning algorithm that listens to emergency calls. The algorithm can detect if a cardiac arrest is happening based on the breathing patterns of the caller. In this case, an ambulance will arrive more quickly.

    The algorithm from Corti has learned from data from old emergency calls, which means that less common dialects are underrepresented. Worst case, this could mean that the algorithm discriminates against these dialects and detects fewer cardiac arrests in areas where this dialect is present. Corti continuously checks the algorithms for bias, and in the case of the dialects, they have chosen to train the algorithm on several specific emergency calls of different dialects.
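    The kind of bias check described above can be illustrated with a toy sketch. This is a hypothetical illustration, not Corti’s actual method, and all group names and data below are invented: compute the detection rate per dialect group over true cardiac-arrest calls, so that under-served groups stand out and can be flagged for retraining.

    ```python
    # Toy per-group bias audit: compare detection (recall) rates across
    # dialect groups. Invented data; not any real system's method.

    def detection_rate_by_group(records):
        """records: iterable of (group, is_cardiac_arrest, detected) tuples.
        Returns {group: detection rate} computed over true cardiac-arrest calls."""
        totals, hits = {}, {}
        for group, is_arrest, detected in records:
            if not is_arrest:
                continue  # only true emergencies count toward recall
            totals[group] = totals.get(group, 0) + 1
            if detected:
                hits[group] = hits.get(group, 0) + 1
        return {g: hits.get(g, 0) / n for g, n in totals.items()}

    calls = [
        ("standard", True, True), ("standard", True, True),
        ("standard", True, False), ("standard", True, True),
        ("regional", True, True), ("regional", True, False),
        ("regional", True, False), ("regional", False, False),
    ]
    rates = detection_rate_by_group(calls)
    # "standard" calls: 3 of 4 arrests detected; "regional": 1 of 3.
    # A gap like this would flag the "regional" dialect for extra training data.
    ```

    The design choice mirrors the note above: rather than a single aggregate accuracy number, the metric is disaggregated by group, which is what makes an underrepresented dialect visible in the first place.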
  • [A good example of ethics within behavioural design]

    Tobi is a Danish startup that helps parents invest their children’s savings. Their message is that it is far more efficient to invest their child’s savings than to leave it in a regular bank account. They use comparisons to show what this difference can mean when the children turn 18, but they do not use fear in these messages. They use concrete examples, but they do not tell stories about how horrible it will be to be 18 years old without child savings in a world where a two-bedroom apartment in Copenhagen will cost 10 million DKK.
  • [A good example of ethics within data]

    DuckDuckGo is an alternative to Google’s search engine that does not collect user data or track user searches on the web. DuckDuckGo does not even know how many users they have, as they do not track users. Because DuckDuckGo does not store information about users, users will only see ads related to their current search. When using DuckDuckGo, it is clear to the users that they aren’t being tracked across web pages and that the search engine is not collecting too much data about their movements online.
  • So if it’s not about rules, and not black and white – how can we work with design ethics in a structured way? And how can a single designer, a team, a department or the whole organisation get started?

    DDC initiated a project that set out to investigate exactly that and come up with practical solutions. They have joined forces with digital strategist Peter Svarre and a group of good people.
  • B&O + Watcher story
  • Healthcare Technology
    Corti has developed a decision-support tool for medical professionals such as doctors and nurses, for instance on emergency hotlines. The algorithm has been trained on more than 100,000 emergency calls. It is an AI that listens in to detect critical illness such as cardiac arrest or stroke.

    Corti listens in on the conversation and proposes questions to clarify possible diagnoses.



  • When asking the Danish design field, it’s agreed that ethics will play a key role in the future of design.
    It is evident that ethics can be a part of the Danish and Nordic design DNA, and therefore, looking globally, it can become a competitive advantage. The Danish tech ambassador located in Silicon Valley also confirms that responsible technological solutions are a strong positioning for Danish and Nordic companies alike. Could we have an influence outside of the EU and push for an ethical agenda on other continents, such as in the US and Asia (China)?
    There’s huge potential in the public sector for digital design ethics. Technology and data are moving closer to the citizen, creating new opportunities and demands for municipalities, regions and the state. How do we ensure that this digitisation of the public sector has a positive impact and contributes to a healthy digital society? If we can even define what a healthy and good digital society is.
    Humans and machines are melting together. They are becoming interdependent, which can be incredibly beneficial but can also go incredibly wrong. How do we use technology to the advantage of humans?
    How do we use technology to the advantage of the planet?
