MyData2018 - Making Responsible Technology the New normal


  1. doteveryone.org.uk @doteveryoneuk
  2. Making Responsible Technology the New Normal
  3. Many felt the internet had improved their lives
  4. people are less convinced it has been beneficial for society as a whole.
  5. The future: now is the time for Responsible Technology
  6. Poor security, safety, and resilience of internet products and services https://cronkitenews.azpbs.org/2018/04/16/medical-device-hackers/
  7. Unequal vulnerabilities to fraud, surveillance, and inequalities around pricing and access https://www.buzzfeednews.com/article/pranavdixit/big-tech-apps-for-the-next-billion-underperform
  8. The impact of algorithms and AI on our lives https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
  9. Agency around personal information
  10. a time for responsibility in technology
  11. Doteveryone is a think tank established to champion responsible technology for the good of everyone in society.
  12. • We want to see responsible creation and operation of technologies in practice
      • We take a systems perspective, looking at technology products, services and systems and how they fit with people and society
      • We care about people
  13. making the tech industry more responsible
  14. Responsible Technology considers the social impact it might create and the unintended consequences it can cause.
  15. The ten aspects of responsible technology:
      • Business models, ownership and control: the business model, and the ownership and control of the organisation and the product or service, are responsible and appropriate
      • Employment and working conditions: inclusive employment, fair pay and conditions, including at suppliers
      • Reward for contributions: fair reward to all those contributing information or other effort
      • Societal impact: the impact of the tech on public/societal value is positive or neutral
      • Unintended consequences: there has been consideration of systems effects, side effects, potential harms and unintended consequences
      • Maintenance, service and support: consideration of maintenance, service and support over the long term
      • Understandability: people can easily find out and understand how the product/service works
      • Standards and best practice: relevant tech standards and best practices, and systems design are used and evident
      • Usability: if a broad range of users are expected, or if some users may be compelled to use the product or service, it should be accessible and have appropriate support
      • Context and environment: the context of the system, product or service has been considered and addressed appropriately, including sustainability considerations
  18. What is a trust mark system?
      1. a system to assess digital products and services against standards
      2. a way for consumers to identify responsibly produced digital products and services
  19. – Onora O’Neill “Those who want others’ trust have to do two things. First, they have to be trustworthy, which requires competence, honesty and reliability. Second, they have to provide intelligible evidence that they are trustworthy, enabling others to judge intelligently where they should place or refuse their trust.”
  20. Trustworthy tech mark concept:
      • a voluntary scheme
      • a mark or indicator of some sort
      • backed by an open evidence base
      • values-based criteria
      • business and technical aspects
  22. Prototyping a Trustworthy Tech System
  23. • consumer purchasing pressure is a limited lever for change at present
      • many trust mark / label projects have failed or had little impact
      • developing a full trust mark system is a multi-year investment (standards, audit, consumer brand)
  24. refining the responsible technology model and creating tools for business
  25. • understand and respect the different contexts and communities the technology operates within
      • anticipate potential unintended consequences of technology
      • examine value systems holistically; be transparent around contributions of value made and received
  30. creating practical tools to embed in the consumer tech development cycle to encourage responsible development
  31. Summary
  32. there are no free lunches
  33. there are no free lunches, but people are starting to seek out responsible products
  34. whose ethics, anyway?
  35. https://commons.wikimedia.org/wiki/File:Trolley_problem.png CC-BY-SA McGeddon ethics? argh!
  36. move a bit slower and think about things https://www.theregister.co.uk/2018/06/15/taplock_broken_screwdriver/
  37. We can engineer trustworthy digital products, services and systems, which are competently made, reliable, and honest about what they do and how they do it.
  38. doteveryone.org.uk @doteveryoneuk
  39. “Responsible technology is no longer a nice thing to do to look good, it’s becoming a fundamental pillar of corporate business models. In a post-Cambridge Analytica world, consumers are demanding better technology and more transparency. Companies that do create those services are the ones that will have a better, brighter future.” Kriti Sharma, VP of AI at Sage
  40. @doteveryoneuk Key resources: bit.ly/ResponsibleTechLinks doteveryone.org.uk

Editor's Notes

  • hi, i’m laura
  • going to talk about what responsible technology looks like
    why we need it
    and how we are trying to get more of it
  • Doteveryone recently ran a nationally representative survey in the UK. let’s see how you compare
    how many of you feel the internet has made your life better? hands up?
    ok and how many of you feel the internet has made your life worse?
  • half the people felt the internet had really improved things for them individually
  • but many don’t see the same benefit for society as a whole

    we want technology that is useful, has benefits definitely outweighing harms, that we can rely on.
  • We look to the developers and operators of technology to act responsibly, to be honest about what they do, so that we can have confidence in their products and services. Whilst it’s easy to think it’s just an issue for the big silicon valley companies, or about personal information, responsible tech goes beyond these
  • we say responsible technology considers its social impact and seeks to understand and minimise potential unintended consequences
  • that means
    not knowingly worsening inequality
    recognising and respecting human dignity and rights
    giving people confidence in the tech

    and there’s no shortage of examples where this isn’t the case today
    here are some random ones
  • poor security and resilience, here of connected medical equipment
  • vulnerable people getting worse experiences online, worsening inequality
  • examples of AI being used for really important applications and causing harm
  • and of course issues with personal data, whether that affects individuals, or here, national security through the Strava fitness data release
  • The world of technology development is maturing and, like other innovations, it’s a time for reflection, stabilisation, building good practice. It will take time: a shift in culture and practice, a change in the relationships between tech and government and society, new ways of measuring things.

    Building technology responsibly is simply what we should be doing: in big corporates and startups, in nonprofits and communities, when building electronics or software, for infrastructure and products and toys and tools.
  • I’m from Doteveryone, we’re a think tank
    we take a systems approach in our work
    we work with government, civil society, consumers, industry…
    there are specialist groups looking at AI or data, or even robotics or security.
    but Doteveryone takes a more holistic view
  • We want to see responsible technologies in practice, not just theoretical frameworks - there’s many of them, ethical manifestos and oaths, etc

    We take a systems perspective, looking beyond specific niche fields. My fitness tracker is a wearable, it’s IOT, it’s a mobile app, it’s machine learning, - many things. one aspect alone won’t tell you if it’s responsible

    We don’t seek tech solutions to tech problems!
  • i’m here to talk about how we change the tech industry
    in the near term
  • Responsible practice isn’t just about the technology itself — it’s about the people who develop, manage and invest in tech, and the wider context. 
  • after researching these ideas, in early 2017, we identified 10 aspects of responsible tech. you can see how they cut across tech and business. responsibility means more than a single tech standard, or a bit of user testing, or having a diverse product team.
  • these are all the areas you should think about in a tech business. companies will do well on some, less well on others. there will always be scope to improve.
  • we considered a variety of levers for change to shift tech industry practice. A lot of tech isn’t seen as trustworthy today, often because it does badly at some of the above: it’s hard to use in some way, or stops working, or exploits workers. So we started exploring the idea of what a consumer trust mark might look like.
  • a mark would need to offer value to those who make technology, by enabling them to demonstrate their responsible practices; and value to consumers who can make more informed choices about what technologies to use.
  • to have confidence in the tech in our world, it needs to be trustworthy

    Cambridge philosopher Onora O’Neill: you need to be competent, honest, reliable
    and be able to demonstrate that

    a really powerful concept, aligned with the trust mark idea
  • we learned from other marks and schemes, aiming to be lightweight, scalable and also testable. we wanted to address the variety and complexity of digital tech, and the speed of change (with software updates - a certification at a single point in time doesn’t make sense).
  • an open evidence base could be an important part of this. A place where organisations can show their credentials, which can be checked by consumers, retailers, advocacy groups or even employees, providing accountability. which can be kept up to date, showing how products evolve. This also allows organisations to surface and talk about the grey areas, the tradeoffs, in being responsible. Because things are rarely black and white. (fairphone example)
  • so we ran a prototype with startups
    to explore how they perform across the 10 areas we defined - each week, a set of questions and exercises to reflect on and document their practice
    to see what they could demonstrate, easily, to an external evaluator.
    could we, as reviewers, feel we had a good sense of the businesses?
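    the weekly exercise above — gather evidence against each of the ten aspects, then check coverage as a reviewer — could be sketched very loosely as a data model. This is a purely hypothetical illustration: the class names, methods and “ExampleCo” are invented for this sketch, not a Doteveryone schema.

```python
from dataclasses import dataclass, field

# The ten aspects of responsible technology from the deck, used here as
# assessment criteria. (Illustrative only, not an official schema.)
ASPECTS = [
    "Business models, ownership and control",
    "Employment and working conditions",
    "Reward for contributions",
    "Societal impact",
    "Unintended consequences",
    "Maintenance, service and support",
    "Understandability",
    "Standards and best practice",
    "Usability",
    "Context and environment",
]

@dataclass
class Evidence:
    aspect: str     # which of the ten aspects this addresses
    statement: str  # what the company can demonstrate to a reviewer

@dataclass
class Assessment:
    company: str
    evidence: list = field(default_factory=list)

    def submit(self, aspect: str, statement: str) -> None:
        """Record a piece of evidence, rejecting unknown aspects."""
        if aspect not in ASPECTS:
            raise ValueError(f"unknown aspect: {aspect!r}")
        self.evidence.append(Evidence(aspect, statement))

    def coverage(self) -> float:
        """Fraction of the ten aspects with at least one piece of evidence."""
        covered = {e.aspect for e in self.evidence}
        return len(covered) / len(ASPECTS)

a = Assessment("ExampleCo")
a.submit("Understandability", "plain-language explanation of how the service works")
a.submit("Societal impact", "published social impact review")
print(a.coverage())  # 0.2
```

    an open evidence base of records like these, kept up to date as products evolve, is the kind of accountability structure the notes describe — the sketch only shows how little structure is needed to start tracking it.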
  • the programme went well - the companies learned new things, and most improved their practices

    they wanted to be transparent and found this process useful in that
  • the companies valued a structure to help them - our 10 aspects of responsibility was useful
    and to have a process that encouraged the teams and business leaders to think about what they were doing.
    to take a moment to reflect on a bigger picture issue, that maybe isn’t tracked day to day in normal business metrics
  • but we realised that whilst the exercises were useful, and our technology questions were unique and valuable,
    the trust mark idea didn’t feel practical for us at this time - it’s very hard to define standards, a slow journey to develop a consumer brand. Other projects have failed
    but tools for business seemed like a way we could make a difference

  • Assuming that the business side is responsible already (and there are systems available for responsible employment, ownership, governance etc - such as BCorps certification), what are the really critical components of responsibility for digital?
  • we refined the 10 aspects into 3
  • first, Technology in a wider context. beyond a single user journey. we relate differently to technologies as family members, as consumers, as citizens, as workers. Plus social and environmental impact, locally and wider scale
  • unintended consequences don’t have to be negative — maybe you come to dominate the market… security; unexpected uses; maintenance; external trends

    Consider potential consequences, minimise their impact and engage with those affected

    If this seems obvious to you, it seems that way to me too, but much silicon valley “move fast and break things” culture goes against this :)

  • contribution may often be, say, personal data in exchange for services, - but also things like mechanical turk, or “I’m not a robot” captchas helping Google to grow its machine learning capabilities. This isn’t inherently a bad thing. But there needs to be transparency
    if you are using open source, are you giving back? supporting the libraries that make your product possible?
  • together it’s a framework for businesses to work through

    We are developing tools which product teams can use to help them examine and evaluate each of these areas. everything from checklists to board games.

  • We want to create ways for product teams and business leaders to be able to think about their responsibilities to society in a concrete way that then becomes embedded within the process of creating, maintaining and updating technology.
    this isn’t an accountability system, it’s a way to make best practice easier for teams day to day, based on needs we’ve encountered
  • responsibility isn’t a free lunch – it takes work or sacrifice. Consider an organisation struggling between good user experience and viral effects their investors want to see to give highest growth. knowing what is right is not always obvious, and knowing what to do about it even less so.

    it’s hard when - your competitor may be moving fast and breaking things
  • but customers are starting to look for responsible products where they can.

    It’s easier for organisations starting out to be responsible; their values, the people they recruit, the customers and investors they target are more aligned.
  • Doteveryone are defining responsible tech with a european mindset, based on our values as an organisation.

    They are not global values or ethics. It’s not the view of silicon valley, or China, for instance. May not work everywhere
  • remember being responsible doesn’t require complex abstract philosophical ethics in most cases.
  • we can do well with basic common sense, thinking through risks and planning sensibly. (If you are designing a lock, think about how malicious people could open it!)
    Incidents of bad design affect the perception of tech as a whole. We need to do better at calling out silly mistakes early, helping each other to build better products, learning together
  • We can’t engineer people’s trust. That would be manipulative – and sometimes people are right not to trust some technologies.

    We can engineer trustworthy digital systems, competently made, reliable, and honest about what they do and how they do it.

    It is an individual responsibility on each developer, designer, leader.

  • through customers, investors, employees and government all demanding better development practice – things will change.

    We don’t need regulation. We can have governance through open technical standards; improved practice sector-wide through templates for better design, tools to evaluate impact and think through risks.
  • Pioneering companies and responsible businesses are already doing this.


    this is the start of a wider movement, and we can all play a part in it.
  • thank you
