
Building an Equitable Tech Future - By ThoughtWorks Brisbane


At the heart of ThoughtWorks is an ambitious mission: to be a proactive agent of progressive change in the world. Aware of our own privilege, we strive to see the world from the perspective of the oppressed, the powerless and the invisible.

With QUT, here in Brisbane, we’re kicking off a series of research, projects, and conversations about the social impact of tech trends, with a view to building a more equitable tech future. Some of these topics include:

- Algorithmic accountability, transparency, bias & inclusion
- Responsible data practices (privacy and ownership of data)
- Automation and the future of work
- Data use in social media and elections
- Fake news and echo chambers
- Regulating decentralised technologies
- Blockchain for good
- End-user autonomy and privacy

Slides from: Felicity Ruby, Eru Penkman, Clayton Nyakana,
Assoc. Prof. Nic Suzor (QUT) & Dr. Monique Mann (QUT)

Published in: Technology


  1. BUILDING AN EQUITABLE TECH FUTURE
     Felicity Ruby, Eru Penkman, Clayton Nyakana, Assoc. Prof. Nic Suzor (QUT) & Dr. Monique Mann (QUT)
  2. It is our responsibility as technologists to understand the societal implications of emerging technology. What do we mean by an equitable tech future? TECH UTOPIA / TECH DYSTOPIA
  3. RESPECTFUL - Tech designed and developed equitably respects the privacy of citizens and espouses the values of collaboration and consent.
     EMPATHETIC - Listening openly and deeply to people with very different perspectives, accepting the truth of those perspectives, questioning and changing your deepest assumptions about the world, and changing your behaviour.
     INCLUSIVE - Technologists and the systems they create represent, consider and account for the diverse needs of all of society. Technologists understand and actively address inequalities, and strive to make the technology they create more compassionate and inclusive.
     AWARE - Technologists recognise the risks that technological advances create, and act to reduce harm.
  4. The Agile Manifesto
     We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:
     Individuals and interactions over processes and tools
     Working software over comprehensive documentation
     Customer collaboration over contract negotiation
     Responding to change over following a plan
     That is, while there is value in the items on the right, we value the items on the left more.
  5. Building an Equitable Tech Future
     Take a long term view: Value over Revenue; Global over Local; Innovation over Disruption
     Redecentralize: Modular over Comprehensive; Inclusion over Efficiency; Ecosystem over Ecosystem Participants
     Embrace the emergent: Emergence over Predictability; Generalists over Specialists
     Complexity of Truth: Evidence over Opinion; Discussion over Agreement; Diversity over Legal Equality; Perspectives over Doctrine
     Strive for dynamic equilibrium: Trust over Enforcement; Transparency over Caution
  6. IT’S NOT JUST WARM FUZZIES
     Customers: 55% of consumers said they will pay more for brands with a positive social impact (Nielsen 2014). 68% of global consumers would remain loyal to a brand if the organization practiced social responsibility (Martin Zwilling, Forbes).
     Investors: Over three years, the typical ethical fund is up by around 32%-35%, compared with 28% for the FTSE All-Share index and 24% for the AFI Balanced index. Brands focussing on individual, industry, and collective benefits have outperformed the stock market by 206% over the last 10 years (Havas Group study).
     Employees: 56% of millennials have ruled out working for an organisation that doesn't align with their values (Deloitte Millennial Review 2016). Deloitte predicts millennials will make up 75% of the global workforce by 2025.
  7. “A time of radical technologies demands a generation of radical technologists.” (Adam Greenfield, Radical Technologies)
  8. Putting End-Users in Charge of Algorithms: Privacy and Autonomy ‘By Design’
     Dr Monique Mann @DrMoniqueMann, Dr Kylie Pappalardo @kyles_p, A/Prof Nicolas Suzor @nicsuzor
     @JustAlgorithms @Good__Data
  9. THE PROBLEM
     • Algorithms are ubiquitous, but we do not understand or control them.
     • We live in a mediated world that is governed, judged, and served back to us by computer code, algorithms, and data.
     • Although consumers contribute much of the data that algorithmic systems operate upon, these systems remain opaque black boxes, closed off to public understanding, scrutiny and control.
     • The opaque use of algorithms raises human rights concerns, specifically relating to individual rights to privacy and loss of autonomy.
     • Law is not the only solution:
       • We need to think more about ‘Good Data’ designs, e.g. Art. 25 of the GDPR: data protection by design and by default.
       • But how do we operationalise these principles and build them in practice?
  10. USING DIGITAL TOOLS TO MAKE ALGORITHMIC SYSTEMS ACCOUNTABLE
     Automated systems shape our environment and lives, and are influenced by many different actors.
  13. BUILDING BETTER SYSTEMS
     • We will conduct a series of workshops using ‘Hackathon’ formats to bring together relevant communities: software engineers, interaction designers, policy makers and government representatives, and end-users.
     • We will embed a co-design approach to imagine and create new ways to:
       • Identify hidden algorithmic constraints;
       • Participate in designing alternatives, and;
       • Propose technical solutions.
     • Our interdisciplinary approach draws on interaction design principles to create open source prototypes that:
       • Increase transparency and accountability;
       • Raise end-user awareness and understanding, and;
       • Reveal the inner workings of data collection and profiling applications and algorithms.
  14. THE TEAM
     • Professor Marcus Foth (QUT Design Lab)
     • Dr Monique Mann (Justice)
     • Associate Professor Nic Suzor (Law)
     • Associate Professor Peta Mitchell (Digital Media Research Centre)
     • Dr Kylie Pappalardo (Law)
     We have been awarded QUT grants:
     1. QUT Engagement Innovation Grant (outreach and engagement)
     2. QUT Strategic Links Pilot (research, and to develop an ARC Linkage application)
  15. THE PARTNERS
  16. THE PROTOTYPES
     • We have proposed 3 prototypes that we hope to begin co-designing and co-developing as ‘problem owners’ at the upcoming Aaron Swartz Day Internet Freedom Hack (9-11/11):
     1. Data Cooperatives and Distributed Data Justice;
     2. Re-Decentralise the Commercial Web: A Zero-Knowledge Recommendation System, and;
     3. Verbose Mode for Algorithmic Transparency: Opening the Bonnet of Explainable AI.
  17. THE PROTOTYPES: DATA COOPERATIVES
     • Re-balance the asymmetric and extractive relationship between data subjects and large corporations to achieve distributive data justice, with new business models underpinned by ethical commitments.
     • Imagine and create a system of individual and collective data stores for collective benefit that empowers individuals with greater control and autonomy. Individuals and communities should be able to:
       • Meaningfully benefit from sharing information about themselves in ways that they can understand and control;
       • Understand what they decide to give away, what is done with it, and the associated benefits (monetary or otherwise), and;
       • Share in the value of the data and the governance of the system.
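The cooperative idea above could be sketched in a few lines of code: members keep their own records, grant or revoke purpose-limited consent, and can audit every use. This is a hypothetical illustration only; the class names (`DataCooperative`, `MemberStore`), the "research" purpose, and the energy-usage example are all invented for the sketch, not part of the proposed prototype.

```python
# Toy data cooperative: members hold their own records, consent is
# purpose-limited and revocable, and every access is logged so members
# can see what was done with their data.
from dataclasses import dataclass, field

@dataclass
class MemberStore:
    """One member's records plus the purposes they have consented to."""
    records: dict
    consents: set = field(default_factory=set)    # e.g. {"research"}
    audit_log: list = field(default_factory=list)

class DataCooperative:
    def __init__(self):
        self.members = {}

    def join(self, member_id, records):
        self.members[member_id] = MemberStore(records=records)

    def grant(self, member_id, purpose):
        self.members[member_id].consents.add(purpose)

    def revoke(self, member_id, purpose):
        self.members[member_id].consents.discard(purpose)

    def query(self, purpose, key):
        """Average a field across members who consented to this purpose.

        Only the aggregate leaves the cooperative; each access is
        recorded in the contributing member's audit log.
        """
        values = []
        for store in self.members.values():
            if purpose in store.consents and key in store.records:
                store.audit_log.append((purpose, key))
                values.append(store.records[key])
        return sum(values) / len(values) if values else None

coop = DataCooperative()
coop.join("alice", {"weekly_energy_kwh": 32})
coop.join("bob", {"weekly_energy_kwh": 48})
coop.grant("alice", "research")  # Bob has not consented
print(coop.query("research", "weekly_energy_kwh"))  # only Alice's data is used
```

A real cooperative would add authentication, benefit-sharing and collective governance on top, but even this sketch shows the inversion: the default is no access, and the data subject holds the switch.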
  18. THE PROTOTYPES: ZERO-KNOWLEDGE RECOMMENDATION SYSTEM
     • Personalised, decentralised computing provides an opportunity to build a more open, diverse web that works in the interests of users, who can keep data within a personal data store and use private computing containers to run AI systems that process their data.
     • Develop a system that ingests a user's loyalty-card or banking data to feed a recommender system that interfaces with a comparison-shopping API to present the user with personalised offers. The ideal system would enable users to:
       • Receive the benefits of personalisation that works in their interests;
       • Be served recommendations sensitive to their revealed preferences (their history) as well as to their expressed intent, and;
       • Not provide any personal data to any third party.
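The key inversion in this prototype is directional: the public offer catalogue flows in, personal data never flows out. A minimal sketch of that pattern, assuming invented names throughout (`PersonalDataStore`, the category fields, the offer records are illustrative, not any real API):

```python
# Local-only recommender: ranking happens on the client, against a
# purchase history that never leaves the user's personal data store.
from collections import Counter

class PersonalDataStore:
    """Holds the user's purchase history on-device."""
    def __init__(self, purchases):
        self._purchases = list(purchases)

    def category_preferences(self):
        # Derived locally; this summary is never transmitted.
        return Counter(p["category"] for p in self._purchases)

def recommend(store, public_offers, top_n=3):
    """Rank publicly available offers against local preferences.

    Only the generic offer catalogue crosses the network boundary;
    the history and the ranking stay on the client.
    """
    prefs = store.category_preferences()
    scored = sorted(public_offers,
                    key=lambda o: prefs.get(o["category"], 0),
                    reverse=True)
    return scored[:top_n]

store = PersonalDataStore([
    {"item": "oat milk", "category": "groceries"},
    {"item": "coffee beans", "category": "groceries"},
    {"item": "novel", "category": "books"},
])
offers = [
    {"deal": "10% off books", "category": "books"},
    {"deal": "2-for-1 groceries", "category": "groceries"},
    {"deal": "cheap flights", "category": "travel"},
]
print([o["deal"] for o in recommend(store, offers, top_n=2)])
# → ['2-for-1 groceries', '10% off books']
```

The proposed prototype would go further (private computing containers, expressed intent alongside revealed preferences), but the zero-knowledge property is already visible here: the comparison-shopping side sees only a request for its public catalogue.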
  19. THE PROTOTYPES: OPENING THE AI BONNET
     • Verbose mode is a feature available in many programming tools and integrated development environments that causes code to emit human-readable explanations as it executes.
     • Create a new or experimental replica of existing big data and AI applications (such as urban data applications like journey planners, location-based recommender systems, or map-based / spatial data applications) with an open bonnet that:
       • Explicitly displays and explains to users how algorithms arrive at certain decisions or search results;
       • Provides additional details as to what the computer is doing, and;
       • Increases algorithmic transparency and reveals the inner workings of AI to users.
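A toy journey-planner scorer shows what a verbose mode might look like: the same logic runs either way, but with `verbose=True` each scoring factor prints a human-readable explanation, including a commercial weighting that would normally stay hidden. The route names, weights, and the "sponsored boost" are all invented for this sketch.

```python
# "Verbose mode" for an algorithmic decision: identical logic,
# but each factor can explain itself as it is applied.
def score_route(route, verbose=False):
    """Score a journey-planner route; explain each factor in verbose mode."""
    explanations = []
    score = 0.0

    duration_penalty = route["minutes"] * 1.0
    score -= duration_penalty
    explanations.append(
        f"Travel time: -{duration_penalty:.0f} ({route['minutes']} min, weight 1.0)")

    transfer_penalty = route["transfers"] * 5.0
    score -= transfer_penalty
    explanations.append(
        f"Transfers: -{transfer_penalty:.0f} ({route['transfers']} transfers, weight 5.0)")

    if route.get("sponsored"):
        score += 10.0
        explanations.append("Sponsored boost: +10 (commercial weighting, normally hidden)")

    if verbose:
        for line in explanations:
            print(line)
    return score

routes = [
    {"name": "Bus 330", "minutes": 25, "transfers": 0},
    {"name": "Train + Bus", "minutes": 18, "transfers": 1, "sponsored": True},
]
best = max(routes, key=lambda r: score_route(r, verbose=True))
print("Chosen:", best["name"])  # → Chosen: Train + Bus
```

Run silently, the planner just picks "Train + Bus"; in verbose mode the user can see that a sponsorship weighting, not travel time alone, tipped the ranking — which is exactly the kind of inner working the prototype aims to surface.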
  20. THE (EQUITABLE TECH) FUTURE
     • Co-host a further 2 design workshops / hackathons in 2019;
     • Work with ThoughtWorks to develop a social media data visualisation tool;
     • Make all prototypes available open source online;
     • Develop and publish reports, toolkits and academic articles outlining our approach and findings, and;
     • Submit an ARC Linkage application to continue working towards ‘an equitable tech future’.
  21. #DefendingEncryptionHack
  22. #DefendingEncryptionHack AARON SWARTZ
  23. #DefendingEncryptionHack AARON SWARTZ DAY
     Aaron Swartz Day was founded in 2013, after the death of Aaron Swartz, with these combined goals:
     ● To draw attention to what happened to Aaron, in the hope of stopping it from happening to anyone else.
     ● To provide a yearly showcase of many of the projects that Aaron started before his death.
     ● To provide a yearly showcase of new projects directly inspired by Aaron and his work.
  24. #DefendingEncryptionHack INTERNET FREEDOM HACK
     ● Internet Freedom Hack is a series of community events that bring technologists with a passion for digital rights together to build things that advance the cause of internet freedom.
     ● It currently runs in Brisbane and Melbourne twice a year, hosted by ThoughtWorks.
  25. #DefendingEncryptionHack IFH V.1
     ● 17th - 19th November 2017
     ● Location: Brisbane
     ● 5 Projects:
       ○ Internet Freedom Launchpad by Kai: A cloud-based dynamic and disposable VPN solution that gives you a new instance every time you use it and stores no logs.
       ○ DecentralisedU by Privacy Enablers: A decentralised system through which users can own and control their data!
       ○ L33t Sp3ak 2.0 by #!: Cryptography that's readable by humans but not by machines!
       ○ WhatchaNo?: A platform that will show you the posts with the worst sentiment from your social media accounts.
       ○ Free Elbownia! by Crypto Anarchists: A privacy game where Agent Frankie F from Freedonia infiltrates Elbownia to extract information that will be used to destroy the "Great Elbownian Firewall".
  26. #DefendingEncryptionHack IFH V.2
     ● 20th - 22nd April 2018
     ● Location: Brisbane, Melbourne
     ● 10 Projects:
       ○ Infinite Monkeys: Combating fake news
       ○ Charlie: Flight price analysis
       ○ Phone Case: Prevents your microphone from hearing what you say
       ○ Decentralised Private Instant Messaging (DPIM)
       ○ Limited Information Tetration Encryption (Lite)
       ○ AuData: Protocol to help track the origin of a piece of information
       ○ eHealth record opt-out campaign
       ○ FaceWhatever: An offline quiz where you can test what Facebook knows about you
       ○ Incognito on Steroids: Obfuscating your digital fingerprint
       ○ Anti-IRA: Analyse connections between social media accounts
  27. #DefendingEncryptionHack IFH V.3
     ● 9th - 11th November 2018
     ● Location: Brisbane, Melbourne
     ● Theme: Defending Encryption
     ● Talks: Barrett Brown, Claire Peters, Tim Wilson-Brown, Angus Murray, Monique Mann
  28. #DefendingEncryptionHack How to get involved
     ● Visit:
     ● Register
     ● Show up
  29. Limited or no software development / tech skill required
     ● Highlight instances of privacy zuckering and shame the perpetrators of this and/or other types of dark pattern. Follow the lead of the hall of shame page that already exists, but amplify the message!
     ● A camera or microphone patch for phones (something easy to use).
     ● Technology is political, as the Cambridge Analytica scandal has amply demonstrated. What can we do to bring more technologists or tech companies into the privacy movement, or to encourage more consideration of ethics in their decision making?
     ● Many problems that privacy advocates rail against cannot be solved simply with technology. In a lot of cases a political or legislative push is required. How can we, using technology or otherwise, help with that push?
     ● Can we help to support a campaign against the government's proposed mandatory decryption "war on maths" laws?
     Tech skills optional
     ● IsCentrelinkDown is a tool that checks the availability of Centrelink phone lines. It was created to debunk the claims of availability that are frequently made by the Department of Human Services. The source code has been opened up just in time for the Hack!
     ● GDPR requires corporations to publish a list of who they share data with. We could harvest and visualise this data, to try to uncover some of the data dealing that's going on.
     Tech and non-tech skills required
     ● Collate instructions or links describing how you can delete or limit your data on datenkraken.
     ● Visualise where the noise in Twitter for a specific topic is coming from.
     ● Something like but for VPNs, or for other privacy tools. Or improve existing projects like or
     ● A video-, text-, board-, or card game to teach basic cyber security to victimised demographics.
     Mostly tech skills required
     ● Harvest social media posts containing the word "fact" (or otherwise determine that posts contain a claim of fact) and put them on a stackexchange-style website where people can research and confirm or reject them (citing sources). (Talk to Robin)
     ● Determine your advertising profile based on a limited set of data like your Facebook likes, or Google Maps locations visited, to demonstrate the power of aggregation.
     ● Find and highlight "interesting" data in a download of your Facebook data or Google Takeout data. (Talk to Pam)
     ● Create a virtual or physical space where activists can work from a secure server with the software required to do their work. It could include access to nextcloud with collabora or cryptpad.
     #DefendingEncryptionHack