Thingscon 2018 - the meaning of Responsible Technology
1. the meaning of
responsible technology
Dr Laura James
@LaurieJ
doteveryone.org.uk / @DoteveryoneUK
www.trusttech.cam.ac.uk / @CamTrustTech
@Cambridge_CL
@MaintenanceFest / @TheImpactUnion
11. • We want to see responsible creation and operation of technologies in practice
• We take a systems perspective, looking at technology products, services and systems and how they fit with people and society
12. Business models, ownership and control: The business model, and the ownership and control of the organisation, and the product or service, are responsible and appropriate
Employment and working conditions: Inclusive employment, fair pay and conditions, including at suppliers
Reward for contributions: Fair reward to all those contributing information or other effort
Societal impact: The impact of the tech on public/societal value is positive or neutral
Unintended consequences: There has been consideration of systems effects, side effects, potential harms and unintended consequences
Maintenance, service and support: Consideration of maintenance, service and support over the long term
Understandability: People can easily find out and understand how the product/service works
Standards and best practice: Relevant tech standards and best practices, and systems design are used and evident
Usability: If a broad range of users are expected, or if some users may be compelled to use the product or service, it should be accessible and have appropriate support
Context and Environment: The context of the system, product or service has been considered and addressed appropriately, including sustainability considerations
57. We can engineer trustworthy digital products, services and systems, which are competently made, reliable, and honest about what they do and how they do it.
58. •talk about the grey areas, with nuance, humility & empathy
•champion responsible tech in your world
•move a bit slower and think about things
Hi, I’m Laura
I work at Doteveryone, and the Trust and technology initiative, and I have a couple of side projects.
first of all, I’m talking about responsible technology.
but I don’t actually like that phrase
technology is an object
it cannot be responsible
people can be responsible.
and so I think about making the tech industry more responsible, getting more responsible practice in how technology is developed
still, it’s a good hashtag.
nearly 11 years ago, I shipped a connected home system. We tried hard to do the right thing. we thought about what would happen if networks failed. how to secure a system and do key exchange. how people could sell gear on eBay later on.
I have become very frustrated with seeing poorly designed IoT systems in the years since. systems that assume only one person lives in a house and wishes to control things. Systems that break when network coverage isn’t perfect. dreadfully insecure systems meant for use in the most intimate parts of our lives.
how could we as a sector build so badly, when it was possible to do well? we knew how to do this stuff; but it wasn’t done.
there are so many examples of irresponsible practice, particularly but not exclusively in IoT. devices made with no serious plan to keep them operating. insecure products. bad user interfaces. unreliable devices. dark patterns to extract data or exploit users. biased machine learning. or what is perhaps worse, machine learning vastly overclaiming what it can achieve, or not offering an improvement over a manual system.
the fears and distrust arising from the Silicon Valley backlash will affect all of us. Don’t imagine that just because people still use Facebook that they don’t care about the problems with it, or even that they trust it. as the tech industry, we need to do better and be seen to do better.
people look to the developers and operators of technology to act responsibly, to be honest about what they do, so that we can have confidence in their products and services.
when IoT and other internet gadgets were just toys this was less of an issue. Now the internet is in all parts of our lives and societies, and the problems it can cause are not just online. Whilst it’s easy to think it’s just an issue for the big Silicon Valley companies, or about personal information, it’s not. it is about all technology.
The world of technology development is maturing and, like other sectors at this stage, it’s a time for reflection, stabilisation, building good practice. It will take time: a shift in culture and practice, a change in the relationships between tech and government and society, and new ways of measuring things.
Building technology responsibly is simply what we should be doing: in big corporates and startups, in nonprofits and communities, when building electronics or software, for infrastructure and products and toys and tools.
At Doteveryone, I started exploring what responsibility meant in technology development in 2016. What does good look like? what does bad look like?
everyone’s talking about data, privacy and machine learning. but that’s just the current hype cycle, it’s not the whole thing.
a few quick points about our work -
Doteveryone is defining responsible tech based on our values as an organisation, with a European mindset.
These are not global values. It’s not the view of Silicon Valley, or China, for instance.
It’s worth noting that we don’t talk much about ethics - partly because ethics is so personal and culturally specific, and changes over time. and so many of the issues which frustrate us about tech today are not really about ethics - just straightforward bad practice.
Doteveryone wants to see responsible technologies in practice, not just theoretical frameworks - there are many of them, ethical manifestos and oaths, etc (we made a database of those this year - so many! so ignored in practice!) we don’t need more theory now.
We take a systems perspective, looking beyond specific niche fields. My fitness tracker is a wearable, it’s IoT, it’s a mobile app, it’s machine learning - many things. Responsibility cuts across these domains.
this slide has far too many words on it.
I defined a landscape - 10 aspects
overlapping; some more technical, some more business. you can’t separate these out - what you do, why you do it, how you make money - all intertwined.
and there’s a lot hidden in here. for instance, ‘standards and best practice’ carefully elides over a whole bunch of specifics, depending on sector and product or service type. the idea is to show different parts of a tech development organisation that impact overall responsibility, and to raise awareness of the different ways your organisation could be responsible. Or irresponsible.
some of these may seem obvious to you. making usable products, that are understandable. doing design. But it turns out these aren’t obvious to everyone. There are many developers not in this room who simply don’t think like this. Just last week I was talking with someone who is active in, well, let’s not name them, a specific industry tech network near where I live. And he said that user-centric design - just thinking about users at all - just isn’t something that community thinks about. And yet to me that’s just obviously what you do - you look at who will use your product and what for and you design for that. My background, the engineering training I’ve received, the cultures I’ve worked in, are just fundamentally different to many others. It’s important to remember our diversity within tech, in all the forms diversity can take - both visible, and hidden.
some of these are more radical. designing a business model and organisational ownership and control structure that are responsible and appropriate for the product - you may only be able to do that if you are starting out. It implies you might think about where investment comes from and what you give back in exchange, and what that might motivate or mean. Fair reward for contributions - including of data. do the people who supply data for your service get fair value in return? do you help public data suppliers improve their data by submitting bug reports? do you give back to the open source projects you use? do you really fairly compensate anyone providing micro labour, such as on Mechanical Turk, which supports your service? if people create content and you benefit from that, do they really get a fair return?
if you dive into these in more detail - and we don’t actually go very detailed, these are to make you think, not to give you an instruction set - you can see some subtlety. Perhaps your product will never be ‘understandable’ to the people who use it every day - because it’s complex and technical.
But you can be clear and honest for them; and perhaps understandable to consumer groups, who might be recommending or reviewing your product; or understandable to regulators, who can see how your system operates because you are being transparent (and maybe compare it to others, and know what’s possible)
you can see there’s not one correct answer. but the point of responsibility is to think about these things, to avoid the worst abuses, to do what you can to do stuff right, to be open and honest about what you do.
so some doteveryone perspectives:
Responsibility is a scale: there are many trade offs. we can always do better.
Complexity needs to be embraced: technology itself is complex. the wider world is complex too. We need to be more sophisticated in our day-to-day business operations, in the ways we talk about and map the trade-offs of technology, as well as accepting that there will always be conflict and we won’t always get it right, but we need to try - and then try again
For the most part, people don’t want to create harm. If given the opportunity to reflect, raise concerns, people will be responsible. Up to a point.
transforming the tech industry. we’ve been changing and evolving how we work since human organisations were invented, and this is no different. We now need to embed new practices and new ways of working in order to be more responsible, and this requires long-term commitment, visible and inspiring leadership, common language and understanding, and enthusiastic and smart people who want to learn. luckily, that’s what the tech sector says it is like ;)
finally, where technology is introduced in a thoughtful and consultative way, the risks of harm are reduced and everyone benefits.
but how do we move the dial? there is no perfect
even if you are working on a really important and meaningful tech4good project
in a nonprofit, say, with great stakeholder engagement
you won’t get everything right.
we should note it’s easier to criticise (and help) projects that are not ‘for good’. projects in sustainable development or humanitarian aid don’t always get everything right, and it’s hard to point out where they might be making mistakes, because you know they are struggling, so you don’t weigh in… and so problems can last longer than they should.
more importantly though, the devil is in the details. nothing is black and white. most decisions are complex, grey areas.
plus there will always be scammers, people trying to get rich quick, even criminals. we hope they don’t dominate, but they will always cluster around power, and that’s tech right now
There are big questions, such as whether your business model’s use of data is responsible, whether you are giving fair value to those who contribute effort or data to your service. then there are little questions, such as whether you actually implement the security feature or comply with the voluntary standard, when that might delay your ship date, and if you miss that, you lose out on funding or revenue, and then the company might collapse.
all of this isn’t easy. but it’s important. people’s experience with technology affects their whole lives; it changes how they feel about the internet, about the people who make technology.
Doteveryone recently ran a nationally representative survey in the UK. let’s see how you compare
how many of you feel the internet has made your life better? hands up?
ok and how many of you feel the internet has made your life worse?
half the people we surveyed felt the internet had really improved things for them individually
but many don’t see the same benefit for society as a whole
people want technology that is useful, has benefits definitely outweighing harms, that we can rely on.
we need to work at this. we can’t just react to scandals, when they occur.
we need a systems change approach. Who are the players in the tech ecosystem, whose actions could change the way we design and develop technology? Bear in mind that different actions will be appropriate on different timeframes, too. some change can be quick, some will be slow.
people power - the voice of the customer.
building digital understanding, a critical approach. not just basic digital skills. even with that, though, it’s a lot to expect real people - with busy, complicated and messy lives - to choose responsible tech. cheap, convenient irresponsible tech has a huge appeal. So many of our internet services are great for us as individuals, even if they are less good for society or communities overall. so it’s a tough choice, and one we cannot expect everyone to make, given constraints of energy, money and time. not everyone is a digital hipster, buying their Fairphones, paying for their apps, fiddling with open source.
so there’s also a role for enabling collective action, empowering civil society, the social sector, to demand and use better tech. and to push back on irresponsible tech and raise complaints.
then we come to the people who make technology.
we venerate technical teams
obsess over heroic technical leaders - the Musks, Zuckerbergs, and so on
but we need to remember that with great power comes great responsibility
personal ethics of the makers of technology will drive change for some. As we’ve heard here, there’s more awareness of tech ethics than before. that’s great. but I don’t think that personal motivations alone will make significant near term change.
you have to know what you do. and then, you have to DO something about it. do you challenge your manager when they ask for something that looks like bad practice? do you quit? or do you ship the insecure product because that’s what the company needs? Not easy.
even if you care, not everyone knows what good looks like. many technical people haven’t had social science or design training. so we need tools and practices - such as those doteveryone is developing, I will come back to them.
power as a workforce - the Google walkouts. these depend on the privilege of tech workers being able to leave - not everyone can turn down the big salary, or walk away from it. and money is addictive.
and there’s the dark matter in the tech workforce - people who just aren’t aware of, or interested in, these issues
how we educate developers, engineers, designers - do we need more ethics training? tricky, as many people come to digital from other careers; we don’t all have degrees. Do we want them to demonstrate high standards through professional certification, annual checks? [i’m unusual in that i am a chartered engineer; but i don’t see tech jobs caring about that. also my career is clearly unusual, as I strongly weight working on things i can believe in]
business models drive what happens in businesses. they create the culture that supports and motivates more or less responsible behaviour.
business certifications - BCorps, responsible100 - programmes for businesses to demonstrate their good practice. These aren’t perfect either - BCorps doesn’t have much tech content. but it forces change throughout an organisation and reduces ethics washing through audits and checks.
businesses are also sometimes customers of tech - and they can play a role in demanding better practices. here we might see more progress - businesses want to manage risk, they are happy for suppliers who adhere to standards, they don’t want unexpected costs.
investors. tech businesses often need risk capital. especially if you are building hardware. what can you offer in return for this? what will your investors expect, and when?
we should remember that funding can come from the state as well as private hands. crowdfunding looks good, but doesn’t usually cover the real costs of setting up production.
I’m actually encouraged here, there are new kinds of investors looking at the tech scene. The sort of investors that don’t want to make a quick buck, but will support longer term growth. Or who don’t want an exit - the drive for an exit is not always a responsible incentive; it depends on what your business does. Investors that take a cut of profit instead of wanting their capital at exit, for instance. Steward capital, indie vc and more.
and of course bold policy making. Not just the kind of regulation everyone dismisses as stifling innovation, but well designed interventions.
demanding the use of standards. designing markets to avoid domination or bad practice. we need this - safety and security are things the market is simply not going to provide effectively in the complex tech space.
regulators don’t have to be slow moving and ill informed about tech. Many are not - they use innovative techniques, sandboxes for instance; they hire or second technical experts.
I’ve heard people here complaining about government. yes, they need greater digital understanding. we need to help them learn, take secondments and opportunities to support their work.
we also need to go to their spaces, to their communities, and to think about the constraints and challenges they face.
if you are frustrated by your local or national government, remember it is your government. you can be a public interest technologist, as bruce schneier puts it, working in government, in the public sector, or actively in civil society making change. Don’t just moan that they are slow. Society is also slower than tech. that’s just how it is.
OK, on to a specific - Doteveryone has been working on how to improve industry practice.
We refined the 10 aspects I showed you into 3, punchier areas, focussed on the tech side, rather than the responsible business practice stuff. things like paying minimum wage, considering the conditions of contract staff and the ways subcontractors work. (which is well covered elsewhere)
now we have the 3 Cs model, focussing on different ways to think about what you are building. we expect this to fit into development processes, perhaps into an agile cycle, or as part of periodic business review.
so, context - thinking about the bigger picture, of where and how and when the tech is used, who by.
it can be useful to think about this through different lenses reflecting the different places technology fits in people’s lives - and the role of groups as well as individuals
for this, we encourage product teams to think about the context in a variety of ways, and we can provide or point them at tools which help.
inclusive design personas, contextual user journeys, technical trade offs
consequences - what might happen when your technology is out in the world.
people tend to focus, rightly, on ‘unintended negative consequences’, but we should also remember that success can be a consequence and should be planned for. What if you can’t scale your support team? what if your factory lets you down? what if you come to dominate your market, but still don’t make money?
again there are several ways to think through potential consequences.
effects on the environment, through energy or materials; effects on workers
resilience, security and reliability - can you trust your suppliers? what about your digital supply chain? nothing worse than an IoT system going offline because some incidental service goes down. like O2 yesterday
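To make that resilience point concrete, here is a minimal sketch (the device names and schedule shape are invented for illustration, not from the talk): a connected device can hold a last-known-good local state, so an outage in an incidental cloud service degrades the product gracefully instead of taking it offline.

```python
import json
import urllib.error
import urllib.request

# Last known-good state, kept on the device so it still works offline.
# The schedule contents here are purely illustrative.
_cached_schedule = {"heating": "on", "target_temp": 19}

def fetch_schedule(url, timeout=2):
    """Try the cloud service; fall back to the local cache if it is down."""
    global _cached_schedule
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            # Fresh data from the cloud: update the cache and use it.
            _cached_schedule = json.loads(resp.read())
            return _cached_schedule
    except (urllib.error.URLError, OSError, ValueError):
        # Cloud unreachable or response garbled: degrade gracefully
        # to the cached state rather than failing the whole system.
        return _cached_schedule
```

The design choice is simply that the network is treated as an optimisation, not a dependency: the device is useful with zero connectivity, and better with it.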
Contribution is about considering all of the ways value can be contributed and shared through a tech product or service. the exchange of value taking place within your technology product or service should be shared publicly and understandably. And ideally should be fair.
Value being contributed in the creation and maintenance of your technology in this sense could mean:
Formal microlabour such as Mechanical Turkers or content checkers
Informal labour such as community moderators or users completing captchas
Information such as datasets, open source code, sensor data, personal information, and so on
Value being shared includes: the value you receive from users that supports your business model; the value you offer them; and the value you contribute to your community and/or the world
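The 3 Cs above could be run as a lightweight checklist within a development cycle. This is an illustrative sketch only: the prompts are paraphrased from this talk, and the structure is mine, not Doteveryone’s actual tool.

```python
# Illustrative only: prompts paraphrased from the talk, not Doteveryone's tool.
THREE_CS = {
    "Context": [
        "Who uses this, where, when, and alongside what else?",
        "Have we considered groups and society, not just individuals?",
    ],
    "Consequences": [
        "What unintended negative consequences could there be?",
        "What happens if we succeed faster than we can support?",
    ],
    "Contribution": [
        "Who contributes data, labour or content, and is the exchange fair?",
        "Is the exchange of value stated publicly and understandably?",
    ],
}

def open_items(answers):
    """Return (area, prompt) pairs the team has not yet answered."""
    return [
        (area, prompt)
        for area, prompts in THREE_CS.items()
        for prompt in prompts
        if not answers.get(prompt)
    ]
```

A team might call `open_items` at the end of each sprint review and treat any unanswered prompt as a backlog item, which is one way such a framework could slot into an agile cycle.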
so we have created a structured approach to creating responsible technology that considers the 3 Cs; we’ve prototyped these with a range of businesses to see how they work in practice. and we’ve found that reflection and a framework are really helpful, even for businesses which consider themselves responsible.
and we are developing this into proper tools for business
we have this business sales pitch now
we need arguments to make to more profit-minded business.
about risk mitigation, in particular, which companies understand. and crisis management - no one wants to be Facebook this year, reacting so poorly to problems.
to get businesses who aren’t in this room to be responsible, we need to make cases which appeal to them. like these arguments here - it might not be why we are responsible but it’s how we can get others to be. we’re starting to build evidence to demonstrate these.
pioneering companies will adopt first, before hopefully wider sector adoption. companies won’t do everything right and they’ll make horrible mistakes, but at least it gets on the agenda at board level and below.
aside from business process change, we can do a lot by being thoughtful engineers, designers, leaders. pursue meaningful projects, do stuff right, talk about hard stuff, work together to find solutions
but a lot of what seems irresponsible today comes down to incentives, motivations, money. in business, and in the wider world.
we can enable motivated individual contributors in the tech world to work to higher standards, motivate better performance in their workplace, and call to account unethical or irresponsible practices.
but we also need to change what is valued and measured in tech, or in some of tech - showing alternative ways of operating. Showing different forms of value, other than cost savings, efficiencies, short term gains. Showing the values of supporting a healthy environment, thriving workers and citizens, for instance. Remembering to prioritise human values, and that not everything can be measured and optimised (or at least, not without great loss). this is not a quick journey - it is a slow one facing our countries as well as companies.
responsibility isn’t a free lunch – it usually takes work or sacrifice. Consider an organisation torn between good user experience and the viral effects their investors want to see for the highest growth. knowing what is right is not always obvious, and knowing what to do about it even less so.
It’s easier for organisations starting out to be responsible; their values, the people they recruit, the customers and investors they target are more aligned.
it’s very hard when your competitor may be moving fast and breaking things
i could make a case that capitalism has got us here - we live in a strongly capitalist society and there are some particularly insidious effects where network effects interact with risk capital and incentivise growth at all costs.
we should design for individual needs, collective needs and society’s needs. they won’t always be aligned.
think of uber - cheap and convenient for me; adding congestion to my city, so not so good overall.
We can imagine a future of equitable, collectively owned technologies, open source, fully accessible and useful, not dependent on exploited labour, energy efficient, well designed, solving important problems; no one making inequitable wealth off the back of others’ labour or data. but that is utopian. it is not likely any time soon. We can’t wait for the end of capitalism, if and when it comes.
I prefer to think about the changes we can make today.
we can choose to design organisations and digital products more thoughtfully. We can build social enterprises, co-operatives, and hybrid models including non-profits and trusts to make tech more equitable and more useful and better for everyone in society.
Some of us will be pioneers building radical new things in new ways. Others will change their existing organisations and products in smaller ways.
we need pioneers to try radical new ideas around responsibility - to show what might be possible
we need others to follow them, to build on the learnings of their experiments and create more robust technology which is both responsible and viable and sustainable.
and we need people who might do less radical things, but will enable the largest scale of change.
we can do better even just with basic common sense, thinking through risks and planning sensibly. (If you are designing a lock, think about how malicious people could open it!)
Incidents of bad design affect the perception of tech as a whole. We need to do better at calling out silly mistakes early, and with a mix of critique and support, helping each other to build better products, learning together
Demonstrating the value of good design and engineering practice is worthwhile. developing and using standards will help - even more commercially-minded companies will follow industry standards. tools and processes to make it easier for those to whom this does not come naturally; and pioneers like ourselves constructively helping those who are still finding their way. making the case for externalities (in privacy, in the environment, in society) to be considered. championing good examples, and helping people make informed choices about tech which is right for them.
We need more of us making connected products responsibly, and showing that we can build successful businesses around them.
we’ll never get everyone being responsible, but we can move the dial
we are not alone. The Zebra movement - and I love their concept as pictured here - is less focussed on tech, but shares, I think, many similar values to us.
We can’t engineer people’s trust. That would be manipulative – and sometimes people are right not to trust some technologies.
We can engineer trustworthy digital systems, competently made, reliable, and honest about what they do and how they do it.
It is an individual responsibility on each developer, designer, leader.
so, what is the meaning of responsible technology? in ideation, design, development, deployment, operation, maintenance, and end of life?
it means a cultural shift in the technology world. away from moving fast and breaking things; away from heroes disrupting sectors; away from innovation for the sake of innovation. It means building useful products that help people and their communities. It means being part of the wider world; being humble; listening to and designing for real people and for society, with empathy.
it’s not all going to be perfect. My Fairphone, here, is a great example of responsible tech - thinking about working conditions, environmental impact. And the Fairphone team admit it’s not perfect. it’s not a fair phone. But it’s much better than the alternatives. when you see or own some responsible technology, big it up. tell your friends. most importantly, tell people who don’t work with technology! your friends and neighbours, your political representatives, companies who try to hire you. Show them what good looks like.
Some products and services will still fail. Some will have unexpected side effects. but we can do better than the big tech sector has done so far. and we can be seen to do better. We’re mostly in Europe and we have what I would call European values. I don’t think enough people in Europe know that Europe is pioneering responsible tech and that there are alternatives to Silicon Valley tech.
we can show those who aren’t here that we exist; that tech can be different. that we are not all Silicon Valley. that responsible technology exists, that it’s wonderful. We should support each other, champion great examples. make responsible technology the new normal.
thank you