This document discusses regulating social media platforms and algorithms through interoperability requirements. It argues that self-regulation by technology companies is not sufficient and can permit problems such as algorithmic discrimination. The document examines options for regulation, including:
1. Requiring dominant platforms to provide interoperability through open APIs, allowing competitors access to user data and functionality.
2. Learning from regulation of other industries, such as telecoms, that mandates interconnection.
3. Developing ethical standards and impact assessments for algorithms to address issues like bias, while noting these may not be suitable in all cases.
Overall, it argues that a systematic approach is needed to ensure proper oversight of, and redress for, users affected by automated systems.
SCL Annual Conference 2019: Regulating social media platforms for interoperability
1. Regulating social media platforms for interoperability
SCL Annual Conference
2 October 2019
Professor Chris Marsden
University of Sussex School of Law
5. Largely governed through self-regulation
Technology giants appear set to persuade us that self-regulation remains the only effective route to legal accountability for machine learning systems, jeopardising the sustainable introduction of smart contracts, permitting algorithmic discrimination and compromising the implementation of privacy law.
7. Discriminatory data is likely to lead to discriminatory results
Discriminatory algorithms, as well as those not designed to filter out discrimination, can make those results more discriminatory. Justice requires that lawyers study algorithmic outcomes in order to ascertain such discrimination, which may be highly inefficient as well as an affront to natural justice and fundamental rights.
8. Public administration has generic solutions
Administrative law
Natural justice – at least ‘reasonableness’
Right to explanation/remedy?
Discrimination law – applies to corporate decisions
Specialist technology law
Biomedical/nanotech
Railways, roads, telecoms
Data protection
11. Council of Europe: to err is human; invoking AI complexity does not absolve responsibility
12. Caveat: regulation may not be suitable, appropriate or feasible for many algorithms
But for those algorithms that regulators are most concerned about, in the sectors making the most sensitive socioeconomic decisions, it is a remedy that can be explored.
13. Sensitive public-facing sectors?
Banking/Credit, Insurance,
Medical Care & Research,
Social Care,
Policing and Security,
Education,
Transport (AI-piloted airliners & autonomous vehicles),
Social media,
Telecommunications.
14. Transparency and replicability are not the solutions to AI/ML problems
Transparency is the first requirement of legal recourse (though some algorithms can be reverse engineered without transparency “under the hood” of the machine). It is not sufficient, however, for several reasons, despite claims that the ability to study an algorithm and its operation provides a remedy for users who suffer as a result of its decisions.
15. Things change!
Both the training data and the algorithm itself change constantly: it is impossible, for example, to forecast the real-time outcomes of Google searches, and a vast SEO business attempts approximations without complete accuracy. The only remedy that can be achieved is replicability: taking an ‘old’ algorithm and its data at a previous point in time to demonstrate whether the algorithm and data became discriminatory. Consider just how incomplete a remedy that is: it effectively allows ‘slow motion replays’ while the game rushes onwards.
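The replicability remedy described on this slide can be sketched in a few lines of Python. This is a minimal illustration, not any platform's actual tooling: the snapshot structure, decision rules and applicant data are all hypothetical. The point is that by freezing an algorithm and its data at two points in time, a simple fairness metric replayed over both snapshots shows when outcomes became discriminatory, even if the rule itself never changed.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Tuple

# A hypothetical snapshot: the frozen decision rule plus the data it saw.
@dataclass(frozen=True)
class Snapshot:
    label: str
    decide: Callable[[Dict], bool]   # e.g. approve/deny a loan
    applicants: Tuple[Dict, ...]     # the data at that point in time

def approval_rate(snap: Snapshot, group: str) -> float:
    """Share of applicants in `group` that the frozen rule approves."""
    members = [a for a in snap.applicants if a["group"] == group]
    return sum(snap.decide(a) for a in members) / len(members)

def disparity(snap: Snapshot) -> float:
    """Demographic-parity gap between groups A and B (0.0 = no gap)."""
    return abs(approval_rate(snap, "A") - approval_rate(snap, "B"))

# --- Hypothetical 'slow motion replay' of two points in time ---
applicants_2018 = tuple(
    {"group": g, "score": s}
    for g, s in [("A", 70), ("A", 55), ("B", 68), ("B", 52)]
)
applicants_2019 = tuple(
    {"group": g, "score": s}
    for g, s in [("A", 70), ("A", 55), ("B", 52), ("B", 41)]  # drifted data
)

rule = lambda a: a["score"] >= 50          # the rule itself never changes
old = Snapshot("2018", rule, applicants_2018)
new = Snapshot("2019", rule, applicants_2019)

# Replaying both snapshots shows when the outcomes became discriminatory.
print(disparity(old))  # 0.0 – both groups approved at the same rate
print(disparity(new))  # 0.5 – group B's approval rate has collapsed
```

The replay only ever shows what the algorithm *was* doing; as the slide notes, the live system has already moved on.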
17. AI regulation and 'ethics washing'
Undertaken by technology companies and their professional advisors to persuade policy makers that self-regulation is the only effective route to legal accountability for machine learning systems:
1. jeopardising the sustainable introduction of smart contracts,
2. permitting algorithmic discrimination and
3. compromising implementation of data protection law.
19. Ethics washing will fail
Cursory research into the history of communications regulation and Internet law demonstrates the falsity of this self-regulation proposition. See:
Marsden, C. (2018) “Prosumer Law and Network Platform Regulation: The Long View Towards Creating Offdata”, 2 Georgetown Tech. L.R. 2, pp. 376-398;
Marsden, C. and Meyer, T. (2019) Report for European Parliament: “The effects of automated content recognition (ACR) technology-based disinformation initiatives on freedom of expression and media pluralism”.
20. Need for systematic redress by an external agency
Wagner, B. (2019) “Liable, but Not in Control? Ensuring Meaningful Human Agency in Automated Decision-Making Systems”, Policy & Internet, Vol. 11, No. 1, pp. 104-122, at https://onlinelibrary.wiley.com/doi/pdf/10.1002/poi3.198
Examples: self-driving cars, police searches using social media/PNR, Facebook content moderation.
22. What can and should be done?
1. Ethical standards for all AI deployed in the ‘wild’ – to the public: ISO standards being formed; basic privacy/human rights impact assessment
2. Non-mandated interoperability for public communications providers – Instant Messaging/Search/Social Media companies
3. APIs of dominant (SMP) operators opened
Based on the Microsoft remedies in the longest, most expensive antitrust case in EC history: case started in 1993 in the US, EU 1998-2010. Google case started 2009 – ongoing a decade later: Commission decision of 27 June 2017, Case AT.39740 – Google Search (Shopping).
23. 1. Ethical standards for all AI deployed in ‘wild’ – to public
ISO standards being formed
1. Can be quite powerful influencers, c.f. ISO 27001 on cybersecurity
2. Typically a technical engineering realm, not normative standards
3. Embedded in national laws, they can become a weak co-regulatory signal
Basic privacy/human rights impact assessment
1. Proposed by UN Rapporteur Prof. David Kaye
2. Also see ‘Regulating Code’ (Brown/Marsden)
3. AI impact assessment suggested by the European Data Protection Supervisor
24. Standards still important!
Standards Australia chairing ISO Working Party: ISO/IEC JTC 1/SC 42 Artificial intelligence, https://www.iso.org/committee/6794475.html
Australian Computer Society AI Ethics Committee: https://www.acs.org.au/governance/ai-ethics-committee.html
Data61 (Australian Commonwealth Scientific and Industrial Research Organisation, CSIRO): Dawson D, Schleiger E, Horton J, McLaughlin J, Robinson C, Quezada G, Scowcroft J and Hajkowicz S (2019) Artificial Intelligence: Australia’s Ethics Framework. Data61 CSIRO, https://data61.csiro.au/en/Our-Work/AI-Framework
Greenleaf, Graham, Clarke, Roger and Lindsay, David F. (2019) Does AI Need Governance? The Potential Roles of a ‘Responsible Innovation Organisation’ in Australia; Submission to the Human Rights Commissioner on the White Paper Artificial Intelligence: Governance and Leadership, http://dx.doi.org/10.2139/ssrn.3346149
UK Information Commissioner’s Office, Feedback request – profiling and automated decision-making, 6 April 2017, https://ico.org.uk/media/about-the-ico/consultations/2013894/ico-feedback-request-profiling-and-automated-decisionmaking.pdf
25. Interoperability as an algorithmic regulatory remedy
An attempt to move beyond glances in the rear-view mirror (the Silicon Valley mantra is “move fast and break things”): enforce access to the dominant regulated company’s APIs (Application Programming Interfaces). This enables brokers, comparator programmes and regulators to access algorithms in real time and under controlled conditions to observe the algorithm’s behaviour.
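As a sketch of how such real-time API access could be used, consider a regulator or comparator service submitting matched test profiles that differ only in one protected attribute and logging the live decisions. Everything here is hypothetical: the stand-in decision function, field names and audit structure are illustrative only, not any platform's actual API.

```python
import json
from typing import Callable, Dict, List

# Hypothetical stand-in for a dominant platform's decision API.
# In practice this would be an HTTP call to the mandated open API.
def platform_decision_api(profile: Dict) -> Dict:
    return {"approved": profile["income"] >= 30000}

def audit_matched_pairs(call_api: Callable[[Dict], Dict],
                        base_profile: Dict,
                        attribute: str,
                        values: List[str]) -> List[Dict]:
    """Send profiles identical except for one protected attribute,
    recording each live decision for the audit trail."""
    log = []
    for value in values:
        probe = {**base_profile, attribute: value}
        result = call_api(probe)
        log.append({"probe": probe, "approved": result["approved"]})
    return log

log = audit_matched_pairs(
    platform_decision_api,
    base_profile={"income": 45000, "postcode": "BN1"},
    attribute="gender",
    values=["F", "M", "X"],
)

# If outcomes differ across the matched probes, the auditor has
# real-time evidence of disparate treatment to investigate.
outcomes = {entry["probe"]["gender"]: entry["approved"] for entry in log}
print(json.dumps(outcomes))  # {"F": true, "M": true, "X": true}
```

The controlled-conditions point from the slide matters here: probes must be matched and logged so the observation is replicable, rather than a one-off glance at a constantly changing system.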
26. 2. Interoperability option for public communications providers
Instant Messaging/Search/Social Media companies
1. Not so radical – required for broadcasters and telcos
1. Electronic Programme Guides
2. Telephone numbering schemes
3. NOT interconnection – up to smaller IMs to decide how to comply
4. Co-regulatory standards
2. Not as utilities but as media providers
1. This is NOT common carrier regulation
2. Not equivalent to energy/postal providers
3. Not as publishers but as printers
1. Arguments on fake news/hate speech are for another time
2. Attempts to impose a ‘Duty of Care’ fiduciary in the UK/US are highly inappropriate
29. EU Commissioner Vestager on interoperability and large platforms
3 June speech: “Competition and the Digital Economy”, https://ec.europa.eu/commission/commissioners/2014-2019/vestager/announcements/competition-and-digital-economy_en
“Making sure that products made by one company will work properly with those made by others – can be vital to keep markets open for competition.”
Approval of Microsoft’s takeover of LinkedIn depended on an agreement to keep Office working properly, not just with LinkedIn, but also with other professional social networks.
“Commission will need to keep a close eye on strategies that undermine interoperability”
31. 3. Dominant (SMP) operators’ APIs opened
If dominant – a competition and consumer remedy
1. ACCC finds dominance by Facebook & Google
2. Only applies to the platform aspects of their business
1. i.e. iTunes, not Apple phones
Microsoft remedies in the longest, most expensive antitrust case in EC history – $5 billion fines
1. Case started in 1993 in the US, EU 1998-2014
1. Google case started 2009 – ongoing a decade later
32. Note this is not about the advertising market (only a proxy)
33. Three models – proposed by Brown/Marsden 2008, 2013
Model 1: Must-carry obligations – broadcasters & Electronic Programme Guides
Model 2: API disclosure requirements – Microsoft, from EC rulings: Case T-201/04, Microsoft v Commission, EU:T:2007:289, 1088; Decision of 24 May 2004, Case C-3/37792 Microsoft; Decision of 16 December 2009 in Case 39530 Microsoft (Tying)
Model 3: Interconnect requirements – applied to telcos, especially those with SMP
34. Interoperability? 3 types
Protocol interoperability: the ability of services/products to interconnect technically – the usual sense of interoperability in competition policy
Data interoperability: recalling Mayer-Schönberger/Cukier – a slice of data to competitors
Full protocol interoperability: what telecoms often thinks of as full interconnection
35. Why interoperate? It’s the economics!
A mechanism for achieving any-to-any connectivity – promotes innovation. There is nothing less valuable than a network with one user! Interoperability increases the value of networks and promotes efficient investment in, and use of, infrastructure. It is essential for new entrants to compete with existing operators on a non-discriminatory basis – promotes entry.
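The “nothing less valuable than a network with one user” point is the familiar network-effects arithmetic: possible connections grow roughly with the square of the number of users, so making two networks interoperable is worth more than the sum of the parts. A back-of-the-envelope sketch (illustrative numbers only):

```python
def connections(users: int) -> int:
    """Possible pairwise links in a network of `users` people."""
    return users * (users - 1) // 2

# A network with one user has nothing to connect to.
print(connections(1))        # 0

# Two isolated networks of 100 users each...
print(2 * connections(100))  # 9900

# ...versus the same 200 users made interoperable.
print(connections(200))      # 19900
```

The doubling of potential connections from interoperating the two equal-sized networks is the economic intuition behind any-to-any connectivity mandates.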
36. Is this remedy more broadly applicable?
Banking/insurance/medical algorithmic ‘AI’? Self-driving vehicles? It depends on a variety of socio-economic factors. Many sectors have regulators working on ‘regulatory sandpit’ solutions, and interoperability is extensively used in the sectors with which we are most familiar.
37. Consumer Data Right?
Australia’s CDR to deliver open banking, open energy and open telecoms? Many Europeans – well, we few – are very excited about the CDR model.
UK Furman Review of Digital Markets: ‘data mobility’.
Competition and Markets Authority: Data, Technology & Analytics unit; Innovation and Intelligence team to audit algorithms & research tech markets.
39. Christopher Kuner, Fred H. Cate, Orla Lynskey, Christopher Millard, Nora Ni Loideain and Dan Jerker B. Svantesson, ‘Expanding the artificial intelligence-data protection debate’ (2018) 8(4) International Data Privacy Law 289
Sandra Wachter, Brent Mittelstadt and Luciano Floridi, ‘Why a Right to Explanation of Automated Decision-Making Does Not Exist in the General Data Protection Regulation’ (2017) 7(2) International Data Privacy Law 76
Sandra Wachter, Brent Mittelstadt and Chris Russell, ‘Counterfactual Explanations without Opening the Black Box: Automated Decisions and the GDPR’ (2018) Harvard JL & Tech 1
Andrew D. Selbst and Julia Powles, ‘Meaningful information and the right to explanation’ (2017) 7(4) International Data Privacy Law 233
Lilian Edwards and Michael Veale, ‘Slave to the algorithm? Why a “right to an explanation” is probably not the remedy you are looking for’ (2017) 16(1) Duke Law & Technology Review 18
Lilian Edwards and Michael Veale, ‘Enslaving the Algorithm: From a “Right to an Explanation” to a “Right to Better Decisions”?’ (2018) 16(3) IEEE Security & Privacy 46
Lilian Edwards and Michael Veale, ‘Clarity, surprises, and further questions in the Article 29 Working Party draft guidance on automated decision-making and profiling’ (2018) 34(2) Computer Law & Security Review 398
40. 10 Steps towards Ethical AI
1. Transparency – geeks love this; it’s almost meaningless to the average user
2. Explainability – see above; more useful is replicability
3. Consent – see GDPR on meaningful & ‘course of business’
4. Discrimination – garbage in/garbage out
5. Accountability to stakeholders
6. Portability – Australia’s Consumer Data Right!
7. Redress and appeal
8. Algorithmic literacy – see ‘how to programme your VCR’
9. Independent oversight
10. Governance – Hosanagar advocates the creation of an independent Algorithmic Safety Board, modelled on the Federal Reserve Board: https://www.vox.com/the-highlight/2019/5/22/18273284/ai-algorithmic-bill-of-rights-accountability-transparency-consent-bias