
WORKING DRAFT December 6, 2012

A Deference to Protocol: Fashioning a Three-Dimensional Public Policy Framework for the Internet Age

Richard S. Whitt[*]

"Truth happens to an idea." William James

Abstract

This paper discusses how public policy grounded in the Internet's architecture can best ensure that the Net fully enables tangible benefits such as innovation, economic growth, free expression, and user empowerment. In particular, recognizing that the Internet is rapidly becoming society's chief operating system, this paper shows how an overarching public policy framework should be faithful to the multifaceted nature of the online world. As part of such a framework, this paper will explore one key aspect of the Internet: the "logical" Middle Layers functions, its inner workings derived from open software protocols. Adhering to the deferential principle of "respect the functional integrity of the Internet," in combination with the appropriate institutional and organizational implements, can help ensure that any potential regulation of Internet-based activities enables, rather than hinders, tangible and intangible benefits for end users.

I. Introduction

No Nerds Allowed

In late 2011, the United States Congress was heavily involved in debating legislation aimed at stopping foreign websites from hosting content that violates U.S. copyright laws. The House bill, called SOPA ("Stop Online Piracy Act"), and the Senate bill, known as PIPA ("Protect Intellectual Property Act"), shared a common element: they sought to impose certain technical requirements on website owners, search engines, ISPs, and other entities, intended to block the online dissemination of unlawful content.

On November 16, 2011, the House Judiciary Committee held a public hearing on SOPA. Representatives from the affected content industries were on hand to testify in favor of the legislation. The sole voice in opposition was Katie Oyama, copyright counsel for Google. No other witnesses were called to provide testimony, including, notably, any technical experts on the actual workings of the Internet. No software engineers, no hardware engineers, no technologists, no scientists, no economists, no historians.

This apparent oversight was surprising, because many prominent Internet engineers had been making known their concerns about the legislation. In May 2011, for example, various network engineering experts produced a straightforward technical assessment of the many infirmities of the legislation.[1] The engineers took no issue with the goal of lawfully removing infringing content from Internet hosts with suitable due process.[2] Instead, they pointed out how the proposed means of filtering the Internet's Domain Name System (DNS) would be easily circumvented.[3] They also argued that collateral damage to innocent online activities would result from such circumvention techniques, as well as from the act of DNS filtering itself.[4] These experts readily identified both the under-inclusion and over-inclusion risks of the pending legislation.[5]

Other engineering experts, including Steve Crocker, Paul Vixie, Esther Dyson, Dave Farber, and many others, expressed similar concerns in a short letter they submitted to Congress just following the November hearing.[6] The engineers all agreed that the bills' proposed restrictions on DNS and other functional elements of the Internet amounted to a highly problematic targeting of basic networking functions. The bills in particular would interfere with the Internet's naming and routing systems, in a way that would be both ineffective (because many technical work-arounds are possible) and overinclusive (because many legitimate uses and users would be adversely affected).[7] In other words, Congress was considering legislation that, while laudable in its overall objective, was aimed at the wrong functional target.

In their arguments, the engineers relied on the design principles embedded in the Internet's architecture. "When we designed the Internet the first time," they explained, "our priorities were reliability, robustness, and minimizing central points of failure or control. We are alarmed that Congress is so close to mandating censorship-compliance as a design requirement for new Internet innovations."[8]

[*] Rick currently is global head for public policy and government relations at Motorola Mobility LLC, a wholly-owned subsidiary of Google Inc. Previously he spent over five years in Google's public policy shop in Washington, DC. Rick wishes to thank in particular Max Senges, his colleague in Google's Berlin office, for his unfailing enthusiasm and insights about polycentric governance and other pertinent matters. Vint Cerf and Brett Frischmann also provided some helpful detailed feedback on an earlier version of the paper.
[1] Steve Crocker, David Dagon, Dan Kaminsky, Danny McPherson, and Paul Vixie, Security and Other Technical Concerns Raised by the DNS Filtering Requirements in the Protect IP Bill, May 2011.
[2] Security and Other Technical Concerns, at 3.
[3] Id. at 7-10.
[4] Id. at 10-13.
[5] See also Allan A. Friedman, Cybersecurity in the Balance: Weighing the Risks of the PROTECT IP Act and the Stop Online Piracy Act, The Brookings Institution, November 15, 2011 (legislation's attempt to "execute policy through the Internet architecture" creates real threats to cybersecurity, by both harming legitimate security measures (over-inclusive) and missing many potential work-arounds (under-inclusive)).
[6] Vint Cerf, Paul Vixie, Tony Li et al., An Open Letter from Internet Engineers to the United States Congress, December 15, 2011.
[7] An Open Letter, at 1.
[8] An Open Letter, at 1.
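The engineers' circumvention point is easy to see in miniature. The sketch below is purely illustrative and is not drawn from the paper or the cited studies: it models a resolver as a simple lookup table, and every domain name and address in it is invented. A user whose ISP resolver is required to filter a name can defeat the filter by pointing at any unfiltered resolver, while ordinary lookups through the filtered resolver proceed as before.

```python
# Toy illustration (not from the paper): why resolver-level DNS filtering is
# easily circumvented. All names and addresses below are hypothetical.

BLOCKLIST = {"infringing-site.example"}

# A resolver is modeled as a simple name -> address lookup table.
GLOBAL_DNS = {
    "infringing-site.example": "203.0.113.10",
    "ordinary-site.example": "198.51.100.7",
}

def filtered_resolver(name):
    """ISP resolver that complies with a filtering mandate."""
    if name in BLOCKLIST:
        return None  # pretend the name does not exist
    return GLOBAL_DNS.get(name)

def open_resolver(name):
    """Any other resolver on the Internet that ignores the mandate."""
    return GLOBAL_DNS.get(name)

def lookup(name, resolvers):
    # A user (or an application) simply tries resolvers in order.
    for resolve in resolvers:
        address = resolve(name)
        if address:
            return address
    return None

# Default configuration: the filter appears to "work".
print(lookup("infringing-site.example", [filtered_resolver]))   # None
# A one-line change of resolver configuration defeats it entirely,
# while lawful lookups through the filtered resolver are unaffected.
print(lookup("infringing-site.example", [open_resolver]))       # 203.0.113.10
print(lookup("ordinary-site.example", [filtered_resolver]))     # 198.51.100.7
```

The one-line change of resolver is the kind of work-around the engineers' letter describes: the mandate never touches the targeted content itself, only one path for looking up its address.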
Little evidence of these views actually made its way into the November 16th hearing, and no expert testimony about the inner workings of the Internet was heard. At a subsequent markup of the House bill, members debated whether such knowledge was even necessary. For example, Congressman Mel Watt (D-NC), ranking member of the subcommittee that governs Internet policy, exclaimed, "I don't know about the technical stuff, but I don't believe the experts." He added, "I'm not a nerd." Congresswoman Maxine Waters (D-CA) complained that those who raised questions about the bill's impact on the Internet were "wasting time." A few voices were raised on the other side.[9] Congressman Jason Chaffetz (R-Utah) said that "We are doing surgery on the Internet without understanding what we are doing…. If you don't know what DNSSEC is, we don't know what we're doing." As he put it, "Maybe we oughta ask some nerds what this thing really does."

Nonetheless the political skids appeared greased. After a markup on December 15th, it was becoming increasingly apparent to the pundits that this legislation was going to pass, and likely be signed into law, albeit reluctantly, by President Obama.[10] Except it did not happen. On January 18, 2012, a host of Internet companies participated in what became known as the "Internet Blackout Day," seeking to enlist their users to protest against the bills. Over 115,000 websites committed what was the Web's version of a collective work stoppage, with some, like Wikipedia, closing access to their content, while others, like Google, posted messages exhorting users to voice their concerns to Congress. Lawmakers received emails from some 14 million people.[11] The response was swift, and predictable: the legislation would not be brought to a floor vote in either chamber.

The interesting question is where things go from here. Despite the best efforts of the Internet community to explain and educate, their voices were taken seriously only in the wake of the Internet Blackout Day, leaving politicians scrambling to publicly back away from bills they had previously endorsed. It is fair to say that many in Congress still do not have an informed appreciation for the structural and functional integrity of the Internet. Instead, the debate turned into a classic political battle, won only by unconventional but straightforward lobbying tactics, rather than the power of legitimate ideas.

What would John Kingdon say about this dilemma?

Accepting the Kingdon Challenge

This paper complements and extends the analysis proposed in four previous papers by the author,[12] as well as a seminal paper by Professor Lawrence Solum calling on policymakers not to disrupt the integrity of the different protocol layers of the Internet.[13] These papers share a certain perspective about the desirability of creating an overarching conceptual framework for the Internet that helps us explore and craft policy solutions that work with, and not against, its generative nature. But the chief message here is straightforward: policymakers should do what they can to understand and take advantage of the Internet's structural and functional design.

This paper focuses on discovering an optimal fit between the means and ends of proposed regulation of the Internet's inner workings – what I call its "Middle Layers" – so as to best preserve the integrity of its basic design. The policy that works best is the one best fit to the technology, institutions, and organizations involved, and not merely the one most convenient politically. Fortunately, there are a number of ways that public policy can be applied to "the Internet" without doing violence to its working internal structure. The key is to match the right policy instruments to the right functional solution. As Solum points out, for Internet-related issues "public policy choices can be grounded not in vague theoretical abstractions, but in the ways that communications networks actually are designed, constructed, and operated."[14] In that way, we would be "drawing public policy lessons from Internet topology and experience."[15]

In my previous work, I have argued that traditional economic theories have been rooted in basic misconceptions about micro and macro human behavior. Modern political theory suffers from the same drawbacks: persistent misunderstandings about how ordinary people think and operate, particularly in pervasive networked environments like the Internet. In their place, I respectfully suggested some new ways of seeing and framing the relevant issues related to online markets (Emergence Economics) and communications technology (Adaptive Policymaking). In Broadband Policy I applied the market and technology framings to broadband transmission networks, offering concrete examples of novel policy options. We now face a similar crisis when it comes to the thing, the process, the system, we call the Internet.

Political scientist John Kingdon famously asserted that in the political sphere it is often ideas, not pressure, that matter most at the end of the day.[16] While power, influence, and strategy are important, "the content of the ideas themselves, far from being mere smokescreens, or rationalizations, are integral parts of decision making in and around government."[17] Keynes agreed that in politics "the power of vested interests is vastly exaggerated compared with the gradual encroachment of ideas."[18]

And yet, the still-fresh SOPA/PIPA legislative battle sorely tests that optimistic thesis, and crystallizes the challenge. After all, various concerted educational efforts were launched, and failed. In some respects, this has become a battle of ideas about the proper role of the Internet in society. As a purely intellectual matter, the idea of not imposing certain mandates on the Internet's design architecture was not accepted. Or rather, the counter-idea was winning out: that the Internet as it currently exists must be "fixed" because, allegedly, it facilitates massive theft of content.

As it turned out, politics as usual (with a unique Internet-fueled twist) won the moment, with the Internet Blackout Day forcing a hasty retreat even by staunch bill sponsors. Frankly this is not a desirable outcome, one driven by confusion and fear rather than understanding. Such shows of political force are usually difficult to replicate, complicated to harness, and can quickly lose their novelty and impact. Moreover, while most impressive and effective, at least for the moment, the show of force convinced politically, without convincing intellectually.

For all practical purposes, the Net is becoming the chief operating system for society. And yet confusion and misapprehension about how the Net functions – its basic design attributes and architecture – remain frighteningly high, even in policymaking circles. Ironically, the collective ignorance of our body politic is slowly strangling that which we should want most to preserve. Perhaps the Net community only has itself to blame for this predicament. For too long we urged policymakers simply to look the other way whenever talk about Internet regulation surfaced. After all, many of us simply laughed when then-Senator Ted Stevens railed about the Net as a "series of tubes," or then-President George W. Bush referred to "the internets." We were convinced that ignorance about the Internet – just a big mysterious, amorphous cloud, right? – would lead our politicians to shy away from imposing laws and rules.

One can make a case that the lack of understanding was willful, that the fault lies not with the Net's partisans but with the politicians who chose blindness over vision. And clearly quite serious efforts had been undertaken, often in vain, to educate policymakers about the potential errors of their ways.[19] On the other hand, for some entities, hiding commercial dealings behind the rhetorical cloak of Internet "unregulation" helped ward off unwanted government scrutiny. Deliberate obfuscation protected pecuniary interests. Regardless of the motivations on either side, however, the larger point is crucial: the days of easy sloganeering are over. It is time for the Internet community to come out from behind the curtain and explain itself. This paper is intended as a modest contribution to that end.

To be clear at the outset, this piece is not going to argue for a form of what some have termed "Internet exceptionalism." The rallying cry of "Don't regulate the Internet" no longer makes sense, at least as commonly understood. The Internet and all the myriad activities it facilitates will be regulated, to some degree, by someone.[20] The chief question is, how? My paper will attempt to explain that what we really need is a new form of "Internet contextualism," where the basic workings of the Net are understood and fully accounted for as we wrestle with difficult questions about social concerns. Under this banner, government involvement – directly or indirectly, through a variety of institutional and organizational vehicles – will happen only for the right reasons, and will be aimed in the right way at the pertinent uses and abuses of the network.

The philosophical pragmatists will observe that it is not enough for an idea to be true; for purposes of public acceptance, it must be seen to be true.[21] Assuring politicians that it is acceptable not to regulate what they don't comprehend just won't fly anymore. In Kingdonian terms, we need to couple the three policy streams of recognizing problems, formulating proposals, and connecting to politics.[22] We cannot afford to ignore or downplay any of those three elements in a policy framework that actually works. Nor should we invest in slogans whose time has come and gone. But as we shall see, a new slogan may now be appropriate: a more modest and grounded exhortation to "respect the functional integrity of the Internet." It is an idea whose veracity and validity in the minds of too many policymakers are still very much in doubt.

II. The Internet's Fundamental Design Features

A. The Net's Framing Years

A technology is not easily severable from the culture in which it is embedded.[23] It is a truism that the Internet was born and raised not from the market, but from an unlikely confluence of government and academic forces. Literally hundreds of people contributed to what eventually became the "Internet project" over several decades of development, from designers and implementers to writers and critics. The participants came from universities, research laboratories, government agencies, and corporations.[24] What many of them worked on were the technical standards that would provide the essential building blocks for the various online technologies to follow.[25]

There is little doubt that the Internet "represents one of the most successful examples of sustained investment and commitment to research and development in information infrastructure."[26] A brief overview of the Net's roots, processes, and people will shed some light on how it actually operates.

[9] Steve King tweeted that "We are debating the Stop Online Piracy Act and Shiela [sp] Jackson has so bored me that I'm killing time by surfing the Internet."
[10] On January 17th, the White House issued a statement that the President would not support legislation that "reduces freedom of expression, increases cybersecurity risks, or undermines the dynamic, innovative global internet." It is not clear whether this was a legitimate veto threat, or something short of one.
[11] Wikipedia, "Protests against SOPA and PIPA," last visited July 22, 2012.
[12] Richard S. Whitt & Stephen Schultze, The New "Emergence Economics" of Innovation and Growth, and What It Means for Communications Policy, 7 J. TELECOMM. & HIGH TECH. 217 (2009) ("Emergence Economics"); Richard S. Whitt, Adaptive Policymaking: Evolving and Applying Emergent Solutions for U.S. Communications Policy, 61 FED. COMM. L.J. 483 (2009) ("Adaptive Policymaking"); Richard S. Whitt, Evolving Broadband Policy: Taking Adaptive Stances to Foster Optimal Internet Platforms, 17 COMMLAW CONSPECTUS 417 (2009) ("Broadband Policy"); Richard S. Whitt, A Horizontal Leap Forward: Formulating a New Communications Public Policy Framework Based on the Network Layers Model, 56 FED. COMM. L.J. 587 (2004) ("Horizontal Leap").
[13] Lawrence B. Solum & Minn Chung, The Layers Principle: Internet Architecture and the Law (University of San Diego School of Law, Public Law and Legal Theory Research Paper No. 55) (2003), available at http://ssrn.com/abstract=416263 (Solum, Layers Principle).
[14] Solum & Chung, Layers Principle, at 595.
[15] Solum & Chung, Layers Principle, at 614.
[16] JOHN W. KINGDON, AGENDAS, ALTERNATIVES, AND PUBLIC POLICIES (updated 2nd edition 2011), at 125.
[17] KINGDON, AGENDAS, at 125.
[18] JOHN MAYNARD KEYNES, THE GENERAL THEORY OF EMPLOYMENT, INTEREST, AND MONEY (1936), at 383.
[19] While there have been such educational efforts in the past, too often their voices have failed to reach the ears of policymakers. See, e.g., Doc Searls and David Weinberger, "World of Ends: What the Internet Is, and How to Stop Mistaking It for Something Else" (March 10, 2003), found at www.worldofends.net.
[20] As de La Chappelle puts it, "The Internet is far from being unregulated; numerous national laws directly or indirectly impact human activities on the Internet, whether we like it or not." Bernard de La Chappelle, Multistakeholder Governance, Principles and Challenges of an Innovative Political Paradigm, MIND Co:Llaboratory Discussion Paper Series No. 1, #2 Internet Policy Making, September 2011, at 16.
[21] "The truth of an idea is not a stagnant property inherent in it. Truth happens to an idea. It becomes true, is made true by events. Its verity is in fact an event, a process, the process namely of its verifying itself, its verification. Its validity is the process of its validation." WILLIAM JAMES, THE MEANING OF TRUTH, (18__), at __. See also Whitt, Adaptive Policymaking, at __.
[22] KINGDON, AGENDAS, at __.
[23] Whitt and Schultze, Emergence Economics, at 251.
[24] David D. Clark, The Design Philosophy of the DARPA Internet Protocols, ACM (1988), 106, 114.
[25] Center for Democracy and Technology, "The Importance of Voluntary Technical Standards for the Internet and its Users," August 29, 2012, at 3.
[26] Barry M. Leiner, Vinton G. Cerf, David D. Clark, et al., The Past and Future History of the Internet, Communications of the ACM, Vol. 40, No. 2 (February 1997), at 102.
1. From top-down government management to bottom-up guidance

The Internet was actually born from several different projects in the late 1960s and 1970s, all of which were funded and controlled in some manner by national government agencies. However, the early homogeneity of design and top-down control slowly gave way over time to a heterogeneity of design and bottom-up governance. In some sense, the nature of process followed the nature of function.

In 1968 the U.S. Department of Defense's Advanced Research Projects Agency (ARPA) awarded to contractor BBN the first government grant to construct and develop ARPANET. This single network was intended to allow dissimilar computers operating at different sites to share online resources. ARPANET eventually became ARPA's host-to-host communications system.[27] One key feature was the Interface Message Processors (IMPs), the network's packet-handling nodes, accessible only by project engineers. ARPA provided direct management and control over this project, alongside introduction of the Network Working Group (NWG) in 1969 and the ARPA Internet Experiment group in 1973.

Vint Cerf and Bob Kahn would do much of their work on the TCP/IP protocol suite under the auspices and funding of ARPA, starting in 1968 and leading to their landmark 1974 paper.[28] As opposed to addressing how to communicate within the same network, Cerf and Kahn tackled a far more challenging problem: linking together disparate packet-switching networks with a common set of protocols. The Cerf/Kahn paper developed TCP as the means of sharing resources that exist in different data networks (the paper focused on TCP, but IP later was split out to logically separate router addressing from host packet sending).[29] TCP/IP was adopted as a Defense Department standard in 1980, and incorporated within ARPANET in 1983. This work helped serve as a crucial bridge to the next phase in the development of what now is known as the Internet.

Moving beyond ARPANET, the top-level goal for the new Internet project was to develop "an effective technique for multiplexed utilization of existing interconnected networks."[30] The National Science Foundation (NSF) and others recognized TCP/IP as the primary means of solving that difficult task, and the protocol suite was incorporated into its NSFNET research network in 1985. NSF and other government agencies were in control of this particular networking project, and access remained strictly limited to academic institutions and the U.S. military. Nonetheless, starting in the late 1970s other bodies began to appear to help steer the course of this growing new "network of networks," including the Internet Configuration Control Board (founded by Vint Cerf in 1979), the International Control Board, and the Internet Activities Board. The Internet Engineering Task Force (IETF) was launched in 1986.

[27] All told, Congress spent some $25 million on ARPANET – a fine return on investment.
[28] Vinton G. Cerf and Robert E. Kahn, "A Protocol for Packet Network Intercommunication," IEEE Trans. on Comms, Vol. 22, No. 5, May 1974.
[29] In essence, TCP creates and organizes data packets, while IP wraps a header with routing instructions around each packet. UDP was another host-to-host protocol developed in this same timeframe.
[30] Clark, Design Philosophy, at 106.
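The division of labor described in the Cerf/Kahn design and in note 29 can be sketched in a few lines of code. The sketch below is purely illustrative: the class names, field names, and addresses are invented and deliberately simplified, and bear no resemblance to the real header formats defined in the RFCs. The point is only the shape of the idea: the transport layer organizes application data into segments, the network layer wraps each segment in a header carrying addressing, and the routers in the middle need to read only that outer header.

```python
# Illustrative toy encapsulation, loosely following the division of labor
# described above (transport organizes data; IP adds routing information).
# Names and formats are invented for this sketch, not the real protocols.

from dataclasses import dataclass

@dataclass
class Segment:            # transport-layer unit ("TCP creates and organizes data packets")
    src_port: int
    dst_port: int
    seq: int
    payload: bytes

@dataclass
class Packet:             # network-layer unit ("IP wraps a header with routing instructions")
    src_addr: str
    dst_addr: str
    segment: Segment      # the transport segment rides inside, untouched

def send(data: bytes, src_addr: str, dst_addr: str):
    """Split application data into segments, then wrap each in an IP-like packet."""
    mss = 8  # tiny maximum segment size, to force splitting
    packets = []
    for seq, offset in enumerate(range(0, len(data), mss)):
        seg = Segment(src_port=5000, dst_port=80, seq=seq, payload=data[offset:offset + mss])
        packets.append(Packet(src_addr=src_addr, dst_addr=dst_addr, segment=seg))
    return packets

def route(packet: Packet) -> str:
    """A router looks only at the outer header; the inner segment is opaque to it."""
    return f"forwarding toward {packet.dst_addr}"

for pkt in send(b"hello, internetworking", "192.0.2.1", "198.51.100.7"):
    print(route(pkt), "segment", pkt.segment.seq)
```

Because the router function never inspects the inner segment, any network that can carry the outer wrapper can participate, which is the internetworking trick the 1974 paper was after.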
Commercial services were authorized on "the Internet" beginning in 1989, and with them came a plethora of new bodies involved in some element of Internet governance or standards. The Internet Society (ISOC) arrived in 1992, along with the Internet Architecture Board (IAB), which replaced the ICCB. The World Wide Web Consortium (W3C) followed two years later, and the Internet Corporation for Assigned Names and Numbers (ICANN) made its appearance in 1998. As government control and funding declined, commercial and non-commercial entities alike stepped into the breach to guide the Internet's continuing evolution.

2. From government roots to market reality

Amazingly, some would contest this long-settled version of the Net's history. Gordon Crovitz, former publisher of the Wall Street Journal, proclaimed in a recent opinion piece that "It's an urban legend that the government launched the Internet."[31] The reaction to this piece was swift, and appropriately merciless.[32] The larger lessons should not be missed, however. First, it is troubling that a major publication would publish such nonsense, and assume there would be a ready audience for it. As much as there is ignorance about the basic workings of the Internet, apparently there is also willful effort to confuse the public about how it even began. Second, and more fundamentally, election-year posturing should not be allowed to obscure the simple fact that the Internet is a different animal because of its origins. The Net's design derives directly from its birth as an unusual confluence of government, academic, and libertarian culture, which only gradually gave way to commercialization.

Indeed, it is a fascinating question whether the Internet would have developed on its own as a purely commercial creature of the marketplace. Networking pioneer and entrepreneur Charles Ferguson, for one, says no. He argues that many new technologies like the Internet typically come not from the free market or the venture capital industry; rather, "virtually all the critical technologies in the Internet and Web revolution were developed between 1967 and 1993 by government research agencies and/or in universities."[33]

Steve Crocker, an original technical pioneer of the Internet, shares that view. He points out that the Internet never could have been created without government's assistance as funder and convener.[34] In particular, the Internet's open architecture was a fundamental principle that was a hallmark of the government research effort, one that would not have come about if the Net had been created instead by private industry.[35] Indeed, without the existence of a ready alternative like the Internet, the relatively "closed" online networks may well have remained the prevailing marketplace norm.[36]

Regardless, given its distinctive origins at the "unlikely intersection of big science, military research, and libertarian culture,"[37] it is not surprising that the players, processes, and guiding philosophies pertinent to how the Net was designed and operated are rather unique. This may well mean that we need some unconventional tools to fully assess the Net, as a technological and social phenomenon not originating from the market.

3. Rough consensus and running code

With the Net's roots stretching at least as far back as 1962, "the initial impulse and funding for the Internet came from the government military sector," with members of the academic community enjoying great freedom as they helped create the network of networks.[38] According to Hofmokl, that freedom remained "a major source of path dependency," as shown in the early shaping principles and operating rules of the Net: the lack of a central command unit (with consensus-driven, democratic processes to define operations), the principle of network neutrality (a simple network with intelligence residing at the end points), and an open access principle (local networks joining the emerging global Internet structure).[39]

The Net's design criteria were conceptualized during its first decade in powerfully path-dependent ways that have been foundational for the treatment of legal and policy issues then and now – what Braman calls "the framing years."[40] Key to the design criteria are technical standards, the language that computers, phones, software, and network equipment use to talk to each other.[41] Protocols became widely recognized technical agreements among computers and other devices about how data moves between physical networks.[42] Internet pioneer Steve Crocker states that a "culture of open processes" led to the development of standards and protocols that became building blocks for the Net.[43] Informal rules became the pillars of Internet culture, including a loose set of values and norms shared by group members.[44] The resulting broader global vision of both process and rules "overshadowed the orientation that initially had been pursued by the government agencies focused on building specific military applications."[45]

Unconventional entities accompany these informal rules. Today there is no single governing body or process that directs the development of the Internet's protocols.[46] Instead, we have multiple bodies and processes of consensus. Much of the "governance" of the Internet is carried out by so-called multistakeholder organizations (MSOs), such as ISOC, W3C, and ICANN. Although these entities have largely established the relevant norms and standards for the global Internet over the last two decades, "they are little known to the general public, and even to most regulators and legislators."[47]

ISOC is the first and one of the most important MSOs, with a stated mission "to assure the open development, evolution and use of the Internet for the benefit of all people throughout the world."[48] Since 1992, engineers, users, and the companies that assemble and run the Internet have debated at ISOC what particular course the Net should take. The Internet Engineering Task Force (IETF) now operates under the auspices of ISOC, and its stated goal is "to make the Internet work better."[49] It grew out of the Internet Activities Board, and before that the Internet Configuration Control Board. The IETF is the institution that has developed the core networking protocols for the Internet, including IPv4, IPv6, TCP, UDP, and countless others.[50] The body is open to any interested individual, meets three times a year, and conducts activities through working groups in various technical areas. Its standards-setting process includes electronic publishing and broad distribution of proposed standards.

The IETF has articulated its own cardinal principles for operation. The body employs an open process (where any interested person can participate, know what is being decided, and be heard), relies on technical competence (where input and output are limited to areas of technical competence, or "engineering quality"), has a volunteer core of leaders and participants, utilizes "rough consensus and running code" (standards are derived from a combination of engineering judgment and real-world experience), and accepts responsibility for all aspects of any protocol for which it takes ownership.[51] An early document states that the IETF should act as a trustee for the public good, with a requirement that all groups be treated equitably, and an express recognition of the role for stakeholders.[52] Some have argued that this statement alone created "the underpinning of the multi-stakeholder governance system that is the foundation of Internet governance."[53]

The Requests for Comments (RFCs) were first established by Steve Crocker of UCLA in April 1969. The memos were intended as an informal means of distributing shared ideas among network researchers on the ARPANET project.[54] "The effect of the RFCs was to create a positive feedback loop, so ideas or proposals presented in one RFC would trigger other RFCs."[55] A specification document would be created once consensus came together within the governing organization (eventually the IETF), and used as the basis for implementation by various research teams. RFCs are now viewed as the "document of record" in the Net standards community,[56] with over 6,000 documents now in existence.

An RFC does not automatically carry the full status of a standard. Three types of standards-track RFCs can be promulgated: the proposed standard (specification and some demonstrated utility), the draft standard (implementations and at least some limited operational capability), and the standard itself (demonstrated operational capacity). A proposed or draft standard can only become an actual standard once it has been readily accepted and used in the market. In fact, standards ultimately succeed or fail based on the response of the marketplace.[57]

Other organizations involved in governing the Internet include the W3C and ICANN. The W3C was formed in 1994 to evolve the various protocols and standards associated with the World Wide Web. The body produces widely available specifications, called Recommendations, which describe the building blocks of the Web. ICANN was formed in 1998, in what Milton Mueller calls "cyberspace's constitutional moment."[58] Its role is to manage the unique system of identifiers of the Internet, including domain names, Internet addresses, and parameters of the Internet Protocol suite.

As we shall see, the Internet's "running code" is a reflection of its unique heritage: open standards and public commons, rather than proprietary standards and private property. While much of its underlying physical networks and "over-the-top" applications and content come from the commercial, privately-owned and -operated world, its logical architectural platform does not.

[31] Gordon Crovitz, "Who Really Invented the Internet?" Wall Street Journal, July 22, 2012. Instead, he claims, Xerox should get "full credit" for such an invention. His motivation for holding and explicating such a view, apparently, is that the Net's history is "too often wrongly cited to justify big government." Id.
[32] See, e.g., Farhad Manjoo, "Obama Was Right: The Government Invented the Internet," Slate, July 24, 2012 ("Crovitz' yarn is almost hysterically false…. A ridiculously partisan thing."); Harry McCracken, "How Government (and Did Not) Invent the Internet," Time Techland, July 25, 2012 (Crovitz' argument "is bizarrely, definitively false."). Vint Cerf also provided a pointed response. See http://news.cnet.com/8301-1023_3-57479781-93/no-credit-for-uncle-sam-in-creating-net-vint-cerf-disagrees/ ("I would happily fertilize my tomatoes with Crovitz' assertion.").
[33] CHARLES H. FERGUSON, HIGH STAKES, NO PRISONERS: A WINNER'S TALE OF GREED AND GLORY IN THE INTERNET WARS (1999), at 13.
[34] Steve Crocker, "Where Did the Internet Really Come From?" techpresident.com, August 3, 2012.
[35] Crocker, "Where Did the Internet Really Come From?"
[36] See Whitt and Schultze, Emergence Economics, at 254.
[37] MANUEL CASTELLS, THE INTERNET GALAXY 17 (2001).
[38] Hofmokl, at 230.
[39] Hofmokl, at 230.
[40] Braman, The Framing Years, at 3.
[41] CDT, "Voluntary Technical Standards," at 1.
[42] SEARLS, INTENTION ECONOMY, at Chapter 9. In 1969 the Network Working Group adopted the word "protocol" (then in widespread use in the medical and political fields to mean "agreed procedures") to denote the set of rules created to enable communications via ARPANET. Interestingly, the Greek root "protokollon" refers to a bit of papyrus affixed to the beginning of a scroll to describe its contents – much like the header of a data packet. Whitt, A Horizontal Leap, at 601-02.
[43] Whitt, Broadband Policy, at 507 n.533.
[44] Hofmokl, at 230.
[45] Hofmokl, at 231.
[46] Bernbom, Analyzing the Internet as a CPR, at 13.
[47] Joe Waz and Phil Weiser, Internet Governance: The Role of Multistakeholder Organizations, Silicon Flatirons Roundtable Series on Entrepreneurship, Innovation, and Public Policy (2012), at 1.
[48] www.ietf.org.
[49] RFC 3935, "A Mission Statement for the IETF," October 2004, at 1.
[50] DeNardis, Internet Governance, at 7.
[51] RFC 3935, "Mission Statement," at 1-2. These processes stand as an interesting contrast to, say, the workings of the U.S. Congress.
[52] RFC 1591, "Domain Name System Structure and Delegation," Jon Postel, at __.
[53] Doria, Study Report, at 19.
[54] Leiner et al., The Past and Future History of the Internet, at 106.
[55] Leiner et al., The Past and Future History of the Internet, at 106.
[56] Leiner et al., The Past and Future History of the Internet, at 106.
[57] Werbach, Higher Standard, at 199.
[58] MILTON L. MUELLER, RULING THE ROOT: INTERNET GOVERNANCE AND THE TAMING OF CYBERSPACE (2002). He believes there is no suitable legal or organizational framework in place to govern the Net. Id. at __.
B. The Internet's Designed Architecture

Complex systems like the Internet can only be understood in their entirety by abstraction, and reasoned about by reference to principles.[59] "Architecture" is a high-level description of a complex system's organization of basic building blocks, its fundamental structures.[60] How the Internet runs is completely dependent on the implementing code, its fundamental nature created and shaped by engineers.[61] "The Internet's value is found in its technical architecture."[62]

Technology mediates and gives texture to certain kinds of private relationships, weighing in on the side of one vested interest over others.[63] "Design choices frequently have political consequences – they shape the relative power of different groups in society."[64] Or, put differently, "technology has politics."[65] Law and technology both have the ability to organize and impose order on society.[66] Importantly, "technology design may be the instrument of law, or it may provide a means of superseding the law altogether."[67] Langdon Winner may have put it best (in a pre-Internet formulation): "The issues that divide or unite people in society are settled not only in the institutions and practices of politics proper, but also, and less obviously, in tangible arrangements of steel and concrete, wires and semiconductors, nuts and bolts."[68] Indeed, "like laws or social norms, architecture shapes human behavior by imposing constraints on those who interact with it."[69]

David Clark and others remind us that "there is no such thing as value-neutral design. What choices designers include or exclude, what interfaces are defined or not, what protocols are open or proprietary, can have a profound influence on the shape of the Internet, the motivations of the players, and the potential for distortion of the architecture."[70] Those responsible for the technical design of the early days of the Internet may not always have been aware that their preliminary, often tentative and experimental, decisions were simultaneously creating enduring frames not only for the Internet, but also for the treatment of social policy issues.[71] As time went on, however, those decisions came more into focus. The IETF proclaims that "the Internet isn't value neutral, and neither is the IETF…. We embrace technical concepts such as decentralized control, edge-user empowerment, and sharing of resources, because those concepts resonate with the core values of the IETF community."[72]

It may well be true that "engineering feedback from real implementations is more important than any architectural principles."[73] And yet, the fundamental design attributes of the Net have stood the challenge of time. Through several decades of design and implementation, and several more decades of actual use and endless tinkering, these are the elements that simply work.

C. The Internet's Four Design Attributes

The Internet is a network of networks, an organic arrangement of disparate underlying infrastructures melded together through common protocols. Understanding the what, where, why, and how of this architecture goes a long way toward understanding how the Net serves the role it does in modern society, and the many benefits (and some challenges) it provides.

It would be quite useful to come up with "a set of principles and concerns that suffice to inform the problem of the proper placement of functions in a distributed network such as the Internet."[74] Data networks like the Internet actually operate at several different levels. Avri Doria helpfully divides up the world of Internet protocols and standards into three buckets.[75] First we have the general communications engineering principles, consisting of elements like simplicity, flexibility, and adaptability. Next we have the design attributes of the Internet, such as no top-down design, packet-switching, end-to-end transmission, layering, and the Internet Protocol hourglass. Finally we have the operational resources, those naming and numbering features dedicated to carrying out the design principles; these include the Domain Name System (DNS), IP addressing, and Autonomous System Numbers.[76] This paper will focus on Doria's second bucket of the Net's fundamental design attributes, the home of many of the Net's software protocols.

DeNardis points out the key role played by protocol standards in the logical layers: "The Internet works because it is universally based upon a common protocological language. Protocols are sometimes considered difficult to grasp because they are intangible and often invisible to Internet users. They are not software and they are not material hardware. They are closer to text. Protocols are literally the blueprints, or standards, that technology developers use to manufacture products that will inherently be compatible with other products based on the same standards. Routine Internet use involves the direct engagement of hundreds of standards…."[77]

Scholars differ on how to define and number the Net's design attributes. As mentioned, Doria identifies the lack of top-down design, packet-switching, end-to-end transmission, layering, and the Internet Protocol hourglass.[78] Barwolff sees five fundamental design principles of the Internet: end-to-end, modularity, best efforts, cascadability, and complexity avoidance. Bernbom comes up with five principles of distributed systems, network of networks, peer-to-peer, open standards, and best efforts.[79] Barbara van Schewick weighs in with her own short list of design principles: layering, modularity, and two forms (the narrow version and the strong version) of the end-to-end principle.[80]

RFC 1958 probably summed it up best back in 1996: "In very general terms, the community believes that the goal is connectivity, the tool is the Internet Protocol, and the intelligence is end to end rather than hidden in the network."[81] And, one can add with confidence, modularity or layering is the logical scaffolding that makes it all work together. So, consistent with RFC 1958 and other sources, I come up with a list of four major design attributes: the structure of layering (the what), the goal of connectivity (the why), the tool of the Internet Protocol (the how), and the ends-based location of function (the where).[82] As with Doria's exercise, it is important at this point to separate out the actual network function from both the impetus and the effect, even though both aspects are critical to understanding the function's role. Design attributes also are not the same as the actual network instantiations, like DNS and packet-switching. My list may not be definitive, but it does seek to capture much of the logical essence of the Net.

1. The Law of Code: Modularity

The modular nature of the Internet describes the "what," or its overall structural architecture. The use of layering means that functional tasks are divided up and assigned to different software-based protocol levels.[83] For example, the "physical" layers of the network govern how electrical signals are carried over physical wiring; independently, the "transport" layers deal with how data packets are routed to their correct destinations, and what they look like; while the "application" layers control how those packets are used by an email program, web browser, or other user application or service.

This simple and flexible system creates a network of modular "building blocks," where applications or protocols at higher layers can be developed or modified with no impact on lower layers, while lower layers can adopt new transmission and switching technologies without requiring changes to upper layers. Reliance on a modular system of layers greatly facilitates the unimpeded delivery of packets from one point to another. Importantly, the creation of interdependent layers also creates interfaces between them. These stable interfaces are the key features that allow each layer to be implemented in different ways.

RFC 1958 reports that "Modularity is good. If you can keep things separate, do it."[84] In particular, layers create a degree of "modularity," which allows for ease of maintenance within the network. Layering organizes separate modules into a partially ordered hierarchy.[85] This independence, and interdependence, of each layer creates a useful level of abstraction as one moves through the layered stack. Stable interfaces between the layers fully enable this utility. In particular, the user's ability to alter functionality at a certain layer without affecting the rest of the network can yield tremendous efficiencies when one seeks to upgrade an existing application (higher layer) that makes extensive use of underlying physical infrastructure (lower layer).[86] So, applications or protocols at higher layers can be developed or modified with little or no impact on lower layers.[87]

In all engineering-based models of the Internet, the fundamental point is that the horizontal layers, defined by code or software, serve as the functional components of an end-to-end communications system. Each layer operates on its own terms, with its own unique rules and constraints, and interfaces with other layers in carefully defined ways.[88]

[59] Matthias Barwolff, End-to-End Arguments in the Internet: Principles, Practices, and Theory, dissertation submitted to the Department of Electrical Engineering and Computer Science at Technical Universitat Berlin, presented on October 22, 2010, at 133.
[60] VAN SCHEWICK, INTERNET ARCHITECTURE AND INNOVATION (2010), at Part I, Chapter 1.
[61] Solum & Chung, Layers Principle, at 12.
[62] Searls and Weinberger, "World of Ends," at __.
[63] Nissenbaum, From Preemption to Circumvention, at 1375.
[64] Brown, Clark, and Trossen, Should Specific Values Be Embedded in the Internet Architecture, at 3.
[65] Nissenbaum, From Preemption to Circumvention, at 1377.
[66] Helen Nissenbaum, From Preemption to Circumvention: If Technology Regulates, Why Do We Need Regulation (and Vice Versa)?, Berkeley Technology Law Journal, Vol. 26 No. 3 (2011), 1367, 1373.
[67] Braman, Defining Information Policy, at 4.
[68] Langdon Winner, Do Artifacts Have Politics, in THE WHALE AND THE REACTOR: A SEARCH FOR LIMITS IN AN AGE OF HIGH TECHNOLOGY (1986), at 19.
[69] VAN SCHEWICK, INTERNET ARCHITECTURE, at Part I, Chapter 1.
[70] Clark, Sollins, Wroclawski, and Braden, Tussle in Cyberspace, at __.
[71] Braman, The Framing Years, at 29.
[72] RFC 3935, at 4.
[73] RFC 1958, at 4.
[74] Barwolff, End-to-End Arguments, at 135.
[75] Avri Doria, "Study Report: Policy implications of future network architectures and technology," paper prepared for the 1st Berlin Symposium on Internet and Society, Conference Draft, October 2, 2011.
[76] DeNardis, "Internet Governance," at 3. She calls these "Critical Internet Resources."
[77] DeNardis, "Internet Governance," at 6.
[78] Doria, Study Report, at 8-12.
[79] Bernbom, at 3-5.
[80] BARBARA VAN SCHEWICK, INTERNET ARCHITECTURE AND INNOVATION (2010), at __. Van Schewick sees layering as a special kind of modularity. I agree, which is why I refrain from assigning modularity as a wholly separate design attribute. Modularity is more of a general systems element applicable in most data networks. For this paper, I use the two terms interchangeably.
[81] RFC 1958, at 2.
[82] See also Whitt and Schultze, Emergence Economics.
[83] Whitt and Schultze, Emergence Economics, at 256-257.
[84] RFC 1958, at 4. On the other hand, some forms of layering (or vertical integration) can be harmful if the complete separation of functions makes the network operate less efficiently. RFC 3439, at 7-8.
[85] VAN SCHEWICK, INTERNET ARCHITECTURE, at Part II, Ch. 2.
[86] Whitt, Horizontal Leap, at 604.
[87] Whitt and Schultze, Emergence Economics, at 257.
[88] Whitt, Horizontal Leap, at 602. In the "pure" version of layering, a layer is allowed to use only the layer immediately below it; in the "relaxed" version, a layer is permitted to utilize any layer that lies beneath it. VAN SCHEWICK, INTERNET ARCHITECTURE, at Part II, Ch. 2. The Internet uses a version of relaxed layering, with IP acting as the "portability" layer for all layers above it. Id.
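The layering idea just described can be sketched in a few dozen lines. The sketch below is illustrative only; the class and method names are invented for this example and do not correspond to any real protocol stack. Each layer is a module that talks to the layer beneath it through one narrow, stable interface, so the bottom layer can be swapped (say, copper for radio) without touching anything above it, and a new application can be written without touching anything below it.

```python
# Illustrative sketch of layering as modularity: each layer exposes the same
# narrow interface (send_down) and knows nothing about the layers above it.
# The names here are invented for this example.

class LinkLayer:                     # "physical"/link: how bits actually move
    def __init__(self, medium: str):
        self.medium = medium
    def send_down(self, frame: bytes):
        print(f"[{self.medium}] transmitting {len(frame)} bytes")

class NetworkLayer:                  # routing: an IP-like addressing layer
    def __init__(self, lower: LinkLayer):
        self.lower = lower
    def send_down(self, payload: bytes, dst: str):
        header = f"IP dst={dst};".encode()
        self.lower.send_down(header + payload)

class ApplicationLayer:              # email, web, or anything else
    def __init__(self, lower: NetworkLayer):
        self.lower = lower
    def send(self, message: str, dst: str):
        self.lower.send_down(message.encode(), dst)

# Swap the bottom layer (copper -> radio) without touching the layers above,
# or write a new application without touching anything below: the stable
# interface between layers is what makes each substitution safe.
app_over_copper = ApplicationLayer(NetworkLayer(LinkLayer("copper")))
app_over_radio = ApplicationLayer(NetworkLayer(LinkLayer("radio")))
app_over_copper.send("hello", "198.51.100.7")
app_over_radio.send("hello", "198.51.100.7")
```

The efficiency claim in the text is visible here in miniature: upgrading the application class requires no change to the network or link classes, and vice versa, precisely because each module is reached only through its defined interface.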
2. Smart Edges: End-to-End

The end-to-end ("e2e") design principle describes the "where," or the place for network functions to reside in the layered protocol stack. The general proposition is that the core of the Internet (the network itself) tends to support the edge of the Internet (the end user applications, content, and other activities).[89] RFC 1958 states that "the intelligence is end to end rather than hidden in the network," with most work "done at the fringes."[90] Some have rendered this broadly as dumb networks supporting smart applications.[91] A more precise technical translation is that a class of functions generally can be more completely and correctly implemented by the applications at each end of a network communication. By removing interpretation of applications from the network, one also vastly simplifies the network's job: just deliver IP packets, and the rest will be handled at a higher layer. In other words, the network should support generality, as well as functional simplicity.[92]

[89] Whitt and Schultze, Emergence Economics, at 257-58.
[90] RFC 1958, at 2, 3.
[91] David Isenberg, id., at n.194.
[92] As Van Schewick puts it, e2e requires not "stupidity" or simplicity in the network core, but that network functions need only be general in order to support a wide range of functions in the higher layers. VAN SCHEWICK, INTERNET ARCHITECTURE, at Part II, Ch. 4. One wonders if excessive talk about "stupid networks" has made these basic concepts more contentious for some than they need to be.
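The end-to-end argument can be made concrete with a toy example. The sketch below is not a real protocol and every name in it is invented: the "network" does nothing but forward or silently drop packets, while the function of reliable delivery is implemented entirely by the endpoints, through numbering, acknowledgment, and retransmission.

```python
# Toy end-to-end illustration (invented names, not a real protocol): the
# "network" only forwards or drops packets; reliability is implemented
# entirely at the endpoints.

import random

def unreliable_network(packet):
    """Best-effort core: forward the packet or silently lose it."""
    return packet if random.random() > 0.3 else None

def send_reliably(data_items, max_tries=10):
    """End hosts add the 'smarts': retransmit until each item gets through."""
    received = []
    for seq, item in enumerate(data_items):
        for _ in range(max_tries):
            delivered = unreliable_network((seq, item))
            if delivered is not None:       # receiver got it and (implicitly) acknowledges
                received.append(delivered[1])
                break
        else:
            raise RuntimeError(f"gave up on packet {seq}")
    return received

random.seed(1)
print(send_reliably(["who", "are", "you"]))   # ['who', 'are', 'you'] despite losses
```

Nothing in the network function knows or cares what the endpoints are doing with the packets, which is the generality the text describes: the core stays simple and general, and each new application supplies whatever end-to-end functions it needs.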
The e2e norm arose in the academic communities of the 1960s and 1970s, and only managed to take hold when the US Government compelled adoption of the TCP/IP protocols, mandated a regulated separation of conduit and content, and granted nondiscriminatory network access to computer device manufacturers and dial-up online companies. These authoritative "nudges" pushed the network to the e2e norm.[93] Consequently, end-to-end arguments "have over time come to be widely considered the defining if vague normative principle to govern the Internet."[94]

While end-to-end was part of Internet architecture for a number of years prior, the concept was first identified, named, and described by Jerome Saltzer, David Reed, and David Clark in 1981.[95] The simplest formulation of the end-to-end principle is that packets go into the network, and those same packets come out without change, and that is all that happens in the network. This formulation echoes the 1974 Cerf/Kahn paper, which describes packet encapsulation as a process that contemplates no change to the contents of the packets.

The e2e principle suggests that specific application-level functions ideally operate on the edges, at the level of client applications that individuals set up and manipulate.[96] By contrast, from the network's perspective, shared ignorance is built into the infrastructure through widespread compliance with the end-to-end design principle.[97] In addition, and contrary to some claims, e2e is not really neutral; it effectively precludes prioritization based on the demands of users or uses, and favors one set of applications over another.[98] The e2e principle also generally privileges reliability, at the expense of timeliness.

The concept of e2e design is closely related to, and provides substantial support for, the concept of protocol layering.[99] End-to-end tells us where to place the network functions within a layered architecture.[100] In fact, end-to-end guides how functionality is distributed in a multilayer network, so layering must be applied first.[101] Both relate to the overall general design objectives of keeping the basic Internet protocols simple, general, and open.[102]

With regard to the Internet, the end-to-end argument has now been transformed into a broader principle to make the basic Internet protocols simple, general, and open, leaving the power and functionality in the hands of the application.[103] Of course, the e2e principle can be prone to exaggeration, and there are competing versions in the academic literature.[104] One cannot have a modern data network without a core, and in particular the transport functionality to connect together the myriad constituents of the edge, as well as the widespread distribution of the applications, content, and services provided by the edge. Elements of the core network, while erecting certain barriers (such as firewalls and traffic shaping) that limit pure e2e functionality,[105] may still allow relatively unfettered user-to-user connectivity at the applications and content layers. To have a fully functioning network, the edge and the core need each other. And they need to be connected together.

3. A Network of Networks: Interconnection

RFC 1958 puts it plainly: the goal of the Internet is "connectivity."[106] The Internet has a physical architecture as much as a virtual one.[107] Unlike the earlier ARPANET, the Internet is a collection of IP networks owned and operated in part by private telecommunications companies, and in part by governments, universities, individuals, and other types of entities, each of which needs to connect together.[108] Kevin Werbach has pointed out that connectivity is an often under-appreciated aspect of Internet architecture.[109] "The defining characteristic of the Net is not the absence of discrimination, but a relentless commitment to interconnectivity."[110]

Jim Speta agrees that the Internet's utility largely depends on "the principle of universal interconnectivity, both as a technical and as an economic matter."[111] In order to become part of the Internet, owners and operators of individual networks voluntarily connect to the other networks already on the Internet. This aspect of the Net goes to its "why," which is the overarching rationale of moving traffic from Point A to Point B. The early Internet was designed with an emphasis on internetworking and interconnectivity, and moving packets of data transparently across a network of networks. Steve Crocker reports that in a pre-Internet environment all hosts would benefit from interconnecting with ARPANET, but that "the interconnection had to treat all of the networks with equal status," with "none subservient to any other."[112]

[93] Whitt, Broadband Policy, at 507-08.
[94] Barwolff, at 134.
[95] VAN SCHEWICK, INTERNET ARCHITECTURE, at Part II, Ch. 2.
[96] Whitt and Schultze, Emergence Economics, at 258.
[97] FRISCHMANN, INFRASTRUCTURE (2012), at 322.
[98] FRISCHMANN, INFRASTRUCTURE, at 324, 326. Richard Bennett, a persistent critic of the "mythical history" and "magical properties" of the Internet, claims that end-to-end arguments "don't go far enough to promote the same degree of innovation in network-enabled services as they do for network-independent applications." Richard Bennett, "Designed for Change: End-to-End Arguments, Internet Innovation, and the Net Neutrality Debate," ITIF, September 2009, at 27, 38. To the extent this is a true statement, it reflects again that the Internet's many founding engineers made deliberate design choices.
[99] Whitt, Horizontal Leap, at 604-05.
[100] Whitt, Horizontal Leap, at 604-05.
[101] VAN SCHEWICK, INTERNET ARCHITECTURE, at Part I, Ch. 2.
[102] Whitt, Horizontal Leap, at 605.
[103] Whitt and Schultze, Emergence Economics, at 259.
[104] Barbara van Schewick devotes considerable attention to the task of separating out what she calls the "narrow" and "broad" versions of the end-to-end principle. VAN SCHEWICK, INTERNET ARCHITECTURE, at Part II, Ch. 1. She finds "real differences in scope, content, and validity" between the two, and plausibly concludes that the Net's original architecture was based on the broader version that more directly constrains the placement of functions in the lower layers of the network. Id. As one example, the RFCs and other IETF documents are usually based on the broad version. Id. at Part II, Ch. 2. For purposes of this paper, however, we need only recognize that some form of the end-to-end concept has been firmly embedded in the Net as one of its chief design attributes.
[105] Whitt, Broadband Policy, at 453 n.199.
[106] RFC 1958, at 2.
[107] Laura DeNardis, "The Emerging Field of Internet Governance," Yale Information Society Project Working Paper Series, September 2010, at 12.
[108] DeNardis, "Internet Governance," at 12.
[109] Whitt and Schultze, Emergence Economics, at 259 n.205.
[110] Kevin Werbach, quoted in Whitt, Broadband Policy, at 522.
[111] Jim Speta, quoted in Whitt, Broadband Policy, at 522.
[112] Crocker, "Where Did the Internet Really Come From?"
  • Today‘s Internet embodies a key underlying technical idea: open-architecturenetworking. Bob Kahn first articulated this concept of open architecture in 1972, and itbecame the basis for later Internet design. Under this design principle, network providerscan freely interwork with other networks through ―a meta-level ‗internetworkingarchitecture.‘‖113 Critical ground rules include that each distinct network must stand onits own, communications will be on a best-effort basis, and there is no global control atthe operations level.114 Impetus for the best efforts concept then is the desire for as manydifferent networks as possible to voluntarily connect, even if strong guarantees of packetdelivery were not possible.The Internet‘s goal of open and voluntary connectivity requires technical cooperationbetween different network service providers.115 Networks of all types, shapes, and sizesvoluntarily choose to interoperate and interconnect with other networks. They do so byagreeing to adopt the Internet‘s protocols as a way of passing data traffic to and fromother entities on the Internet. For example, it has always been legally and technicallypermissible for a private network, such as a broadband operator, to opt out by ceasing tooffer Internet access or transport services – choose to reject TCP/IP – and instead provideonly proprietary services.116 So, ―if you want to put a computer – or a cell phone or arefrigerator – on the network, you have to agree to the agreement that is the Internet.‖117Bernbom says that the key elements of the networks of networks include ―the autonomyof network participants, the rule-based requirements for interconnection, and the peer-to-peer nature of interconnection agreements.‖118 Ignoring one or more of these elementsdiminishes or even eliminates a network‘s interoperability with the rest of the Internet.119In their recent book Interop, Palfrey and Glasser observe that ―the benefits and costs ofinteroperability are most apparent when technologies work together so that the data theyexchange prove useful at the other end of the transaction.‖120 Without interoperability atthe lower layers of the Internet, interoperability at the higher layers – the human andinstitutional layers – is often impossible.121 Their concept of ―interop‖ is to ―embracecertain kinds of diversity not by making systems, applications, and components the same,but by enabling them to work together.‖122 If the underlying platforms are open anddesigned with interoperability in mind, then all players – including end users andintermediaries – can contribute to the development of new products and services.123113 Leiner et al, Past and Future History of the Internet, at 103.114 Leiner et al, at 103-04.115 RFC 1958, at 2.116 FRISCHMANN, INFRASTRUCTURE, at 345.117 Doc Searls and David Weinberger, World of Ends, March 10, 2003, quoted in Whitt, Broadband Policy,at 504 n.515.118 Bernbom, at 5. Interestingly he ties these same elements to maintaining the Internet as a commons. Id.119 Bernbom, at 23.120 INTEROP, THE PROMISE AND PERILS OF HIGHLY INTERCONNECTED SYSTEMS, PALFREYAND GLASSER (2012), at 22.121 INTEROP, at 23.122 INTEROP, at 108.123 INTEROP, at 121. Standards processes play a particularly important role in getting to interoperability. 18
Interconnection agreements are unseen, "in that there are no directly relevant statutes, there is no regulatory oversight, and there is little transparency in private contracts and agreements."124 The fundamental goal is that the Internet must be built by interconnecting existing networks, so employing "best efforts" as the baseline quality of service for the Internet makes it easier to interconnect a wide variety of network hardware and software.125 This also facilitates a more robust, survivable network of networks, or "assured continuity."126 As a result, "the best effort principle is embedded in today's interconnection agreements across IP-networks taking the form of transit and peering agreements."127

We must not overlook the obvious financial implications of interconnecting disparate networks. "Interconnection agreements do not just route traffic in the Internet, they route money."128 A healthy flow of money between end users and access ISPs is important to sustain infrastructure investment, consistent with concerns about potential market power abuses.129 Traditionally, interconnection agreements on the backbone were part of a relatively informal process of bargaining.130 Transit is where an ISP provides access to the entire Internet for its customers; peering is where two ISPs interconnect to exchange traffic on a revenues-neutral basis. The changing dynamics of Net interconnection economics include paid peering between content delivery networks (CDNs) and access ISPs.131

Interconnecting, then, is the baseline goal embedded in the Internet's architecture, creating incentives and opportunities for isolated systems to come together, and for edges to become embedded in tightly interconnected networks. Werbach has shown that interconnectivity creates both decentralizing and centralizing trends in the Internet economy, and both centripetal force (pulling networks and systems into the Internet commons) and centrifugal force (towards the creation of isolated gated communities).132 Thus far, however, "the Internet ecosystem has managed to adapt IP interconnection arrangements to reflect (inter alia) changes in technology, changes in (relative) market power of players, demand patterns, and business models."133

At least 250 technical interoperability standards are involved in the manufacture of the average laptop computer produced today. INTEROP, at 160, 163.
124 DeNardis, "Internet Governance," at 13.
125 Solum, Layers Principle, at 107.
126 Solum, at 107.
127 BEREC, IP-interconnection, at 5.
128 David Clark, William Lehr, Steven Bauer, Interconnection in the Internet: the policy challenge, prepared for TPRC, August 9, 2011, at 2.
129 Clark, Lehr, and Bauer, Interconnection in the Internet, at 2.
130 Clark, Lehr, and Bauer, Interconnection in the Internet, at 3.
131 Clark, Lehr, and Bauer, Interconnection in the Internet, at 2. See also BEREC, IP-interconnection, at 48 (the emergence of CDNs and regional peering together have resulted in a reduced role for IP transit providers).
132 Whitt and Schultze, Emergence Economics, at 260 n.209; see also Whitt, Broadband Policy, at 522.
133 BEREC, An assessment of IP-interconnection in the context of Net Neutrality, Draft report for public consultation, May 29, 2012, at 48.
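To make the transit and peering economics described above more concrete, here is a minimal sketch comparing the two arrangements. All prices, fees, and traffic volumes are hypothetical placeholders of my own, not figures drawn from BEREC or from the Clark, Lehr, and Bauer paper.

```python
# Minimal sketch: comparing IP transit cost against settlement-free peering.
# All figures are hypothetical illustrations, not drawn from any cited agreement.

def transit_cost(traffic_mbps: float, price_per_mbps: float) -> float:
    """Transit: the customer ISP pays its provider per Mbps for reachability
    to the entire Internet."""
    return traffic_mbps * price_per_mbps

def peering_cost(port_fee: float, cross_connect_fee: float) -> float:
    """Settlement-free peering: no per-bit payment between the two networks,
    but each still bears its own port and exchange/cross-connect costs."""
    return port_fee + cross_connect_fee

if __name__ == "__main__":
    traffic = 5_000          # Mbps exchanged with a particular network (hypothetical)
    transit = transit_cost(traffic, price_per_mbps=1.50)
    peering = peering_cost(port_fee=1_200, cross_connect_fee=300)
    print(f"Transit: ${transit:,.0f}/month")
    print(f"Peering: ${peering:,.0f}/month")
    # A network might peer once enough traffic is exchanged that the fixed peering
    # costs fall below the per-Mbps transit bill -- one simple reason why
    # interconnection agreements "route money" as well as traffic.
```

The crossover point, where exchanged traffic makes fixed peering costs cheaper than a metered transit bill, is one simple way to see why paid peering and CDN arrangements have reshaped interconnection economics.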
  • 4. Agnostic Protocols: IPRFC 1958 calls IP ―the tool‖ for making the Internet what it is.134 The design of theInternet Protocol (―IP‖), or the ―how,‖ allows for the separation of the networks from theservices that ride on top of them. IP was designed to be an open standard, so that anyonecould use it to create new applications and new networks. By nature, IP is completelyindifferent to both the underlying physical networks, and to the countless applicationsand devices using those networks. In particular, IP does not care what underlyingtransport is used (such as fiber, copper, cable, or radio waves), what application it iscarrying (such as browsers, e-mail, Instant Messaging, or MP3 packets), or what contentit is carrying (text, speech, music, pictures, or video). Thus, IP enables any and all userapplications and content. ―By strictly separating these functions across a relatively simplyprotocol interface the two parts of the network were allowed to evolve independently butyet remain connected.‖135In 1974, Vint Cerf and Robert Kahn issued their seminal paper on the TCP protocol suite,in which the authors ―present a protocol design and philosophy that supports the sharingof resources that exist in different packet switching networks.‖136 IP later was split off in1977 to facilitate the different functionality of the two types of protocols. Based in largepart on how Cerf and Kahn designed the Internet architecture, the Internet Protocol hasbecome a wildly successful open standard that anyone can use. By 1990, whenARPANET was finally decommissioned, TCP/IP had supplanted or marginalized otherwide-area computer network protocols worldwide, and the IETF has overseen furtherdevelopment of the protocol suite. Thus, IP was on the way to becoming the bearerservice for the Net.137TCP and IP make possible the Net‘s design as general infrastructure. 138 IP is the singleprotocol that constitutes the ―Internet layer‖ in the OSI stack, while TCP is one of theprotocols in the ―transport layer.‖ To higher layers, IP provides a function that isconnectionless (each datagram is treated independent from all others) and unreliable(delivery is not guaranteed) between end hosts. By contrast, TCP provides a reliable andconnection-oriented continuous data stream within an end host.139 IP also provides bestefforts delivery because, although it does its best to deliver datagrams it does not provideany guarantees regarding delays, bandwidth, or losses.140On the Internet, TCP/IP are the dominant uniform protocols (UDP is a parallel to TCP,and heavily used today for streaming video and similar applications). Because they arestandardized and non-proprietary, the things we can do on top of them are incrediblydiverse. ―The system has standards at one layer (homogeneity) and diversity in the ways134 RFC 1958, at 2.135 Doria, Study Report, at 11.136 Whitt and Schultze, Emergence Economics, at n.212.137 Leiner et al, The Past and Future History of the Internet, at 106.138 Leiner et al, The Past and Future History of the Internet, at 104.139 VAN SCHEWICK, INTERNET ARCHITECTURE, at Part II, Ch. 4.140 VAN SCHEWICK, INTERNET ARCHITECTURE, at Part II, Chapter 4. 20
  • that ordinary people care about (heterogeneity).‖141About the ARPANET, RFC 172 states that ―assume nothing about the information andtreat it as a bit stream whose interpretation is left to a higher level process, or a user.‖ 142That design philosophy plainly carried over to the Internet. As Barwolff puts it, IPcreates ―the spanning layer‖ that creates ―an irreducibly minimal coupling between thefunctions above and below itself.‖143 Not only does IP separate the communicationspeers at either end of the network, it generally maintains a firm separation between theentities above and below it.144 This is another example of how two discrete elements, inthis case modular design and agnostic protocols, work closely together to create adistinctive set of network functions. IP also interconnects physical networks throughrouters in the networks. Moreover, Frischmann believes that TCP/IP actually implementsthe end-to-end design.145 C. The End Result: A Simple, General, and Open Feedback NetworkThese four fundamental architectural components of the Internet are not standalones orabsolutes; instead, they each exist, and interact, in complex and dynamic ways, along acontinuum. Together the four design attributes can be thought of as creating a networkthat is, in David Clark‘s formulation, simple, general, and open.146 At the same time, thedifferent layers create the logical traffic lanes through which the other three attributestravel and are experienced. So in that one sense modularity provides the Internet‘sfoundational superstructure.We must keep in mind that these four attributes describe the Internet in its nativeenvironment, with no alterations or impediments imposed by other agents in the largerecosystem. Where laws or regulations, or other activities, would curtail one or more ofthe design attributes, the Net becomes less than the sum of its parts. It is only when thedesign features are able to work together that we see the full emergent phenomenon ofthe Net. In this paper, I will use the term ―integrity‖ to describe how the design elementsfit together and function cohesively to create the user‘s overall experience of the Internet.Every design principle, instantiated in the network, has its drawback, and itscompromises. Technical improvements are a given.147 The Internet certainly could bemore simple (or complex), more general (or specialized), more open (or closed). 148 Nor141 INTEROP, at 108.142 RFC 172, at 6.143 Barwolff, at 136.144 Barwolff, at 137.145 FRISCHMANN, INFRASTRUCTURE, at 320. Frischmann also observes that the e2e concept is foundin many infrastructure systems. See also Whitt and Schultze, Emergence Economics, at 260-61 (IP wasdesigned to follow the e2e principle).146 [citation]147 Whitt, Broadband Policy, at 453.148 As one example, David Clark reports that the decision to impose the datagram model on the logicallayers deprived them of an important source of information which they could use in achieving the lowerlayer goals of resource management and accountability. Clark, Design Philosophy, at 113. Certainly afuture version of the Net could provide a different building block for the datagram. Id. 21
is the Internet an absolutely neutral place, a level playing field for all comers.

The design features reinforce one another. For example, the layering attribute is related to the end-to-end principle in that it provides the framework for putting functionality at a relative edge within the network's protocol stack.149 RFC 1958 states that keeping the complexity of the Net at the edges is ensured by keeping the IP layer as simple as possible.150 Putting IP in a central role in the Internet is also "related loosely to layering."151 At the same time, the "best efforts" paradigm is "intrinsically linked" to the nature of IP operating in the transmission network,152 because IP defines passing packets on a best efforts basis.153 Further, "TCP/IP defines what it means to be part of the Internet."154 Certainly the combination of the four design attributes has allowed end users to utilize the Net as a ubiquitous platform for their activities.155

The end result is that IP helps fashion what some have called a "virtuous hourglass" from disparate activities at the different network layers. In other words, the Net drives convergence at the IP (middle) layer, while at the same time facilitating divergence at the physical networks (lower) and applications/content (upper) layers. The interconnected nature of the network allows innovations to build upon each other in self-feeding loops. In many ways layering is the key element that ties it all together.

As the networks and users that comprise it continue to change and evolve, the Net's core attributes of modularity, e2e, interconnectivity, and agnosticism are constantly being pushed and prodded by technology, market, and legal developments. That is not to say these developments inherently are unhealthy. Clearly there are salient exceptions to every rule, if not new rules altogether. The Internet needs to be able to adjust to the realities of security concerns like DoS attacks, and the needs of latency-sensitive applications like streaming video and real-time gaming. The question is not whether the Net will evolve, but how.156

III. Internet Design as Foundational, Dynamic, and Collective

The first part of this paper seeks to respond to a basic question: What types of technologies and technical design features afford the greatest potential to drive user benefits? In the previous section we took a relatively micro view, attempting to isolate and describe the Internet's four fundamental design attributes. This section takes the discussion to a more macro-level perspective on the Internet as a whole. Depending on who you ask – a technologist, a scientist, or an economist – the answer to that question is the same: the Internet. Whether as a general platform technology, a complex adaptive

149 Doria, Study Report, at 11.
150 RFC 1958, at 3. The layer principle is related to, but separate from, the broad version of the end-to-end principle. VAN SCHEWICK, INTERNET ARCHITECTURE, at Part II, Chapter 4.
151 Doria, Study Report, at 11.
152 BEREC, IP-interconnection, at 4.
153 SEARLS, INTENTION ECONOMY, at Chapter 9. "Best effort" is what IP requires. Id. at Chapter 14.
154 Werbach, Breaking the Ice, at 194.
155 Whitt and Schultze, Emergence Economics, at 300-01.
156 Whitt and Schultze, Emergence Economics, at 262.
  • system, or a common pool resource, the Net serves as the ideal platform to promote andenhance a myriad of human activities. In brief the Net‘s basic design enables massivespillovers, emergent phenomena, and shared resources. A. The Internet as General Platform TechnologyOne potential answer to that question has been articulated in the ongoing research on―General Purpose Technologies‖ (―GPTs‖). A GPT is a special type of technology thathas broad-ranging enabling effects across many sectors of the economy. Technologiststypically define a GPT as a generic technology that eventually comes to be widely used,to have many uses, and to have many spillover effects.157The foundational work on GPTs was first published by Timothy Bresnahan and ManuelTrajtenberg in 1992. They describe how this particular type of technology is most likelyto generate increasing returns in line with economist Paul Romer, with growth comingfrom specific applications that depend on ideas in the ―general‖ layer of technology.Specifically, GPTs play a role of ―enabling technologies‖ by opening up newopportunities rather than offering complete, final solutions. The result, as they found it, is―innovational complementarities,‖ meaning that ―the productivity of R&D in adownstream sectors increases as a consequence of innovation in the GPT technology.These complementarities magnify the effects of innovation in the GPT, and helppropagate them throughout the economy.‖158The Internet has been labeled a GPT, with ―the potential to contribute disproportionatelyto economic growth‖ because it generates value ―as inputs into a wide variety ofproductive activities engaged in by users.‖159 Concurrently the Net is an infrastructureresource that enables the production of a wide variety of private, public, and socialgoods.160 As the early Net pioneers see it, ―The Internet was not designed for just oneapplication but as a general infrastructure on which new applications could be conceived,exemplified later by the emergence of the Web. The general-purpose nature of theservice provided by TCP and IP made this possible.‖161The GPT literature demonstrates that Internet technologies share key features of a GPT,all of which help make the Net an enabling technology. These features include:widespread use across key sectors of the economy and social life; represents a great scopeof potential improvement over time; facilitates innovations generating new products andprocesses; and shows strong complementarities with existing and emergingtechnologies.162By its nature a GPT maximizes the overall utility to society. Lipsey observes that GPTs157 Whitt and Schultze, Emergence Economics, at 276.158 Whitt and Schultze, Emergence Economics, at 276 n.289.159 Whitt and Schultze, Emergence Economics, at 277 n.294.160 FRISCHMANN, INFRASTRUCTURE, at 334.161 Leiner et al, The Past and Future History of the Internet, at 104.162 Hofmokl, at 241. 23
  • help ―rejuvenate the growth process by creating spillovers that go far beyond the conceptof measurable externalities,‖ and far beyond those agents that initiated the change.163 Thishas important implications when trying to tally the sum total of beneficial value andactivity generated by the Internet.164 GPT theory emphasizes ―the broadercomplementarity effects of the Internet as the enabling technology changing thecharacteristics, as well as the modes of production and consumption of many othergoods.‖165Perhaps the most important policy-related takeaway about GPTs is that keeping them―general‖ is not always in the clear interest of firms that might seek to control them. Acorporation might envision greater profits or efficiency through making a tremendouslyuseful resource more scarce, by charging much higher than marginal cost, or bycustomizing solely for a particular application. While these perceptions might be true inthe short term, or for that one firm‘s profits, they can have devastating effects for growthof the economy overall. The more general purpose the technology, the greater are thegrowth-dampening effects of allowing it to become locked-down in the interest of aparticular economic agent.166 The important feature of generative platforms, such as theInternet, is that users easily can do numerous things with them, many of which may nothave been envisioned by the designers. If, for example, the Internet had been built solelyas a platform for sending email, and required retooling to do anything else, mostapplications and business models never would have developed.167 B. The Internet as Complex Adaptive SystemIn addition to serving as a GPT, the Internet is also a complex adaptive system (―CAS‖),whose architecture is much richer than the sum of its parts. As such, the microinteractions of ordinary people on the Internet lead to macro structures and patterns,including emergent and self-organizing phenomena.Complexity can be architectural in origin. It is believed that the dense interconnectionswithin the ―network of networks‖ produce the strongly non-linear effects that are difficultto anticipate.168 Engineers understand that more complex systems like the Internetdisplay more non-linearities; these occur (are ―amplified‖) at large scales which do notoccur at smaller scales.169 Moreover, more complex systems often exhibit increasedinterdependence between components, due to ―coupling‖ between or within protocollayers.170 As a result, global human networks linked together by the Internet constitute acomplex society, where more is different.171163 Whitt and Schultze, Emergence Economics, at 280.164 Whitt and Schultze, Emergence Economics, at 279-280.165 Hofmokl, The Internet commons, at 241.166 Whitt and Schultze, Emergence Economics, at 277.167 Whitt and Schultze, Emergence Economics, at 277-78.168 de La Chappelle, Multistakeholder Governance, at 16.169 RFC 3439, at 4.170 RFC 3439, at 5.171 de La Chappelle, at 17. 24
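The idea that macro-level structure arises from micro-level interactions can be made concrete with a toy simulation. The sketch below is purely illustrative; the network size, neighbor count, adoption threshold, and seed values are arbitrary assumptions of my own, not drawn from the text. Agents on a random graph adopt a protocol only when enough of their neighbors already have, and a system-wide pattern emerges without any central control.

```python
# Minimal sketch of emergence from local interactions: agents on a random network
# adopt a protocol only when enough neighbors already have. No agent sees the whole
# system, yet a macro-level pattern (widespread adoption or isolated clusters) emerges.
import random

random.seed(42)
N, NEIGHBORS, THRESHOLD, SEEDS = 200, 4, 0.25, 5

# Build a random undirected graph.
links = {i: set() for i in range(N)}
for i in range(N):
    while len(links[i]) < NEIGHBORS:
        j = random.randrange(N)
        if j != i:
            links[i].add(j)
            links[j].add(i)

adopted = set(random.sample(range(N), SEEDS))  # a few early adopters
changed = True
while changed:
    changed = False
    for i in range(N):
        if i not in adopted:
            share = sum(1 for j in links[i] if j in adopted) / len(links[i])
            if share >= THRESHOLD:   # micro rule: follow your neighbors
                adopted.add(i)
                changed = True

print(f"{len(adopted)} of {N} agents adopted the protocol")  # macro outcome
```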
As scientists are well aware, emergence is not some mystical force that magically comes into being when agents collaborate. Emergent properties are physical aspects of a system not otherwise exhibited by the component parts. They are macro-level features of a system arising from interactions among the system's micro-level components, bringing forth novel behavior. Characteristics of emergent systems include micro-macro effects, radical novelty, coherence, interacting parts, dynamical behavior, decentralized control, bi-directional links between the macro- and micro-levels, and robustness and flexibility.172 The brain is an example of a CAS: the single neuron has no consciousness, but a network of neurons brings forth, say, the perception of and appreciation for the smell of a rose.

Similarly, when agents interact through networks, they evolve their ways of doing work and discover new techniques. Out of this combined activity, a spontaneous structure emerges. Without any centralized control, emergent properties take shape based on agent relationships and the conditions in the overall environment. Thus, emergence stems from the behavior of agents, system structures, and exogenous inputs.173

Emergent systems exist in an ever-changing environment and consist of complex interactions that continuously reshape their internal relationships. The many independent actions of agents unify, but they do not necessarily work toward one particular structure or equilibrium. For example, emergent systems can be robust to change, and they can be far better at evolving toward efficiency than top-down systems. On the other hand, emergent structures can fall apart when their basic conditions are altered in such a way that they work against the health of the system as a whole. The line between emergence-fostering actions and emergence-stifling actions sometimes can be difficult to discern.174

C. The Internet as Common Pool Resource

A third perspective on the Internet comes to us from modern economic theory, where the Net is seen by many as a "common pool resource" or CPR. The term "commons" has had many uses historically, almost all contested.175 Elinor Ostrom has defined it simply as "a resource shared by a group of people and often vulnerable to social dilemmas."176 Yochai Benkler states that "the commons refer to institutional devices that entail government abstention from designating anyone as having primary decision-making power over use of a resource."177 The two principal characteristics that have been utilized widely in the analysis of traditional commons are non-excludability and joint (non-rivalrous) consumption.178

172 Whitt and Schultze, Emergence Economics, at 248 n.141.
173 Whitt and Schultze, Emergence Economics, at 247-48.
174 Whitt and Schultze, Emergence Economics, at 248-49.
175 Hess and Ostrom, Ideas, Artifacts, and Facilities, at 115.
176 Hofmokl, The Internet commons, at 229, quoting Ostrom (2007), at 349.
177 Benkler, The Commons as a Neglected Factor of Information Policy, Remarks at TPRC (September 1998).
178 Justyna Hofmokl, The Internet commons: towards an eclectic theoretical framework, International Journal of the Commons, Vol. 4, No. 1, 226 (February 2010), at 227. See also LEWIS HYDE, COMMON AS AIR (20 ) (Commons is a kind of property – including the rights, customs, and institutions that preserve its communal use – in which more than one person has rights).
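As a quick illustration of how those two characteristics are conventionally combined, the sketch below encodes the standard economics two-by-two classification of goods. The example goods in the comments are my own illustrative assumptions, not taken from Hofmokl, Ostrom, or Benkler.

```python
# Minimal sketch of the goods classification implied by the two characteristics
# named above: excludability and rivalry of consumption. Labels follow the
# conventional economics 2x2; the examples are illustrative assumptions only.

def classify_good(excludable: bool, rivalrous: bool) -> str:
    if excludable and rivalrous:
        return "private good"          # e.g., a last-mile access line
    if excludable and not rivalrous:
        return "club/toll good"        # e.g., a subscription content service
    if not excludable and rivalrous:
        return "common-pool resource"  # e.g., shared, congestible capacity
    return "public good"               # e.g., an open protocol standard

print(classify_good(excludable=False, rivalrous=False))  # public good
print(classify_good(excludable=False, rivalrous=True))   # common-pool resource
```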
  • In turn, a CPR has been defined as ―a natural or man-made resource system that issufficiently large as to make it costly (but not impossible) to exclude potentialbeneficiaries from obtaining benefits for its use.‖179 Ostrom and Hess have concludedthat cyberspace is a CPR, similar as a resource to fishing grounds, grazing lands, ornational security that is constructed for joint use. Such a system is self-governed andheld together by informal, shared standards and rules among a local and global technicalcommunity. This is so even as the resource units themselves – in this case data packets –typically are individually owned.180Frischmann points out that traditional infrastructures are generally managed as commons,which well fits their role as a shared means to many ends.181 In the United States andelsewhere, government traditionally plays the role of ―provider, subsidizer, coordinator,and/or regulator‖ of infrastructure.182 Studies written about the Internet as a CPR tend tofocus on the technology infrastructure and the social network issues, rather than theinstitutions developed about the distributed information per se.183 However, Bernbombelieves that many of the design elements of the Internet create the basic rules formanaging it as a commons.184Much of the thinking about the Internet as a CPR glosses over the functional specifics,which is a correctable mistake. Contrary to how some describe its resource role, theInternet is actually a complicated blend of private goods and public goods, with varyingdegrees of excludability and joint consumption. Hofmokl does an excellent job analyzingthe physical, logical, and content layers of the Internet, pointing out where its attributesas a resource match up to those of the commons.185 In particular the physical layers(physical networks and computing devices) and the content layers (digitized information)are mostly pure private goods, showing excludability combined with rivalrousconsumption.186However, when we are talking about the design attributes of the Internet, the elements weare focused on – the technical standards and protocols, including TCP-IP-HTTP, thatdefine how the Internet and World Wide Web function -- all constitute exclusively publicgoods, free for everyone to use without access restrictions.187 Further, as Frischmannduly notes, the Net‘s logical infrastructure – the open, shared protocols and standards --179 ELINOR OSTROM, GOVERNING THE COMMONS: THE EVOLUTION OF INSTITUTIONS FORCOLLECTIVE ACTION (1990), at __.180 Hess and Ostrom, Ideas, Artifacts, and Facilities, at 120-21.181 FRISCHMANN, INFRASTRUCTURE, at 3-4.182 FRISCHMANN, INFRASTRUCTURE, at 4.183 Hess and Ostrom, Ideas, Artifacts, and Facilities: Information as a Common-Pool Resource, Law andContemporary Problems, Vol. 66:111 (Winter/Spring 2003), at 128.184 Gerald Bernbom, Analyzing the Internet as a Common Pool Resource: The Problem of NetworkCongestion, Pre-Conference Draft, International Association for the Study of Common Property, IASCP2000, April 29, 2000, at 5.185 Hofmokl, The Internet commons, at 232-238.186 Hofmokl, The Internet commons, at 232-238. Hofmokl calls this ―a dual structure within the Internet, ofcommercial and free access segments.‖ Id. at 232.187 Hofmokl, The Internet commons, at 235-36. 26
are managed as commons.188 So the key architectural components of the Net constitute a common pool resource, managed as a commons, even if many of the network's actual component parts – individual communications networks, proprietary applications and content, etc. – are private goods, or a blend of private and public goods.189 The Net's design attributes are what make it a commons resource.

*********************

Like a funhouse mirror, each of the three cross-functional perspectives sketched out above is a partially correct reflection of reality, and yet remains incomplete without the others. As a GPT, the Net serves a vital function as a general, foundational platform for many people. As a CAS, the Net presents emergent properties, often with dynamic and unanticipated consequences. As a CPR, the Net provides a shared resource for all to utilize for a mix of purposes and ends. From its micro-functions, the Internet generates the macro-phenomena of a general platform, a complex system, and a shared resource.

IV. The Emergence of Net Effects: Benefits and Challenges

It seems almost a truism to point out that the Internet on the whole has done some very good things for modern society. In his recent book Infrastructure, Brett Frischmann does an admirable job explicating a lengthy list of such benefits.190 The more interesting point is to figure out exactly why that would be the case. Using the power of abductive reasoning (from effect to cause),191 we can determine that the very design attributes outlined above have led to a raft of benefits for users, as well as some challenges. In other words, we can safely presume that the modular, end-to-end, interconnected, and agnostic Internet provides real economic and social "spillover" value.192 That means the Internet's social returns exceed its private returns, because society realizes benefits above and beyond those realized by individual network providers and users.193

Frischmann explains in some detail precisely how infrastructure generates spillovers that result in large social gains.194 In particular, managing the Internet's infrastructure as a commons sustains a spillovers-rich environment.195 Here are a few of the more important economic, social, and personal gains from the Internet's design attributes.

188 FRISCHMANN, INFRASTRUCTURE, at 320 n.10.
189 For example, Bernbom divides the Internet into the network commons, the information commons, and the social commons. However, the notion that each of the Net's resources has the characteristics of a CPR seems to ignore the private goods nature of many of them. See Bernbom, Analyzing the Internet as a Common Pool Resource, at 1-2.
190 FRISCHMANN, INFRASTRUCTURE, at 317.
191 DANIEL W. BROMLEY, SUFFICIENT REASON, VOLITIONAL PRAGMATISM AND THE MEANING OF ECONOMIC INSTITUTIONS (2006).
192 Whitt and Schultze, Emergence Economics, at 297.
193 FRISCHMANN, INFRASTRUCTURE, at 12.
194 FRISCHMANN, INFRASTRUCTURE, at 5.
195 FRISCHMANN, INFRASTRUCTURE, at 318.
  • A. Engine of InnovationIdeas are the raw material for innovation. In the ordinary transformational cycle, ideasbecome concepts, which become inventions, which are utilized for commercial or otherpurposes. They are the recipes for combining atoms and bits into useful things. While thephysical components are limited, the ideas themselves essentially are unlimited –characterized by increasing returns, continued re-use, and ease of sharing. Innovation, bycontrast, is the application of ideas—invention plus implementation. Ideas and innovationform an essential feedback cycle, where input becomes output, becomes input again.196If there is any one business lesson of the last decade that has acquired near-universalempirical support and expert agreement, it is this: innovation is a good thing. Thecreation of new and different objects, processes, and services are at the heart of anyrational conception of economic growth and the fulfillment of human potential. Nomatter what you call it—creativity, entrepreneurism, novelty, ingenuity—the globaleconomy feeds on the constant infusion of the products of innovation.197 Theproliferation of new ideas and inventions, channeled through generative networks ofagents, provides powerful fuel for economic growth and other important emergenteffects.198Network architectures affect economic systems, by both enabling and constrainingcertain behaviors.199 Barbara van Schewick has explained the strong linkage between theway the Net has been constructed – including modularity, layering, and a broad versionof the end-to-end principle -- and the prevalence of innovation.200 Not surprisingly, theInternet as a networked platform helps enable all the attributes of an innovativeenvironment. Generally speaking, a greater ability of agents to connect and explore newmodes of production will facilitate the contingent connections that a top-down designerlikely will not foresee. Better global information sharing and feedback between agentsfacilitates better local decisions. The system as a whole can take a leap forward whennew innovations emerge from this process and are replicated throughout the network bywilling agents. As a result, the Internet serves as a particularly effective innovationengine, rapidly developing, diffusing, and validating scores of novel inventions.Indeed, numerous empirical studies show conclusively the types of institutionalorganizational and networked environments within which innovation actually thrives. Inbrief, innovation tends to flow from:-- the users, not the consumers or providers;-- from the many, not the few;-- from the connected, not the isolated;-- from individuals and small groups, not larger organizations;196 Whitt and Schultze, Emergence Economics, at 278-284.197 Whitt and Schultze, Emergence Economics, at 267.198 Whitt, Adaptive Policymaking, at 494.199 VAN SCHEWICK, INTERNET ARCHITECTURE, at Part I.200 VAN SCHEWICK, INTERNET ARCHITECTURE, at Part III. 28
  • -- from the upstarts, not the established;-- from the decentralized, not the concentrated;-- from the flat, not the hierarchical; and-- from the autonomous, not the controlled.201Innovation is produced from those users motivated by many incentives, including profit,pride, and personal fulfillment. There is also a separate ―demand side‖ perspective toinnovation, based on extensive research showing that ―venturesome‖ consumers adoptingand using technology are crucial to maintaining economic prosperity.202The Internet provides the enabling background conditions for the creation anddissemination of innovation, and feedback loops: open, connected, decentralized,autonomous, upstarts, etc. Commentators have observed the strong correlation betweenrobust, ends-oriented innovation and the architecture of the Internet. 203 Lee McKnightnotes that ―the Internet works its magic through rapid development and diffusion ofinnovations.‖ The Internet Protocol acts as a ―bearer service‖—the general purposeplatform technology linking technologies, software, services, customers, firms, andmarkets—so that the Internet is ―an innovation engine that enables creation of aremarkable range of new products and services.‖204 Michael Katz believes that ―[t]hehourglass architecture allows innovations to take place at the application and transportlayers separately. This ability for independent innovation speeds the rate of innovationand increases the ability of entrepreneurs to take advantage of new opportunities.‖205In functional terms, one can envision the open interface to the Internet Protocol servingas the virtual gateway to its functionality, leaving all the applications and content andservices residing in the higher layers free to evolve in a vast number of ways. B. Spur to Economic GrowthEven a cursory review of contemporary economic statistics shows that the Internet hasbeen and continues to be a real boon for global economies. As one example, theMcKinsey study ―Internet Matters‖ shows that over the past five years the Internetaccounts for over one-fifth of GDP growth in mature countries.206 That same reportexplains that, for every job ―lost‖ to the Internet, some 2.6 new jobs are created. 207 TheNet also increases productivity by smaller businesses by at least ten percent, and enablesthem to export twice as much as before.208 Further for every ten percentage pointsincrease in broadband penetration (which of course enables high-speed Internet access),from 0.9 to 1.5 percent is added to per capita GDP growth, with similar increases in labor201 Whitt and Schultze, Emergence Economics, at 267-68.202 Whitt and Schultze, Emergence Economics, at 268.203 Whitt and Schultze, Emergence Economics, at 269.204 [citation]205 [citation]; Whitt, Horizontal Leap, at 630 n.160.206 McKinsey & Company, Internet Matters (2011), at __.207 McKinsey, Internet Matters, at __.208 McKinsey, Internet Matters, at __. 29
  • productivity over the following five years.209 Indeed, if it were a national economy, theInternet would rank in the world‘s top five.210Why should this be the case? Economic growth is measured according to how muchvalue a nation is perceived to produce for each citizen. Growth in economic output perperson is the most important measure and determinant of economic performance.Growth arises from the discovery of new recipes, and the transformation of things fromlow-value to high-value configurations. In shorthand, it is turning ordinary sand intosemiconductors.211 Paul Romer explains it this way: Economic growth occurs whenever people take resources and rearrange them in ways that are more valuable…. Human history teaches us, however, that economic growth springs from better recipes, not just from more cooking. New recipes generally produce fewer unpleasant side effects and generate more economic value per unit of raw material.212New Growth Theory reminds us that growth flows from within the system itself, and isdirectly and profoundly affected by conscious decisions made by economic actors. AsSusan Crawford puts it in the context of networked economies, ―[t]he economic growth-based . . . [story] is straightforward: the greatest possible diversity of new ideas that willsupport our country in the future will come from the online world, because of its specialaffordances of interactivity, interconnectivity, and unpredictable evolution.‖ C. Conduit for Free Flow of InformationHuman communications are critical to the very fabric of our civilization.Communications is all about broadly accessible connectivity. In the past connectivitywas enabled by the postal system, telegraph and telephony networks, book publishers,and other platforms. Today the broadband telecom sector provides increasinglyubiquitous core infrastructure that supports most aspects of our society. Innovation incommunications and the organization of information fosters educational, political, andsocial development, and reduces the transaction costs for conveying and exchangingideas.As we have seen, ideas are the fodder, the raw material, for innovation and economicgrowth and other beneficial Net effects. The free flow of information between andamong people can lead directly to a raft of business plans, physical technologies, andsocial technologies that compete vigorously and effectively in the marketplace. Aboveand beyond the economic impact of course, the free flow of information facilitates everyform of information, entertainment, and political discourse. The open dissemination ofand access to information through the Internet can play an especially critical role indeepening the forces of democracy.209 [citation to 2009 broadband report]210 BCG (2012).211 Whitt and Schultze, Emergence Economics, at 270.212 [citation]; Whitt and Schultze, Emergence Economics, at 270. 30
  • Virtually all countries benefit from new ideas created throughout the world. The openflow of information helps ensure that an idea engendered in one place can have a globalimpact. Because of the nonrivalry and increasing returns of ideas, expansion of theworld‘s stock of knowledge drives the underlying rate of growth in knowledge in everycountry that is exposed to it. Ideas equal human growth, and all its emergent benefits.213Ideas are understood to be a classic public good; we all can benefit from usefulinventions. An adaptive society must find and maintain the means to explore new ideas.Mechanisms generating new ideas, which in human society are expressed culturally andpolitically, are as important as access to abundant resources for economic growth andeconomic adaptation. Ideas also are the currency of cyberspace. The availability of aubiquitous communications and information platform in the form of the Internet enablesusers to promulgate and share ideas.214 As a result, the concept of ―More Good Ideas‖can serve as a proxy for maximizing the Internet‘s end-to-end benefits, and hence the freeflow of information.215 D. Tool for User Empowerment and Human FlourishingIt is difficult to estimate the full social value of the Internet.216 Studies show a positivecorrelation between Internet penetration, life satisfaction, overall happiness, and socialtrust.217 This should not be surprising. As we have already seen, the Internet enablespeople to become not just better consumers, but enabled entrepreneurs, innovators, andcitizens. In a highly networked economy, the benefits of innovation in physical andsocial technologies go far beyond traditional economic growth, and generate a diversityof other material and non-material benefits.Julie Cohen has examined the structural conditions necessary for human flourishing in anetworked information environment. She has shown that three elements are necessaryfor such flourishing: access to knowledge (networked information resources), operationaltransparency about networked processes, and what she calls ―semantic discontinuity,‖ ora flexible regulatory architecture that allows the play of everyday practice.218 TheInternet, and its design architecture, hold the potential and promise to enable all threeconditions.Barbara van Schewick has found that the Internet‘s architecture – and in particular thebroad version of the end-to-end principle – helps the network enhance individualfreedom, provide for improved democratic participation, and foster a more critical andself-reflective culture.219 User empowerment is ―a basic building block‖ of the Internet,213 Whitt, Adaptive Policymaking, at 543.214 Whitt, Adaptive Policymaking, at 549.215 Whitt, Broadband Policy, at 437-38.216 FRISCHMANN, INFRASTRUCTURE, at 336. The author then proceeds at some length to do so. Id.at 336-345.217 Penard and Suire, ―Does the Internet make people happy?‖ (2011).218 JULIE COHEN, CONFIGURING THE NETWORKED SELF (200 ).219 VAN SCHEWICK, INTERNET ARCHITECTURE, at __. 31
  • and should remain embedded in all mechanisms whenever possible.220 In fact, ―makingsure that the users are not constrained in what they can do is doing nothing more thanpreserving the core design tenet of the Internet.‖221 Not only do ―Net benefits‖ help fuelcore growth, they can have profound positive impacts on human life.222Yochai Benkler‘s The Wealth of Networks lays out the case for social production.223 Thewealth of networks is in their potential for widespread participation in making, sharing,and experiencing information. He argues that social production serves important values,including autonomy, democratic participation in the political sphere, construction ofculture, justice and human development, and community. He also points out that―advanced economies rely on non-market organizations for information production muchmore than they do in other sectors.‖ Benkler argues that ―the basic technologies ofinformation processing, storage, and communication have made nonproprietary modelsmore attractive and effective than was ever before possible.‖ Among other things, thisallows new ―patterns of social reciprocity, redistribution, and sharing . . . .‖ In asufficiently open and ubiquitous network, the benefits of the traditional firm can apply toall individuals.Some also see in the Internet the rise of ―networked individualism,‖ where individualscan project their views via a new social operating system.224 Doc Searls makes somerelated points in The Intention Economy. He describes the rise of vendor relationshipmanagement (VRM), due to the demand-side potential of ―free customers‖ withpersonalized demand.225 Susan Crawford takes an expansive view of the non-pecuniarybenefits of the Internet as an ideas and innovation platform that enables humaninteractivity. She sees the Net as allowing ―innovation in social relationships at a systemlevel,‖ which goes beyond seeing the ―content‖ layer of the Internet as the ―social layer.‖The existence of such a social layer promotes diversity, the democratization ofinformation (in creation, distribution, and access), and the decentralization of democracy.Interestingly, some have posited that the basic foundations of the peer production conceptare embedded in general purpose technologies (GPT) theory.226 Hofmokl claims thattraditional hierarchies are not necessary to coordinate the provision of goods in aninformation commons.227 Instead, what he calls the ―architecture of participation‖ isenabled by the Internet, making such cultural exchange possible on a massive scale, withmaximum flexibility and efficiency.228 He cites in particular both the end-to-endprinciple and the open TCP/IP network protocol, which were ―made to share, not to220 Clark, Sollins, Wrocklawski, and Braden, Tussles in Cyberspace, at __.221 Id.222 Whitt and Schultze, Emergence Economics, at 262-63.223 YOCHAI BENKLER, THE WEALTH OF NEWORKS (20 ).224 LEE RAINIE AND BARRY WELLMAN, NETWORKED – THE NEW SOCIAL OPERATINGSYSTEM (2012).225 DOC SEARLS, THE INTENTION ECONOMY (200 ).226 Hofmokl, The Internet commons, at 245.227 Hofmokl, The Internet commons, at 245.228 Hofmokl, The Internet commons, at 244. 32
  • exclude.‖229 This architecture enables a participatory culture where the passive, staticrole of the audience is replaced by the creation and exchange of new media content.230 F. Net ChallengesOf course, as a reflection and heightening of human behavior, the Internet has otherfacets as well. Its design architecture also allows for at least three types of ―Netchallenges.‖ These are essentially new ways to do bad things, new variations on socialills, and new forms of business models. Some of these challenges are unique to the Net,while others constitute societal problems that exist offline as well. We must endeavor tokeep these three sets of challenges separate and distinct from one another.First, the Internet allows various ―bad actors‖ and ―bad actions.‖ As a faithful reflectionof humanity, the Net has its share of unseemly conduct and criminality. These bad actionsinclude lower-layer network-based ills, like spam and malware and DOS attacks, andupper layer ills, like child pornography and piracy of content. Zittrain says that the Net‘sgenerative nature, its openness and unpredictability, are precisely the qualities that allowworms, viruses, and malware to suffuse its networks.231 As Vint Cerf points out, ―Everylayer of the Internet‘s architecture is theoretically accessible to users and, inconsequences, users (and abusers) can exploit vulnerabilities in any of the layers.‖ 232 Asjust one example, interoperability of networks, ordinarily a plus, also increases risks tocybersecurity.233The point is not to ignore these very real policy challenges as if they will be miraculouslysolved by the Net‘s utopian design. These problems are very real, and they deserve to betaken seriously. Nor should we deal with them, however, in ways that ignore possiblecollateral damage to the rest of the Internet. Rather, we should want to tackle thechallenges with the right policy implements, so as not to do violence to the Internetdesign principles, and all the attendant Net benefits. Zittrain agrees that we need astrategy that ―blunts the worse aspects of today‘s popular generative Internet and PCwithout killing those platforms‘ openness to innovation.‖234Second, the Internet has introduced or heightened a variety of social ills, often rooted inthe psychology of human behavior. For example, Sherry Turkle worries that technologylike the Internet and networked devices substitute for real human connections.235 EvgenyMorozov believes that, contrary to the flawed assumptions of ―cyber-utopianism‖ aboutthe emancipatory nature of online communities, the Net too often empowers thepolitically strong and disempowers the politically weak.236 William Davidow warns that229 Hofmokl, The Internet commons, at 244.230 Hofmokl, The Internet commons, at 244.231 ZITTRAIN, THE FUTURE OF THE INTERNET, at 45-49.232 Vint Cerf, Internet Governance: A Centroid of Multistakeholder Interests, MIND, at 74.233 INTEROP, at 150.234 JONATHAN ZITTRAIN, THE FUTURE OF THE INTERNET (20 ), at 150.235 SHERRY TURKLE, ALONE TOGETHER: WHY WE EXPECT MORE FROM TECHNOLOGY (20__). ―We seem determined to give human qualities to objects and content to treat each other as things.‖236 EVGENY MOROZOV, THE NET DELUSION: THE DARK SIDE OF INTERNET FREEDOM 33
  • the Internet creates an overconnected environment that becomes unpredictable, accident-prone, and subject to contagions; in his view, the financial crisis actually was acceleratedand amplified by the Net.237 It is not obvious that these types of concerns warrant directgovernment or other intervention, but it is useful to separate them out from the other―parade of horribles‖ about the pernicious impact of the Internet.Third, the Internet is a highly disruptive platform that already has undermined entire linesof business, even as new ones are being engendered. Economic and social value isshifting away from incumbent business models, with innovation and growth increasinglyarising from users at the edges of the Net. For example, low barriers to entry plus digitaltechnology plus pervasive Net access equals many different ways for traditional goodsand services to be found, produced, obtained, and consumed. In such circumstances, it isnot surprising that incumbents will resist technological change, such as that wrought bythe Internet.238 However, such seismic shifts should not be assumed to be examples ofbad actions by bad actors, and thus should not automatically be counted as policychallenges deserving a swift political response.We can turn briefly to online privacy as one example of an issue that many have arguedmerits some concerted political consideration. As it turns out, the Internet pioneers hadbeen debating the role of privacy in the network from the beginning. 239 Braman hasdiscovered that early decisions by those who designed the Net ―created a situation thatenabled the data mining‖ of concern to many.240 However, the Net‘s designers alsorecognized that ―there is a difference between reaching a consensus on general principles,such as the importance of protecting privacy, and reaching a consensus on the actualtechniques to be used.‖241 She also found ―a tension between establishing standards forprivacy protection and the need to minimize constraints on further experimentation andinnovation.‖242 Braman sums up the views of the Net design pioneers: ―They recognized that privacy is a multi-dimensional problem, that it arises at every stage of networking, and that it has to be revisited every time there is a change in technologies. They understood that the same user may hold conflicting views on privacy, depending on which activity is being undertaken and the role held. And they knew that the introduction of one technique for protecting privacy could open up other possible means of invading privacy.‖243When discussing the public policy world we have an unfortunate tendency to mix up thediagnosis and the remedy, the means and the ends. Policymakers need to have the proper(200_).237 WILLIAM H. DAVIDOW, OVERCONNECTED: THE PROMISE AND THREAT OF THEINYERNET (200_ ). Of course this can be seen as the flip side to the general benefits of networkinterconnection and interoperability.238 Whitt and Schultze, Emergence Economics, at 296.239 Sandra Braman, Privacy for Networked Computing, 1969-1979, at 1.240 Braman, Privacy, at 2.241 Braman, Privacy, at 3.242 Braman, Privacy, at 3.243 Braman, Privacy, at 17. 34
tools to sort out whether a particular policy situation constitutes a bad act that requires a tailored remedy, a magnified social ill that needs to be better understood, or a new business model that should be allowed to challenge the status quo. For appropriate guidance we must look to what the technologists, the scientists, and the economists say about the Internet – how its unique design architecture renders it in macro terms as a GPT, a CAS, and a CPR. In short, we need a grounded public policy framework for the Internet.

V. Deference Ahead: Respecting the Internet's Functional Integrity

My previous papers laid out the case for general restraint, flexibility, and creativity by policymakers when considering potential regulation of Internet-based activities. In essence, the sundry human activities that occur via the upper layers functions of applications, content, and services constitute dynamic, evolutionary processes unseen by previous generations. For the most part policymakers should be cautious about intruding unnecessarily into those processes, and instead should allow them to play out in an enabled but not overly prescriptive environment.

Here we are focusing on the functions that tend to occur in the middle layers, those software-derived protocols and standards that comprise the inner workings of the Internet. In these "logical layers," the prior note of caution and adaptability is transformed into a plea for due deference to the role and ongoing work of the designers. As Searls and Weinberger point out, by tampering with the Internet's core design aspects, policymakers "are actually driving down its value."244 This Part begins to fill in the remainder of the larger public policy framework by building the case for such deference to what I term the "integrity" of non-governmental processes and entities in the face of Net challenges. In Part VI we will explore how the impact (positive or negative) on the Internet can vary greatly, depending on the institutions and organizations employed in addressing the functional target. We need all three dimensions of targets, rules, and players fitting well together.

A. Revisiting the Layers Model

1. Matching functions to layers

A "layered" approach can serve as a mental frame that proves useful for policymakers and others thinking about the Internet and the myriad activities that occur within and between its different functional components. As an overarching conceptual tool, this modular model would, for example, replace the existing "silos" approach under the Communications Act of 1934, which treats regulated entities and their service offerings in the context of legacy industries.245

The layered model mirrors the actual functional architecture of the Internet, and the

244 Searls and Weinberger, "World of Ends," at __. In part this desire to tamper stems from the faulty assumption that the value of the Net's contents equals its actual value. Id.
245 Whitt, Adaptive Policymaking, at 563-567.
  • reality of the market we have today. A version that well combines simplicity andprecision is a four-layered approach: physical, logical, applications, andcontent/interactivity. I suggested that conceptual model back in 2004, and for manypurposes it is still a useful one. Most fundamentally, the first principle of the modularapproach is that there is network, and there is ―stuff‖ that rides on top of the network.246This paper delves more deeply into the different functions of the Net, and thus willrequire a more exacting layers model to guide our thinking. I will utilize a modifiedversion of the seven-layer OSI stack. To be clear, these seven layers are not uniformlyrecognized by technologists, nor do the functions I describe always correspond neatly tothe layer in question. Nonetheless the OSI stack provides a useful jumping-off point.Here are the modified seven layers, with some functional examples:Layer 7 -- Content (such as video)Layer 6 – Applications (such as email)Layer 5 – Session (such as HTTP)Layer 4 – Transport (such as TCP)Layer 3 – Network (such as IP)Layer 2 – Data Link (such as DOCSIS)Layer 1 – Physical (such as fiber)Again, one can quibble as an engineering matter over which particular network protocolfunctions at which of the particular layers, but the fundamentals are sound enough.247Moreover, the foundation of the seven-layers framework is the reality of the market wehave today, rather than some artificial construct found on a chalkboard. Modularitycomports with the technology-based ecosystem at the heart of the Internet. Theinteractions of market players obviously are shaped and heavily influenced by the designof the architectural structure within which they exist. A seven-layers model helpshighlight the technical and economic interdependence of different Net-based activities. 2. The Middle Layers: Defining the logical layers functionsThe Internet‘s design attributes run all through it; indeed, they essentially define the Net.Yet they originate in particular in one specific set of layers in the middle – what could becalled the ―logical layers.‖ For purposes of this paper, the focus will be on the basic246 Mirroring our earlier discussion of Susan Crawford‘s ―social relationships‖ level, Frischmann would adda fifth ―social layer‖ to this framing, comprised of social networks and activities. FRISCHMANN,INFRASTRUCTURE, AT 319. While I sympathize with the interest in highlighting this largely beneficialaspect of Internet-based life, I resist the impulse to turn it into a wholly separate layer in the model. To betrue to the functional analysis we should endeavor to focus on the specific operational elements of thenetwork, rather than the emergent phenomena that is enabled as a result.247 In a private conversation, Vint Cerf points out that the modular nature of the Internet actually masks afractal element, in which each layer contains sub-layers. HTTP can be considered a form of transportprotocol operating over TCP, while IP can be layered over other network services that themselves dorouting and ride on top of link layers. These fractal elements do not invalidate the layers design attributebut emphasize that these attributes are based on abstractions. 36
Internet addressing and routing functions that operate in these levels of the network, and on government actions that threaten to disrupt their workings. Using layers as the framing mechanism for these specific functions should not detract from the larger arguments here. The point is not to be wedded too tightly to modularity, or to any particular modularity model, but instead to focus on the actual functions themselves in the network, and how they have been derived.

As mentioned previously, I have adopted for present purposes a seven-layer modified OSI stack. It makes most sense to further divide those seven layers into three discrete groupings of protocols: the Upper Layers, the Middle Layers, and the Lower Layers. Roughly speaking, the logical functions of the network reside in the "Middle Layers," with representative examples of the software protocols that reside there:

-- Layer 5: session (HTTP, DNS)
-- Layer 4: transport (TCP)
-- Layer 3: network (IP)

For the most part the logical layers functions constitute the loci of the basic design principles and attributes. All four of the design attributes run through, and help define, these layers. There could be no end-to-end functionality, no interconnected networks, and of course no bearer protocols, without the logical layers.

By contrast, the Lower Layers functions are the world of telecom networks and standards. Layer 2 defines the various communications standards, protocols, and interfaces, such as Ethernet, WiFi, DSL, and DOCSIS. Many of these relate to the last-mile physical networks used to access the Internet. Layer 1 is the physical infrastructure itself. There may be legitimate concerns about how those Lower Layers environments may be affected by government policy,248 particularly to the extent that control over these Lower Layers functions can affect the latitude and functionality of the higher layers. However, the actual operation of the physical networks is not equivalent to the Internet's design attributes and operations, and thus is not germane to our analysis here.

Upper Layers functions reside in Layers 6 and 7. Layer 6 is the world of end user applications, where people can fashion and attach software to the network for others to share and utilize, while Layer 7 is all the content and services generated by these interactions. This is the point where the network essentially turns inside out, and the software – from browsers to search engines to email clients – is now exposed and accessible to all. Again, while I analyze these Upper Layers activities elsewhere,249 these particular functions are not relevant to the thrust of this paper.

Others come up with a slightly different division of network functions between the layers.250

248 See Whitt, Broadband Policy.
249 See Whitt and Schultze, Emergence Economics; Whitt, Adaptive Policymaking.
250 For example, Scott Jordan divides up the OSI stack into two sub-groupings, comprised of Layers 1-3 (functionality within the local access networks) and Layers 4-7 (functionality provided in end points outside the access networks). Scott Jordan, "Implications of Internet Architecture on Net Neutrality,"
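To make the Middle Layers grouping more concrete, the following short sketch (in Python, purely for illustration) models how the session, transport, and network protocols just described successively wrap a single web request. The field names and values here are simplified and hypothetical rather than drawn from any protocol specification, but the nesting captures the modular design at work.

# Illustrative sketch only: a toy model of Middle Layers encapsulation.
# Header fields are simplified and hypothetical.

from dataclasses import dataclass

@dataclass
class HttpRequest:   # Layer 5 (session): what the application is asking for
    method: str
    path: str
    host: str

@dataclass
class TcpSegment:    # Layer 4 (transport): reliable delivery between ports
    src_port: int
    dst_port: int
    payload: HttpRequest

@dataclass
class IpPacket:      # Layer 3 (network): best-effort routing between addresses
    src_addr: str
    dst_addr: str
    payload: TcpSegment

request = HttpRequest("GET", "/index.html", "example.org")
segment = TcpSegment(src_port=49152, dst_port=80, payload=request)
packet = IpPacket(src_addr="192.0.2.10", dst_addr="198.51.100.7", payload=segment)

print(packet.dst_addr)              # all a router ever needs to see
print(packet.payload.dst_port)      # examined only by the end hosts
print(packet.payload.payload.path)  # examined only by the receiving application

The point of the sketch is that each layer treats what it carries as opaque payload: a router needs only the addresses, the end hosts need only the ports, and only the receiving application ever looks at the request itself.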
Kevin Werbach talks about the logical layer as that place in the network that ensures that the right bits get to the right place, the point of demarcation between software that talks to the network and software that talks to users.251 I agree, and see the logical layers as those Middle Layers functions – Layers 3, 4, and 5 – which face inward, toward the other parts of the network. In most cases this is the combination of IP/TCP/HTTP protocols that makes up the Web. Werbach insists that this "glue" that holds the Internet/Web together constitutes "the most crucial points in the communications stack for purposes of public policy."252

B. Introducing The Internet Policy Principle

1. From layers model to layers principle

The chief idea behind the layers model is to allow for a more concrete analysis of policy issues, by placing each function at a proper layer of the network, and providing a correct focus on the relevant operation of the Internet. The layers metaphor is intended to provide a flexible conceptual framework, a visual map, to guide decisionmaking.

Professor Lawrence Solum was the first to take the layers model and apply it to concrete issues in public policy. He fashioned the concept of the "layers principle," which amounts to the general exhortation to "respect the integrity of the layers."253 Solum's layers principle can be defined by the following statement: "Public Internet regulators should not adopt legal regulations of the Internet (including statutes, regulations, common law rules, or interpretations of any of these) that violate the integrity of the [layered nature of Internet architecture], absent a compelling regulatory interest and consideration of layer-respecting alternatives."254

Professor Solum describes two interrelated corollaries that support his layers principle. The corollary of layers separation states that regulation should not violate or compromise the separation between the Internet's layers. This means that one network layer should not differentiate the handling of data on the basis of information available only at another layer, absent a compelling regulatory interest. Solum observes that the Net's transparency, a product of layers separation, is the key feature that enables low-cost innovation.255 Thus, "the fact that layer violating regulations inherently damage the transparency of the Internet, combined with the fact that Internet transparency lowers the barriers to innovation, provides compelling support for the principle of layer separation."256

ACM Transactions on Internet Technology, Vol. 9, No. 2, 2009, at 5:16. Jordan's dichotomy demonstrates the crucial bridging function provided by interfacing with the Internet Protocol at Layer 3, at least when analyzing network neutrality issues.
251 Werbach, Breaking the Ice, at __.
252 Werbach, Breaking the Ice, at __. Werbach says that policymakers should police the logical layer as a competitive boundary, but not delve deeply into the logical layer and directly organize markets. Id.
253 Solum, at __.
254 Solum, at __.
255 Solum, at 6.
256 Solum, at 26.
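Solum's layers principle can also be restated as a simple decision aid. The sketch below is my own illustration of that restatement, not anything found in Solum's work: it orders the seven layers used in this paper and asks whether a proposed regulation targets the same layer as the problem conduct, and if not, whether the layer crossing can be justified. The function names and inputs are hypothetical conveniences for exposition.

# Illustrative restatement of the layers principle as a decision aid;
# the ordering follows the modified seven-layer stack used in this paper.

LAYERS = ["physical", "data link", "network", "transport",
          "session", "applications", "content"]   # Layer 1 through Layer 7

def layer_distance(problem_layer: str, target_layer: str) -> int:
    """How far a regulation reaches from the layer where the conduct occurs."""
    return abs(LAYERS.index(problem_layer) - LAYERS.index(target_layer))

def respects_layers_principle(problem_layer: str, target_layer: str,
                              compelling_interest: bool,
                              layer_respecting_alternative_exists: bool) -> bool:
    if layer_distance(problem_layer, target_layer) == 0:
        return True   # regulation aimed at the layer where the problem arises
    # A layer-crossing regulation requires a compelling interest and should
    # not ignore an available layer-respecting alternative.
    return compelling_interest and not layer_respecting_alternative_exists

# Example: using network-layer tools (such as IP address blocking) to reach
# a content-layer problem crosses four layers.
print(layer_distance("content", "network"))   # prints 4
print(respects_layers_principle("content", "network",
                                compelling_interest=False,
                                layer_respecting_alternative_exists=True))   # prints False

The corollary discussed next, minimizing layer crossing, maps onto the distance measure: the greater the distance, the poorer the fit between regulatory ends and means.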
The corollary of minimizing layer crossing states that, if compelling regulatory interests require a layer-crossing regulation, that regulation should minimize "the distance between the layer at which the law aims to produce an effect and the layer directly targeted by legal regulation." Here Solum utilizes "the familiar idea of fit between the ends of regulation and the means employed to achieve those ends."257 He observes that "the fact that layer-crossing regulations result in an inherent mismatch between the ends such regulations seek to promote and the means employed implies that layer-crossing regulations suffer from problems of overbreadth and underinclusion. . . ."258 To avoid these problems, policymakers therefore should minimize layer-crossing regulations.259

Identifying both the layer of the problem conduct, and the layer where the proposed regulation would operate, correctly focuses attention on the relevant operational elements of the Internet. In essence, the legal regulation can only be as effective as is permitted by the architecture of the Internet. And, in turn, the nature and limitations of the legal regulation will be determined by the nature of the code being implemented.260 Regulations that fail to respect the integrity of the layers preclude innocent uses of the Internet, cannot achieve their regulatory goals, and threaten the Net's transparency (and consequent ability to serve as the platform for numerous beneficial activities).261 In brief, given the potentially dire consequences of compelling changes in the Net's layered design, policymakers should conform their proposed policy solutions to the network, and not the other way around.

2. Expanding the layers principle to other Internet functions

With all due respect to Solum (and my earlier Horizontal Leap paper), the layers principle is a helpful but inadequate framing mechanism for policymakers. While it has much to recommend it as an organizing abstraction, the modular view of the Internet falls short of the reality of its actual design and operation. The Internet is more complex than just its layered architecture. In many respects the layers are the supporting structure, the scaffolding, through which the other design attributes can be expressed. Nonetheless the layers by themselves do not capture the full essence of the Internet. Fixing solely on how the layers interrelate leaves out important elements of the way that traffic flows, networks interoperate, and protocols interact.262 How the design elements fit together and operate cohesively can be seen as creating a "functional integrity" that describes the user's overall experience of the Internet.

The collective view of the Internet also relies on more than just the layering principle. Looking to all four design attributes is more faithful to the richness and complexity of Internet architecture, and its macro-level incarnations as a GPT, a CAS, and a CPR. Nor

257 Solum, at 5.
258 Solum, at __.
259 Whitt, Horizontal Leap, at 626.
260 Solum, at 28-29.
261 Solum, at 6.
262 Interestingly, Solum's layers principle is concerned primarily with layer-crossing regulations; it does not appear to preclude direct regulation of the logical layers, a key concern raised by this paper.
can the emergent Net benefits and challenges be fully described and understood without reference to the other design attributes. Indeed, one can imagine that tweaking even one of the architectural elements could result in a very different Internet. As Barwolff points out, "in Internet architecture it is futile to insist on the strict universality of any one principle in isolation, without considering the other principles with which it combines to a system of principles that can only be applied with creative judgment to a given purpose."263

Solum himself seemed to understand the need to think beyond the layered nature of the Internet. He notes that the layers principle reconceptualizes the end-to-end principle, yielding a richer and more accurate model of the Internet's fundamental architecture.264 Indeed, the end-to-end principle too does not fully and accurately capture the fundamental relationship between Internet architecture and sound or optimal regulation.265 Instead, he sees the "normative content" of the layers principle as a superset of the normative content of the end-to-end principle.266 He also bemoans the fact that most Internet regulation misses an analysis of the implications of TCP/IP and its central and essential role in the design and the functioning of the Internet.267 Moreover, Solum also points out that the evolution over time of the Internet's architecture did not begin or end with layering. He observes that disparate networks interconnect via packet switching (late 1960s), TCP/IP is the common protocol that ties them together (mid-1970s), layering is the way to separate out functions (late 1970s), and end-to-end is the control structure at the end points (early 1980s).268

So if we accept the premise that our framing device should extend beyond layers, to encompass all four of the Net's chief design attributes, where does that take us? Perhaps Solum again can point us in the right direction. His layers principle, with its emphasis on how potential regulation can adversely affect the different layers, can be extended to the end-to-end principle, interconnectivity, and IP. For example, Solum points out that the greater the number of layers crossed, the worse the regulation; the fewer the layers crossed, the better the regulation.269 We can utilize a similar approach with regard to the other three design attributes. This translates into the principle that the greater the degree that each design attribute is violated, the worse the regulation. Moreover, the greater the number of the four attributes violated, the more harmful the regulation. So, relative harm can be tied to both the degree and the scope of the architectural violations.

3. Sketching out the first dimension of an Internet policy framework

Taking into account the four fundamental design attributes of the Internet allows us to move beyond a fixation on just one element – modularity – to encompass more of the

263 Barwolff, at 134.
264 Solum, at 7.
265 Solum, at 7.
266 Solum, at 7. Solum states that the e2e concept emerged from the layers model as an articulation and abstraction of implicit ideas inherent in the layers model. Solum, at 18.
267 Solum, at 7.
268 Solum, at 25.
269 Solum, at 31.
richness and complexity of the Net's functionality. This new approach to public policy rightly should be seen as the architectural basis – the first dimension – of a new Internet policy framework.

This proposed Internet policy framework would come into play in situations where a policymaker or other similarly-situated entity is considering imposing specific obligations or restrictions on activities that utilize, comprise, or support the Internet. As with the layers principle, precision is what matters; the end goal must be closely aligned to the means employed. Where Middle Layers functions are involved in some way, the policymaker can be said to be potentially affecting the functional integrity of one or more of the four design attributes. Such ill-conceived policy mandates could be destructive of the Net's generative qualities – its functional integrity.

Using the modularity attribute as an example, violations of the proposed principle ordinarily would involve situations where regulation is directed at a lower protocol layer in order to address problems that originate at an upper layer, particularly the content layer. Examples include: (1) the music distribution industry seeking to target the TCP/IP layers to combat peer-to-peer networking; (2) policymakers asserting control over Internet content; and (3) blocking or filtering requirements.270

We can also appropriate Solum's fit thesis, which focuses on "the familiar idea of fit between the ends of regulation and the means employed to achieve those ends."271 With his layers principle, "the fact that layer-crossing regulations result in an inherent mismatch between the ends such regulations seek to promote and the means employed implies that layer-crossing regulations suffer from problems of overbreadth and underinclusion. . . ."272 Such technology strictures often represent a poor fit to the perceived market challenge, threatening to be under-inclusive (and thus not particularly effective) and/or over-inclusive (and thus imposing collateral damage on innocent activities).273 Importantly, whether or not a policymaker cares about harming substantial innocent uses through an overly broad proposed course of action (overinclusive), that same policymaker should care about the lack of effectiveness of that same proposal (underinclusive).

This same concept of overbreadth/underinclusion can be applied as well to the other three design attributes. Generally speaking, the more narrowly the regulation focuses on the actual Internet design function it is attempting to control, the less it will impair other functions, reduce transparency, or cause substantial "innocent use" problems. In the layers model, for example, the severity of the design function violation can be considered greater when the regulation is attempted at a lower layer, in order to address problems at a higher layer.274

270 Whitt, Horizontal Leap, at __.
271 Solum, at 5.
272 Solum, at __.
273 Solum at 5, 52-53.
274 Whitt, Horizontal Leap, at 637-645.
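The degree-and-scope idea can be expressed as a rough checklist across all four design attributes. Again, this is only my own illustrative sketch, not a formula drawn from any cited source: the four questions paraphrase the discussion above, and the output is a presumption to be weighed, not a mechanical test.

# Illustrative checklist for the Internet policy principle: the more design
# attributes a proposal implicates, the stronger the presumption against it.
# The questions paraphrase the text; the scoring is a simplification.

ATTRIBUTE_QUESTIONS = {
    "modularity": "Does it regulate one layer to reach conduct at another layer?",
    "end-to-end": "Does it inject new functions into the core to reach edge conduct?",
    "interconnection": "Does it burden voluntary interconnection among networks?",
    "agnostic bearer protocols": "Does it require differential handling of traffic?",
}

def assess(violations: dict) -> str:
    """violations maps each attribute name to True or False."""
    implicated = [name for name, violated in violations.items() if violated]
    if not implicated:
        return "No functional integrity concern identified."
    return ("Strong presumption against the proposal; it implicates " +
            ", ".join(implicated) + ".")

# Example: a mandate to block certain IP addresses in order to police content
# implicates at least modularity and end-to-end design.
print(assess({
    "modularity": True,
    "end-to-end": True,
    "interconnection": False,
    "agnostic bearer protocols": False,
}))

A fuller treatment would also grade the severity of each violation, echoing the point above that both the degree and the scope of the architectural harm matter.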
From the foregoing discussion, one reasonably can conclude that there should be a strong presumption against the design of function-violating regulations. As with the layers model, this is especially true where such a regulation affects or has the potential to affect a large number of users (such as the nation's largest ISPs or backbone operators, an entire nation or nations, or most available TCP ports).275 Examples of such ill-conceived policy strictures would include cross-layer regulation (turning the Internet upside down), injection of unneeded central functions (turning the Internet inside out), interference with voluntary network interconnections (turning Internet connectivity from open to closed), and skewed packet carriage (turning the Internet's bearer protocols from indifferent to differential). We will address several real-life examples in the last section of the paper.

Thus, as a general matter, policymakers should refrain from addressing a policy concern:

- occurring at one particular layer of the Internet by affecting a separate, unrelated layer;
- occurring at one particular end of the Internet by introducing a separate, unrelated function at the core of the network;
- occurring with one particular network of the Internet by imposing on the connectivity of other networks; or
- occurring with one particular service offering by transforming the bearer protocols into prescriptive, differential protocols.

In short, policymakers should avoid adopting top-down, or inside-out, mandates that violate the functional integrity of the Net.

4. Meeting potential objections

In the next Part I will expand this functional analysis into an actual three-dimensional policy framework by yoking it to possible institutional and organizational tools. Here, though, I want to address head-on some potential objections to relying on the four designated design attributes as the technical basis for a guiding Internet policy framework.

First, some may argue that this new framing of policy issues amounts to a form of Internet exceptionalism, the notion that we must treat the Internet differently than other technology platforms. And given both its exceptional roots and its extraordinary impact on society, I plead guilty – with a caveat. To treat the Internet differently is not the same thing as denying that there should be any regulation of its activities. We are, or should be, well past that stage. As Mueller has observed, the Internet battles of the near past too often devolved into disputes between what he calls "cyber-libertarians" (those who adhere to "naïve technological determinism") and "cyber-conservatives" (those "realists"

275 Whitt, Horizontal Leap, at 645.
who believe, among other things, that the Net is inevitably constrained and dominated by nation-states).276 While admittedly these are two extremes, both unrealistic, the point is that often there was little debating room available in between.

By contrast, I prefer to think that "respecting the functional integrity of the Internet" is an example of adaptive policymaking, a means for taking full account of all salient characteristics of the technology and market being considered. Context is all. After all, most conscientious legislators would not consider regulating roads or electric grids or any other platform/system/resource without carefully considering how it functions, and why. Why should we approach the Internet with any less attention to what matters? If labels are necessary, perhaps that view makes me a "cyber-contextualist."

Second, some might object that the Internet does not always, or even mostly, operate as the four design attributes suggest. For example, the growing inclusion of "cross-layer design" in wireless networks may raise questions about the continuing legitimacy of the classic layering model.277 Christopher Yoo makes a related point when he argues that the Internet has changed tremendously from its early years, to a technology platform that no longer obeys many of the original design attributes.278

The Internet at any moment in time is not simply the sum of four basic design features; numerous additional software elements have been added over the years that in some ways mute their collective power. In any vast engineering enterprise, there are always tradeoffs and caveats and exceptions. Here we are not seeking absolutes or certitudes, but only (and importantly) a presumption against unnecessary government intrusion, and a continuum of possible concerns.

More to the point, many of the so-called exceptions to the Net's design principles, such as the prevalence of content delivery networks, the rise of cloud computing, and the growing "Internet of Things," constitute the addition of network overlays and underlays to the logical layers. The Middle Layers design attributes continue to operate as before, even if modified from below in Lower Layers functions by new types of access and transport networks, and from above in Upper Layers functions by new business models. Cross-layer designs too constitute the "exception that proves the rule," as they normally would exploit rather than replace the existing network modularity. Further, Yoo's larger argument is that the government largely should stay out of the ever-evolving Internet space, a conclusion not inconsistent with the thrust of this paper.

Third, some may argue that a proposed reliance on analyzing the impact on design attributes seeks to freeze the Internet in place, resistant to any further useful changes.

276 MUELLER, NETWORKS AND STATES, at __. Winner defines naïve technological determinism as the idea that "technology develops as the sole result of an internal dynamic and then, unmediated by any other influence, molds society to fit its pattern." Langdon Winner, "Do Artifacts Have Politics?" at 19-20.
277 Vineet Srivastava and Mehul Motani, "Cross-Layer Design: A Survey of the Road Ahead," IEEE Communications Magazine, December 2005.
278 See, e.g., Christopher Yoo, "Network Neutrality or Internet Innovation?" Regulation (Spring 2010) 22 (recent emergence of different interconnection regimes and network topologies, such as secondary peering, multihoming, server farms, and CDNs, differ substantially from traditional Internet architecture).
Some of these critics contend that today's Net requires substantial improvement because it "doesn't respond linearly to increased demand or deployment of bandwidth; is highly vulnerable to attack, abuse, and misuse; demands too much time and skill to administer; is not scalable; and is effectively impossible to upgrade without exceptional effort."279 Obviously technical experts can contest those conclusions. The larger point is that the broad deference suggested here would allow "natural" evolution of the Internet within its normal standards bodies and processes, free from unnecessary and even harmful political prescriptions. The intention is not, as Bennett puts it, to "freeze these descriptions into regulation."280 The Internet's design features obviously are not laws. But they have come, and should continue to come, organically from the bottom up, through decades of rough consensus achieved by expert engineering in open, transparent extra-governmental processes. And they are directly responsible for the rich benefits for users we see today.

A similar complaint is that identifying and giving special status to the Net's logical layers functions somehow amounts to a tech mandate that dictates how businesses using the Net should operate. However, the direction here is for policymakers not to interfere with the current operation of the Net, and the way its design features are articulated. It is not to impose the design attributes as strict requirements on the Net's various stakeholders, including its countless users. The Middle Layers functions may well have a normative purpose in the minds of many Internet engineers in terms of how the Internet is meant to operate, but that philosophy need not extend to other stakeholders. Indeed, individual networks and other entities are free to join, or not, the larger community of the Internet. Moreover, much of the rationale for keeping the Internet "open" is that the various enterprises operating on top of it don't necessarily have to be. There is a basic difference (aside from a single letter) between being "on" the Net, and being "of" the Net. For the most part, people talk about the Internet when they really mean to be describing various activities that take place using the Net as a platform technology. Here, we are concerned instead with those attributes that are "of" the Internet.

Our public policy regime should want to protect and promote the Internet's real normative value. But that does not necessarily mean that the Internet cannot improve over time, evolving to meet new technical and societal challenges. We should not want to foreclose those possibilities. Nor, as the next section should make clear, does it mean that we should reflexively turn to the government as the agent that will actually safeguard the Net's considerable worth to society. Deference to the voluntary standards processes and players of the Internet means allowing its users – network operators and end users alike – to decide what works best for them. In effect this is a version of the "tinker don't tamper" approach of my previous papers, by injecting well-considered technical options into the marketplace that did not exist before. Much like the Net's "Middle Layers" originated and flourished beyond direct government and market control, we should maintain that same hands-off approach for its future evolution.

279 Bennett, "Designed for Change," at 6-7.
280 Bennett, "Designed for Change," at 39.
  • VI. Two Other Dimensions: Matching Code to Rules and PlayersJust as important as what you do is how you do it. In assessing how to approach aperceived policy concern involving the Internet, there is more than just figuring outwhich functional aspect of the network to target. The policymaker also must determinethe institutional tool to utilize, and the organizational entity to carry it out. The Internetpolicy principle introduced in the previous section – respect the functional integrity of theInternet -- goes to the operational analysis necessary to minimize harm to the four designattributes of the Internet. The impact (positive or negative) on the Internet can varygreatly, however, depending on whether and how the proposed government action affectsits modular, end-to-end, interconnected, and agnostic architecture.This section will examine the two remaining dimensions -- institutions and organizations-- necessary to complete a viable Internet policy framework. These elements constitutewhat is used to carry out any functional solution to an Internet-based policy challenge.What increasingly has come to be labeled ―governance,‖ the tool for enabling a certainordering in social relations, is just a blend of different institutions and organizations.Most observers assume that law and regulation are the only tools that a policymaker canwield to rein in unwelcome activities. But the law as created and implemented can be ablunt instrument. In Nissenbaum‘s apt phrase, ―the law plugs holes that technologyleaves open;‖ it ―defines away alternatives.‖281 Fortunately there is far more than thelaw, and law-defining entities. A wide array of choices is available in both areas: theinstitutional overlay (or, which tool should we use, from laws to regulations to standardsto best practices to norms), and the organizational overlay (or, which entity should weuse, from the legislature to government agencies to multistakeholder groups). A. The Institutional Implements: Rules of the GameThere is a wide and often underappreciated range of societal instruments that can beutilized in support of policymaking. Much of human interaction and activity is structuredin terms of overt or implicit rules. Institutions – the rules of the economic game -- createthe incentive structure of commercial relationships. Markets require conscious politicaleffort to foster the trust-generating institutions necessary to make them function at all.282For years the school of New Institutional Economics (NIE) has been wrestling withquestions about the appropriate institutions for a market economy. Neoclassicaleconomics generally was dismissive of institutions, and has lacked empirical data abouttheir role. The fundamental tenets of NIE are that institutions matter, and can beanalyzed by tools of economic theory. In particular, institutions, from law and contractsto norms and behavior codes, help reduce information uncertainty and transaction costs.Different institutional arrangements also lead to different trajectories, and different281 Nissenbaum, From Preemption to Circumvention, at 1385.282 Whitt, Adaptive Policymaking, at 514-526. 45
  • combinations of static and dynamic performance characteristics -- including differentialprices, diversity of services, rate of new service introduction, and ubiquity of access toservices and content. A gamut of institutional choices differs by degree of coercion,flexibility, accountability, and formality. For example, social control often can beachieved through more informal, decentralized systems of consensus and cooperation,such as norms, rather than through laws.283 B. The Organizational Entities: Players of the GameIn addition to institutions (what economists consider the rules of the political/economicgame), we also have the various entities that actually play the game. Organizations aregroups of individuals bound together by a common purpose to achieve certain agendas.They comprise a special kind of institution, with additional features including criteria toestablish their boundaries, principles of sovereignty, and chains of command. In additionto government actors, they include political, social, and educational bodies, likecorporations, political parties, law firms, trade unions, and universities. Much likeinstitutions, organizations run the gamut from formal to informal, accountable to non-accountable, fixed to flexible.284Typical political bodies in a system like the United States would include the executivebranch, the legislative branch, the judicial branch, and the ―fourth branch‖ of independentagencies.285 Corporations also are part of this organizational ecosystem,286 along withmany other types of entities like non-governmental organizations (NGOs).Each organization is its own complex adaptive system, which among other things meanswe should look beneath the surface to recognize the actions of the disparate playerswithin. The treatment of an organization as a social actor should not ignore the potentialconflict within the organization. Moreover, organizational perspectives dictate how onelooks at a policy issue. Whether you are a corporate CEO, a public interest advocate, apolitical appointee chosen to run a government agency, or a career bureaucrat in thatsame agency, what you see depends on where you stand. C. The Basics of Multistakeholder GovernanceThese following three sections on the right types of institutions and organizationalstructures for the Internet easily could warrant a much more extensive and fulsometreatment. For present purposes, I will provide some basic observations and then sketchout a few possible ways forward. 1. The rise of multistakeholder modelsMultistakeholderism is all the rage in some quarters these days. Many industry sectors283 Whitt, Adaptive Policymaking, at 515.284 Whitt, Adaptive Policymaking, at 526-27.285 Whitt, Adaptive Policymaking, at 527-28.286 Whitt, Adaptive Policymaking, at 528-530. 46
  • have concluded that more traditional forms of regulation should be replaced by a systemthat includes representatives from the affected industry, and other ―stakeholders‖ whostand to gain or lose materially based on the outcome. While the literature is large andgrowing quickly, some have called for a more thorough research agenda on thesemultistakeholder organizations, or MSOs.287Forms of multistakeholder models (MSMs) include self-regulation, where the industryitself takes the lead in devising codes of conduct and best practices meant to govern theway business is done, and co-regulation, where the government is seen as having more orless an equal stake in the process and substantive outcome of the deliberations. In fact, inmany cases, MSMs are initiated or promoted by the government. Marc Berejka for onebelieves that policymakers are just another group of stakeholders that can help to coax orguide along the development of new best practices and norms for the Internet.288The major rationale for adopting an MSM approach is that institutional change is muchslower than technological or economic change, and softer power institutions like MSOsare more decentralized and adaptable.289 Scheuerman has conducted a detailed analysis of―the social acceleration of time,‖290 and shown how the Internet has only exacerbated theproblem for policymakers. Relying on more collaborative administrative regimes withsupportive stakeholders can increase creativity, improve implementation, and heightendemocratic participation.On the other hand, some argue that such projects will lack legitimacy because thestakeholders‘ self-interest undermines collaborative endeavors, compared to a rule-bound, deterrence-based system. The challenge is to balance flexibility and adaptabilityof soft power solutions, with legitimacy and accountability (by both policymakers andeconomic actors), and the potential for enforceability, of hard power solutions. Onetakeaway is that social control often can be achieved through more informal,decentralized systems of consensus and cooperation, rather than classic command-and-control measures.291MSOs do not come to power or accountability easily; usually they must build theirlegitimacy a posteriori, rather than enjoying it a priori.292 A chief concern is to promotebuilding trust among stakeholders, either through formal or informal institutions.293Indeed, ―for any group, capture is what the other groups are guilty of.‖294 That may beone reason why, at its core, the MSM is process, more than ends, driven.295287 Waz and Weiser, Internet Governance, at __.288 Marc Berejka, A Case for Government Promoted Multi-Stakeholderism, Journal on Telecommunicationsand High Technology Law, Vol. 10, Winter 2012: 1.289 Whitt, Adaptive Policymaking, at 524.290 Whitt, Adaptive Policymaking, at 525.291 Whitt, Adaptive Policymaking, at 526.292 De La Chapelle, Multistakeholder Governance, at 24.293 Berjka, Multi-Stakeholderism, at __.294 De la Chapelle, at 23.295 MUELLER, NETWORKS AND STATES (2010), at 264. 47
  • Drake identifies different models of MSM participation along a continuum, from Type 1(being the weakest) to Type 4 (being the strongest). 296 He highlights five problems withmany MSM groups: scope of participation is too narrow, developing world needs moreoutreach, an often yawning gap exists between nominal and effective participation,processes are ―inevitably configured by asymmetries among agents in terms of wealth,power, access to information, connections, and influence,‖ and government commitmentto MSMs appears thin.297 Interestingly, aside from the OECD, there are no signs ofmovement toward increased multistakeholder models of any sort in other relevantintergovernmental organizations. This includes bodies such as WIPO, WTO, the UnitedNations, and the European Union.298 2. A working definition of polycentric groups and practicesPolycentric groups and practices can be seen as a sub-species – in the strongest, ―Type 4‖form -- of multistakeholder models. A polycentric governance system is an arrangementto organize political matters in a way that involves local, national, regional andinternational agents and institutions on equal footing whenever appropriate. Such asystem can be defined as the development and application of shared technical principles,social norms, and rules and practices intended to reach decisions and programs for theevolution and usage of certain resources. All stakeholders have the option to raise theiragenda and discuss urgent issues in shared fora, which are meant to operationalize,structure, and distribute common challenges among the actual decisionmaking andrulesetting institutions.Nobel Prize winner Elinor Ostrom has been a leader in defining and analyzingpolycentric systems of governance. She believes they are one approach that can be usedfor collective-action problems, such as dealing with common pool resources (CPRs).299Such problems typically take the inputs and efforts of many individuals – including―public entrepreneurs‖ -- in order to achieve joint outcomes.300Frischmann says that a commons management strategy is implemented by a variety ofinstitutional forms, which are often mixed (property and regulation, private andcommunal property).301 There are no magic formulas for solving collective-actionproblems, although Ostrom clearly disfavors what she calls ―monocentric hierarchies.‖302The two types of systems typically identified for solving societal problems are the opencompetitive markets for private goods, and centralized, top-down governmentalorganizations. To Ostrom, polycentric governance represents a third and more effectiveway to manage CPRs.303296 William Drake, Multistakeholderism: Internal Limitations and External Limits, MIND, at 68.297 Drake, Multistakeholderism, at 69-71.298 Drake, Multistakeholderism, at 71.299 Ostrom, Polycentric systems as one approach for solving collective-action problems, W08-6, datedSeptember 2, 2008, at 3.300 Ostrom, Polycentric systems, at 1-2.301 FRISCHMANN, INFRASTRUCTURE, at 8.302 Ostrom, Polycentric systems, at 16; 4.303 Ostrom, Polycentric systems, at 1. 48
  • Ostrom‘s research also shows that ―in a self governed, polycentric system, participantsmake many, but not necessarily all, rules that affect the sustainability of the resourcesystem and its use.‖304 The costs of effective self-organization are lower ―when authorityexists to create institutions whose boundaries match the problems faced.‖305 Polycentricsystems may not work if some stakeholders have significant influence, or ―capture.‖Successful CPR systems thus tend to be ―polycentric,‖ with small units nested in layeredsystems.In addition to well-defined boundaries of the resource system, other design principles thatunderlie the most robust polycentric systems include a proportional equivalence betweenbenefits and costs, affected individuals participating in enacting the rules, monitoring,graduated sanctions, conflict resolution mechanisms, minimal recognition of rights, andnested enterprise.306 A variety of governance regimes may achieve sustainability.307 D. Polycentrism in the Internet Space 1. Harnessing rules and players for Middle Layers functionsBack in 1997, the pioneers of the Internet presciently identified a key challenge for itscontinued successful evolution. ―The most pressing question for the future of the Internetis not how the technology will change, but how the process of change and evolution itselfwill be managed.‖308 Even though standards can have public interest effects, they areestablished not by legislatures but by private standards-setting organizations like theIETF.309 This raises many questions about the role of values in protocol design, theprocesses that can bestow legitimacy on these organizations, and government‘sresponsibility to encourage certain kinds of standardization processes.310 As Lessig putsit, ―code writers are increasingly lawmakers.‖311In order to operate the Internet relies on many thousands of technical standards,developed by a diversity of standards organizations.312 There is no ―top-level SteeringGroup‖ to control the Internet; instead there are ―multiple collaborating standards andgovernance bodies that participate in various aspects of the Internet Ecosystem dependingupon their expertise and focus.‖313 Further, the Internet‘s traditional institutions areformal and informal standards and norms, developed by organizations like IETF, andguided by, for example, the Internet Architecture Board (IAB). These standards andnorms have been buttressed, or in some cases thwarted, by laws, treaties, and regulations304 Ostrom (2008), at __.305 Ostrom, Polycentric systems, at 3.306 Ostrom, Polycentric systems, at 12-16.307 Ostrom, Polycentric systems, at 7.308 Leiner et al, The past and future history of the Internet, at 108.309 DiNardis, Internet Governance, at 7.310 DiNardis, Internet Governance, at 7.311 LESSIG, CODE, at 79.312 CDT, ―Voluntary Standards,‖ at 2.313 Letter from Sally Wentworth and Lucy Lynch, ISOC, to Annie Sokal, NIST (undated) at 3. 49
  • of various nations. This multiplicity of institutions and organizations seems to map wellto the polycentric governance principles and ―collective choice arena‖ enunciated byOstrom.As explained earlier, the appropriate way to think about the Internet as a CPR is to focuson the logical layers protocols. And the best way to incorporate Ostrom‘s insights aboutgoverning CPRs is to examine the specific institutions and organizations that match upwith her preferred definition. The ―Ostrom test‖ for whether the Internet‘s Middle Layersfunctions constitute a CPR include: (1) actors who are major users of the resource (2)involved over time in making and adopting rules within collective choice arenasregarding (a) the inclusion or exclusion of participants, (b) appropriation strategies, (c)obligations of participants, (d) monitoring and sanctioning, and (e) conflict resolution.314Initially that fit appears to be a good one.Much like the design principles that define the Net‘s architecture, the institutions andorganizations that have sprung up to help define these attributes have their ownphilosophy of process: decentralized, open, transparent, consensual, and peer-reviewed.This should not be surprising; as Ostrom has documented, the probability is high that asmall community with shared values will develop adequate rules to govern itsrelationships.315 For example, de la Chapelle sees certain common elements in ICANN,and in the Internet Governance Forum (born in 2006), such as openness, transparency,equal footing, bottom-up agenda setting, an iterative consultation process, a governanceworkflow, self-organization, links with legitimating authority, self-improvementprocesses, a pattern of open forum/topic-based working groups/steering groups, and alocal replication ability.316 He also notes the potential implementation pitfalls: ensuringtruly inclusive participation, fighting information overload, synthesizing discussions,preventing capture, composing diversified working groups, creating neutral steeringgroups, reaching closure, and building legitimacy.317A separate question is whether we need guiding principles for Internet governancebodies. Solum believes, for example, that Internet governance institutions should bebiased in favor of maintaining the integrity of the layers.318 However, that view is notuniversal. David Clark and others note that there is a fundamental disagreement whether,going forward, Internet architecture should incorporate inherent values that have beenwidely accepted through societal debate, or be adaptable to a wider range of stakeholdervalues in an evolving societal context.319 They suggest a new ―Internet Science‖ thataccounts for both issues of technical optimization and making choices in the larger legaland social context in which the Internet is embedded.320 David Clark and his various co-authors also have proposed a ―tussle‖ space that would inform both the choice of design314 Ostrom, Polycentric systems, at 8.315 ELINOR OSTROM, UNDERSTANDING INSTITUTIONAL DIVERSITY, at 27.316 De La Chapelle, Multistakeholder Governance, at 19-22.317 De La Chapelle, Multistakeholder Governance, at 22-24.318 Solum, at 50.319 Ian Brown, David Clark, Dirk Trousson, Should Specific Values be Embedded in the InternetArchitecture?, November 30, 2010.320 Brown, Clark, and Trousson, Specific Values, at 6. 50
  • features and principles, and the institutions and organizations selected to host the variousdebates.321 The concept is that the Internet should be modularized along ―tussleboundaries,‖ so that interested stakeholders can debate and decide the best places in thenetwork for control decisions to be made.322Finally, various interested groups also are considering ways that the Internet can and willevolve over time. ISOC, for example, has discussed four different scenarios that coulddevelop over the next decade (Porous Garden, Moats and Drawbridges, Common Pool,and Boutique Networks), based on whether the future is more generative or reductive,command and control or decentralized and distributed.323 A variety of researchers havebeen debating whether the Net should continue evolving through incremental changes, orwhether a ―clean-slate approach‖ is preferable using a new design architecture.324The important point is that these types of conversations originate and can be settledwithin the right community of stakeholders, and not imposed from outside, or above. Inthe Internet world, engineers have good reason to believe that they possess morelegitimacy than governments to make basic technical decisions about how the Netoperates.325 2. IETF as polycentric role modelSince its inception the IETF has operated at the center of the standards developmentcommunity for the Internet. The IETF has standardized, among hundreds of otherprotocols, IP, TCP, DNS, HTTP, and BGP.326 The organization has succeeded in gainingwidespread adoption for its specifications, based on a strong set of social norms and aneffective procedural regime for developing standards.327 Although each standardsdevelopment mechanism has its positive and negative attributes, Werbach finds that ―theIETF is rare in its ability to function so effectively despite its radically decentralized andopen structure.‖328Rough consensus and running code have been the foundational principles of the IETF‘s321 Marjorie Blumenthal and Clark, Rethinking the design of the Internet: The End-to-End Arguments vs.the Brave New World (2001), at 99 (―There is a continuing tussle between those who would imposecontrols and those who would evade them…. This pattern suggests that the balance of power among theplayers is not a winner-take-all outcome, but an evolving balance.‖); David Clark, Karen Sollis, JohnWroclawski, and Robert Braden, Tussle in Cyberspace: Defining Tomorrow’s Internet (2002).322 Brown, Clark, and Trousson, Specific Values, at 4. As one pertinent example the authors cite the need toconduct ―deep control points analysis‖ to address where ultimate network control resides for matters like IPaddress assignment, user-selected routing and DNS resolution, and new information-bearing layerunderlays. Id. at 5. See also David Clark, Control Point Analysis, TPRC Submission, Version 2.2,September 10, 2012 (control point analysis can help us understand the power relationships created byspecific design decisions surrounding the Internet).323 Internet Society, Internet Futures Scenarios (October 6, 2009).324 Doria, Study Report, at 35-58.325 Werbach, Higher Standards, at 200.326 CDT, ―Voluntary Standards,‖ at 2.327 Werbach, Higher Standards, at 200.328 Werbach, Higher Standards, at 193. 51
  • work since at least 1992.329 The IETF itself ―has found that the process works best whenfocused around people, rather than around organizations, companies, governments, orinterest groups.‖330 Historically this cooperative approach was used by the engineers whobuilt the ARPANET, the Packet Radio and Packet Satellite networks, and the Internet,who worked on open mailing lists, and swiftly took technical decisions based onpragmatism and the best argument presented. All genuine Internet institutions like theIETF have developed a track record that proves the effectiveness of this open cooperativeapproach for technical matters.Some argue that the RFC process mirrors the peer review process in academia, and thisshould be considered the equivalent of scholarly publication.331 In fact, the process fordrafting, sharing, and adopting RFCs is ―tough and effective,‖ with more peer reviewersthan any common scholarly publication system,332 ―but with additional emphasis onopenness and transparency.‖333 There should be little surprise that ―the RFC process is amodel that has been taken up for use by decision-makers working on other large-scalesociotechnical systems.‖334Of course, like any entity comprised of human beings, IETF has its flaws. With thesuccess of the Internet has come a proliferation of stakeholders, ―now with an economicas well as an intellectual investment in the network.‖335 One obvious question that arisesis the extent to which corporate/commercial interests have begun influencing the IETF‘swork, and whether or not this is a bad thing. Controlling a standard is competitivelyvaluable, so firms can be expected to engage with standards bodies in ways calculated toserve their own interests.336 Standards wars can be controlled by large corporateinterests,337 and the processes themselves contested by the participants.338 Someparticipants in its working groups have complained, or even quit, following clashes withothers purportedly representing less innovative commercial interests.339At the same time, standards bodies typically have no enforcement authority. 340 Rather, ―itis the nature of the Internet itself as an interoperable network of networks that enforces329 Barwolff, at 139.330 RFC 3935, at 3.331 Brian E. Carpenter and Craig Partridge, Internet Requests for Comments (RFCs) as ScholarlyPublications, ACM SIGCOMM Computer Communications Review, Vol. 40, No. 1 (January 2010) 31.332 Carpenter and Partridge, RFCs, at 31.333 Carpenter and Partridge, RFCs, at 32.334 Sandra Braman, The Framing Years: Policy Fundamentals in the Internet Design Process, 1969-1979,presented at TPRC, October 2010, at 2.335 Leiner et al, The past and future history of the Internet, at 108.336 Werbach, Higher Standards, at 199.337 INTEROP, at 164-165.338 INTEROP, at 165-166.339 See, e.g., Eran Hammer, ―OAuth 2.0 and the Road to Hell,‖ hueniverse (posted July 26, 2012) (―At thecore of the problem is the strong and unbridgeable conflict between the web and the enterprise worlds.‖Most of the individuals who participate ―show up to serve their corporate overlords, and it‘s practicallyimpossible for the rest of us to compete.‖).340 Bernbom, Analyzing the Internet as a CPR, at 15. 52
  • compliance with standard protocols.‖341 In other words, the true source of the ―war‖ maylay outside the Middle Layers routing and addressing functions, in the marketplace itself.―There is a potential conflict for these corporations between cooperation (using standardprotocols to interoperate with the rest of the Internet) and competition (employingproprietary technology to gain market advantage).‖342 Bernbom believes their influence―is less through their participation in standards-making bodies‖ and more through theproduct-oriented decisions they make about what protocols to use and what standards tofollow in the design of network hardware and software.343Past history yields examples of how major economic consequences can flow fromindustry‘s selection of a particular Web standard.344 In 1997, RFC 2109 was adopted,which states that a cookie generated by a particular website can only be retrieved by thatwebsite. Nonetheless, DoubleClick‘s workaround project created a ―third party‖ cookiethat can follow users from site to site. That approach was blessed by RFC 2965, and withthe support of the advertising industry won the day. On the other hand, the use ofNetwork Address Translation (NATs) in broadband networks has flourished, despite thefact that it has never been formally adopted as a standard. Indeed, the IETF has treatedNATs as a violation of the Internet‘s end-to-end architecture, perpetrated by industry.345One also can argue that open processes like those employed by the IETF actually areclosed to most people, even those with significant interest in the proceedings, given thecost in time and money to participate. Many Internet governance debates and issues arenot visible to the public.346 Froomkin believes that the power to select or enforcecommon standards is distributed extremely unequally.347 Some also cite the need for―deepening democracy‖ so that civil society can fully participate in Internet governanceprocesses and decision-making; the key for them is realistic accessibility.348 On the otherhand, there are very effective IETF participants who never come to the face-to-facemeetings, and participate instead through the online processes. E. A Modest Proposal: Modular GovernanceThe notion that ―institutions matter‖ is particularly relevant to CPR goods, where classicmarket mechanisms cannot function in efficient ways.349 This suggests that neithergovernments (political power) nor markets (economic power) are the best types ofexclusive arrangements for Internet governance issues involving Middle Layers341 Bernbom, Analyzing the Internet as a CPR, at 15.342 Bernbom, Analyzing the Internet as a CPR, at 14.343 Bernbom, Analyzing the Internet as a CPR, at 14.344 Nissenbaum, From Preemption to Circumvention, at 1381-82.345 Doria, Study Report, at 14. See e.g. RFC 2993 (NAT has negative architectural implications);DeNardis, ―Internet Governance,‖ at 14 (The e-2-e principle has waned over the years with the introductionof NATs, firewalls, and other intermediaries.).346 DeNardis, ―Internet Governance,‖ at 17.347 Froomkin (2010), at 232.348 Anriette Esterhuysen, A Long Way to Go, Civil Society Participation in Internet Governance, MIND, at54.349 Hofmokl, at 228. 53
  • addressing and routing functions. Obviously neither source of authority and influencecan be successfully eluded, but a third approach is possible, imbued with the designprinciples and processes and bodies that can optimize the Net‘s enabling power.I have a few suggestions on how to incorporate multistakeholder models and polycentricgovernance into the Internet space.First, applying the Internet policy principle to the logical layers functions brings a clearimplication: national and international political bodies should defer to the existing andevolving community entities and processes already at work in the Net space. Asindicated earlier, maximum deference should go to the Middle Layers functions, the pure―public commons‖ that constitutes the Internet‘s core attributes. That technicalcommunity is actively engaged – via a significant degree of transparency, participation,expertise, and accountability -- in figuring out where and how to draw the line betweenthe way the Net has been operating, and where it will go from here. As Werbach puts it,―there is simply no reason for government to interfere with this system. In fact, to do sowould risk destabilizing the Internet industry.‖350 In Benkler‘s apt phrase, policymakersshould practice ―regulatory abstinence,‖351 as part of an international ―keep out‖ zone.Second, we must recognize that the Internet is not just global, but also national andregional and local.352 We could consider recognizing a more explicit system of nationaloverlays and underlays (aimed respectively at upper and lower layers activities) toexisting global design (logical layers functions). In the United States, for example, theFCC could continue to serve as the lead agency for the physical infrastructure layers ofaccess networks, while the Federal Trade Commission would continue in its current roleas supervising activities in the content and applications worlds.353 This approach wouldhelp protect the integrity of the Middle Layers, free from government interference.One possible way to approach this institutional underlay/overlay challenge is to createwhat Crepin-Leblond calls a ―merged hybrid system,‖ where a bottom-up process forInternet governance interfaces with the more rigid top-down decision-making processesof national governments.354 ―Rather than risking a head-on collision, such dialoguewould encourage those involved to focus on reducing friction by defending the line thatboth groups are acting in the ‗public interest.‘‖355 The differences between the two sidescould be seen as more like a semi-permeable boundary than an actual fixed separation.The multistakeholder process can shine a light of transparency on government processes,forcing accountability, while the government can provide ―catalysis, reach, competence,350 Werbach, Higher Standards, at 200.351 BENKLER, WEALTH OF NETWORKS, at 393.352 RICHARD COLLINS, THREE MYTHS OF INTERNET GOVERNANCE (2009), at 9-10.353 Werbach believes that the FCC should recast itself as a standardization organization in virtuallyeverything it does. Werbach, Higher Standards, at 206. Certainly the FCC can and should retrench to amore limited but deep role of overseeing network infrastructure, conducting conflict adjudication, andincenting industry standards.354 Olivier M.J. Crepin-Leblond, Bottom Up Vs. Top Down: ICANN’s At Large in Internet Governance,MIND, 60, at 62.355 Crespin-Leblond, Bottom Up Vs. Top Down, at 62. 54
  • support, and a framework for the bottom-up multistakeholder process to thrive in.‖356Perhaps we should look for multilateral recognition of ISOC and ICANN as legitimateinternational bodies, along with the IAB, IETF, and others.Third, it may make sense to begin talking about the best way for the U.S. Government, asa national political body, to oversee – but not interfere with -- the polycentric governanceprocess and output for the logical layers of the Internet. In essence, democraticgovernment can help set the overarching social goals for society, while the ―technocracy‖of the Net sector can devise the technical means. One potential nominee for thisgovernmental role is the National Institute of Standards and Technology (NIST), anagency of the Department of Commerce. NIST already has been involved in establishingrecommendations for elements of Internet policy, at least from the U.S. perspective.357Perhaps the first step would be to shift the existing Department of Commerce role in theICANN agreement to NIST. This could be a possible prelude to allowing ICANNeventually to function more on its own as a polycentric governance body.VII. Putting It All Together: Examples of Internet Policy in Three DimensionsMost fashioners of regulation seek to discipline the behavior of certain entities, in orderto make them do what otherwise they would refrain from doing, or refrain from whatotherwise they would prefer to do.358 As we have seen, the forms and devisers ofregulation of economic and other activities range widely, from direct mandates throughgovernment intervention to a host of ―softer‖ institutions and organizations. This isespecially the case in the Internet space.Zittrain memorably said that silver bullets belong to the realm of the appliance. 359 Manyregulatory interventions by government in the Internet space tend to be either under- orover-inclusive, and traditional prescriptive regulation often can lead to both outcomessimultaneously.360 In the case of attempts to discipline the market practices of broadbandproviders, for example, prescriptive regulation risks both over-regulating legitimatenetwork functions, and under-regulating undesirable behavior.361 The risks actually aremore pronounced in the larger Internet arena constituted by the Middle Layers. Solumpointed this out in his own work, observing that the challenge to would-be regulators is todevise solutions that avoid being under-inclusive or over-inclusive, do not lead to356 Crespin-Leblond, Bottom Up Vs. Top Down, at 62. This approach may work well, for example, whenconfronting the reality that the United States continues to be involved in at least one logical layers function,the Domain Name System, through its contractual agreement with ICANN. See COLLINS, THREEMYTHS, at 160-61 (describing clash between US control over ICANN and other nations‘ claims forauthority over the Internet).357 See, e.g., Department of Commerce, National Institute of Standards and Technology, Recommendationsfor Establishing An Identity Ecosystem Governance Structure, February 2012 (report recommends a privatesector-led governance framework for an ―Identity Ecosystem‖).358 Whitt, Adaptive Policymaking, at 509.359 ZITTRAIN, FUTURE OF THE INTERNET, at 152.360 Whitt, Broadband Policy, at 469; see also ZITTRAIN, FUTURE OF THE INTERNET, at 150.361 Whitt, Broadband Policy, at 511. 55
According to Kingdon, ideas tend to float freely through the "policy primeval soup," waiting for an opportunity to become relevant as possible solutions to a problem. He goes on to explain that "sometimes ideas fail to surface in a political community, not because people are opposed to them, not because the ideas are incompatible with prevailing ideological currents, but because people simply found the subjects intellectually boring."363 We need to take that challenge head on. Below we will briefly examine a few examples where we can benefit from fitting a three-dimensional approach to a pressing policy concern that affects the Net's operations.

A. Copyright Protection over Rogue Websites: SOPA/PIPA

As mentioned in Part I, Congress by early 2012 was on the verge of passing the SOPA and PIPA bills. The Internet Blackout Day had the desired effect of putting off further consideration of the legislation, although the political process never yielded a satisfactory debate over the technical aspects of the proffered solution, or any viable alternatives.

Invoking the Internet policy principle, both SOPA and PIPA suffered from the same common defect: seeking to regulate IP layer attributes (Middle Layers functions) in order to address content layer concerns (Upper Layers activities). Attempting to map IP addresses to content layer information is a classic layers-violation, one that is both over-inclusive (by harming substantial innocent uses) and under-inclusive (by failing to address technically-feasible alternatives).364 One also can see the proposed action as a violation of end-to-end design, as it would compel new intelligence to be placed in the core of the network, to the detriment of some user activities at the outer edges. In addition, the drive to greater voluntary connectivity within the Internet could be dampened. Finally, IP's "best effort" delivery mechanism could be thwarted by hindering the carriage of certain forms of content over other forms.

Interestingly, the short letter to Congress from the Internet engineers referenced earlier made very similar points.365 The engineers agreed that DNS filtering and other functional mandates involve highly problematic targeting of network functions. They also concurred that the bills would be both ineffective (because many work-arounds are possible) and over-inclusive (because many legitimate uses and users would be adversely affected).366

In other words, Congress was considering legislation that, while laudable in its goals, was aimed at the wrong functional target: the DNS and other logical/applications layers operations. The legislation suffered from a lack of fitness. Nefarious practices by scofflaws would go unhindered and unpunished, while substantial innocent uses would suffer. Moreover, what most users might define as "most fit" on the Internet -- the short and long tails of Internet-based activity -- also would be degraded. Other engineering experts chimed in to readily identify both the under-inclusion and over-inclusion risks from the PIPA legislation.367

363 KINGDON, AGENDAS, at 127.
364 Solum and Chung, at 641.
365 Vint Cerf, Paul Vixie, Tony Li et al., An Open Letter from Internet Engineers to the United States Congress, December 15, 2011.
366 An Open Letter, at 1.
367 Steve Crocker, David Dagon, Dan Kaminsky, Danny McPherson, and Paul Vixie, Security and Other Technical Concerns Raised by the DNS Filtering Requirements in the Protect IP Bill, May 2011, at 10-13. See also Allan A. Friedman, Cybersecurity in the Balance: Weighing the Risks of the PROTECT IP Act and the Stop Online Piracy Act, The Brookings Institution, November 15, 2011 (the legislation's attempt to "execute policy through the Internet architecture" creates very real threats to cybersecurity, by both harming legitimate security measures (over-inclusive) and missing many potential work-arounds (under-inclusive)).
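The engineers' under-inclusiveness point is easy to make concrete. The sketch below is purely illustrative (my own, not drawn from the engineers' letter); it uses the dnspython library, and the domain name and resolver address are placeholders. A user whose ISP resolver is ordered to filter a domain can simply query a non-filtering resolver elsewhere on the network, or skip name resolution entirely by connecting to a known IP address.

    import socket
    import dns.resolver  # dnspython; assumed available for this illustrative sketch

    BLOCKED_DOMAIN = "rogue-site.example"   # hypothetical domain subject to a filtering order
    OUTSIDE_RESOLVER = "9.9.9.9"            # any public resolver beyond the ISP's control

    def resolve_via(resolver_ip, name):
        """Ask one specific DNS resolver for a name's A records."""
        resolver = dns.resolver.Resolver(configure=False)
        resolver.nameservers = [resolver_ip]
        return [rr.address for rr in resolver.resolve(name, "A")]

    # Even if the ISP's default resolver refuses to answer for the blocked name,
    # nothing in the network stops the same query from being sent elsewhere.
    addresses = resolve_via(OUTSIDE_RESOLVER, BLOCKED_DOMAIN)

    # A second work-around: bypass the DNS altogether and connect to a known address.
    connection = socket.create_connection((addresses[0], 80), timeout=5)
    connection.close()

Because the filtering occurs only at a resolver the user is free to ignore, the mandate misses determined infringers while still burdening the operators and ordinary users who rely on the default path.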
Some in the Internet community endorsed a different policy solution: the so-called "follow the money" approach. This proposal was contained in an alternative "OPEN" bill, which discarded the functional elements of the SOPA and PIPA bills.368 In addition to moving enforcement to the International Trade Commission (ITC), the bill zeroed in on disrupting the chief incentive -- the lucrative nature of stealing and selling someone else's content -- by empowering financial institutions to deny access to funding and credit.369 The chief virtue of such an approach is that it shifts the focus from the logical layers to the more appropriate content and applications layers, thereby avoiding many of the undertargeting and overtargeting technical concerns raised by the Net engineers.

While many in the Net community applauded the OPEN bill as clearly superior to the SOPA/PIPA model, Yochai Benkler earlier had challenged this type of approach as a dangerous path that could lack due process.370 Whatever its specific merits, however, the "follow the money" scheme at least begins to move the dialogue to the right place in the network. It constitutes the right type of solution, even if perhaps not the right solution.

Beyond the technical concerns with SOPA and PIPA, the institutional device being employed -- a federal statute -- and the organizational bodies being empowered -- the federal courts -- constitute the more traditional top-down ways that policymakers think about addressing perceived problems. One can question whether the fit of players and rules here, too, is optimal for the highly technical elements of the Internet at issue, which operate within evolving and fast-moving standards-setting processes.

368 The Online Protection and Enforcement of Digital Trade Act (OPEN), introduced December 17, 2011, by Senator Ron Wyden, and January 18, 2012, by Congressman Darrell Issa.
369 A similar approach is employed in the Unlawful Internet Gambling Enforcement Act of 2006, in which US-based banks and credit card companies are prohibited from processing payments to online gambling sites.
370 Benkler resists the notion that disrupting payment systems is a good answer; he sees it instead as a new, extralegal path of attack that also targets entire websites rather than specific offending materials. Yochai Benkler, WikiLeaks and the PROTECT IP Act: A New Public-Private Threat to the Internet Commons, Daedalus 140:4 (Fall 2011). He particularly decries the employment of private firms offering critical functionalities to a website -- whether DNS, cloud storage, or payment systems -- to carry out the attacks, outside the constraints of law and the First Amendment.
B. Global Internet Governance(s)

2012 brings a new Internet policy challenge to the forefront: what is the appropriate global organizational body to consider questions of Internet governance? As we have seen, standards bodies like the IETF long have been that kind of organization for a host of Internet protocols, while ICANN presides over the few core functions of the Domain Name System and IP address allocation. Nonetheless the International Telecommunication Union (ITU) now seems poised for the first time to take on self-delegated regulation of Internet-based activities.

For nearly 150 years, the ITU has been a leading global body governing international telecommunications standards and traffic. As an arm of the United Nations, the ITU engages in government-to-government negotiations over how telecom traffic between countries is to be regulated. In particular, the ITU has facilitated spectrum coordination, assisted in standard setting, and played a critical role in encouraging deployment and adoption of communications networks in developing economies.

The International Telecommunications Regulations (ITRs) -- the governing, binding international regulations for circuit-switched telecoms providers -- have not been revised since 1988. In December 2012 the ITU will convene its World Conference on International Telecommunications (WCIT) to consider a number of member-nation proposals for such revisions. As it turns out, a number of those proposals for the first time would put the ITU in the role of overseeing and even regulating international Internet traffic.371 As just one example, countries like China and Russia in the past have recommended an "International Code of Conduct for Information Security," which would make all Internet governance decisions subject to government approval.372 These member-nation proposals are not out of keeping with the ITU's recently-stated ambitions. In 2010, a majority of the ITU's 193 member countries resolved to "increase the role of the ITU in Internet governance so as to ensure maximum benefits to the global community."373

There is no great secret why the ITU is considering taking these steps into Internet governance. The world is moving quickly to packet switching and the Internet Protocol -- in other words, the basic architecture of the Internet. The ITU believes it stands to lose importance as more and more of the world's telecom traffic and infrastructure escapes its jurisdiction.374

As the WCIT looms, the ITU's proposed involvement in the Internet space has been widely decried, on both substantive grounds375 and process grounds.376 ISOC also has weighed in with its concerns.377

371 [cite examples]
372 Doria, Study Report, at 32.
373 ITU Resolution 180 (Guadalajara, 2010).
374 COLLINS, THREE MYTHS, at 161.
375 See, e.g., Brian Pellot, "UN agency threatens internet's future," freespeechdebate.com, July 31, 2012 (proposed ITU regulations could shift the Net's decentralized, bottom-up approach to top-down control by government actors with their own national agendas).
376 Patrick S. Ryan and Jacob Glick, "The ITU Treaty Negotiations: A Call for Openness and Participation," NANOG 55 (June 2012).
377 Internet Society Board of Trustees, ISOC News Release, "ISOC Express Concern about the Potential Impact of the WCIT on the Internet," August 7, 2012.
Certainly the ITU is a problematic enough organization on the process front. As a UN body with a "one country, one vote" structure, limited involvement by entities that are not national governments, little expertise in technical Internet matters, and little transparency in process, the ITU flunks the test for broad-based polycentric governance. Moreover, even the ITU's claim to uniquely represent the nations of the world is belied by the fact that it has more non-voting business members than member states.378

In fact the ITU is a notable example of an international body that has failed to embrace a multistakeholder model.379 Just a few years ago, the body rejected some modest suggestions on ways to increase the ability of outside stakeholders to participate meaningfully in the ITU's processes.380 In the run-up to the WCIT, the ITU has shown some signs of willingness to open its processes to third parties.381 However, those efforts are limited, and have been roundly criticized as insufficient.382 The ITU has been called on to undertake far more significant procedural reforms, including at minimum widely sharing the treaty conference proposals, opening the proposals database for review and comment, and implementing a robust model for multistakeholder participation and representation.383 To date no such reforms have been forthcoming.

The substantive elements also are highly troubling. As a fundamental matter, the vertical silo model of regulation long (and still) championed by the ITU and most telecom regulators clashes directly with the layered, end-to-end nature of the Internet.384 More specifically, certain proposals would make ITU-T recommendations mandatory, thereby supplanting the traditional voluntary multistakeholder bodies for standards development.385 Other proposals would create a new model for Internet interconnection, disrupt existing methods of Internet naming, numbering, and addressing, regulate IP routing, and extend the ITRs to Internet companies.386 The European telecom carriers have submitted their own proposal to expand the ITRs to include Internet connectivity, seeking a "sending party network pays" compensation scheme that essentially levies a content fee on foreign websites.387 In its understated manner, ISOC has "expressed concern" that these and other measures "could have a negative impact on the Internet."388

378 COLLINS, THREE MYTHS, at 178.
379 Drake, Multistakeholderism, at 72.
380 Drake, Multistakeholderism, at 72.
381 ITU Press Release, "Landmark decision by ITU Council on proposal for consultation and open access to key conference document," July 13, 2012. Of course the mere fact that the ITU itself labeled this modest action as "landmark" speaks volumes.
382 See, e.g., Carl Franzen, "U.N. Telecom Agency Releases Secret Treaty, Critics Unswayed," TPM IdeaLab, July 13, 2012.
383 Ryan and Glick, "ITU Treaty Negotiations," at 22.
384 Ryan and Glick, "ITU Treaty Negotiations," at 2. See generally Whitt, A Horizontal Leap.
385 Five of the world's leading Net governance groups -- IETF, W3C, IEEE, ISOC, and IAB -- recently declared their commitment to voluntary standards as part of their affirmation of "OpenStand." CDT, "Voluntary Standards," at 4; see www.open-stand.org.
386 ISOC Press Release, "ISOC Express Concern," at 1.
387 ETNO, CWG-WCIT12, Contribution 109, Council Working Group to Prepare for the 2012 WCIT, June 6, 2012. See also Laura DeNardis, "Governance at the Internet's Core: The Geopolitics of Interconnection and Internet Exchange Points in Emerging Markets," TPRC 2012, August 15, 2012, at 1 (ETNO proposal amounts to a telephony-style content tax).
388 Id.
The WCIT thus seems to violate all three dimensions of the Internet policy framework: the wrong functional/operational targets (Upper Layers activities and Middle Layers protocols), the wrong institutional tool (an international telecom treaty), and the wrong organizational body (the UN's chief telecom regulator). The solution is straightforward: the ITU should stick to what it has been doing with regard to allocating radio spectrum and advancing broadband networks in emerging economies (both classic Lower Layers activities), thereby limiting itself to the boundaries of international telecommunications. The ITU certainly should refrain from getting involved in other domains outside its scope of substantive expertise and legitimacy.

C. Broadband Network Management

Over the past decade, public discussions over open Internet policy have focused on the potential need for laws and/or regulations to discipline the market behavior of broadband network access providers. A significant part of the public policy concern stems from the sizable "spillovers gap" between the commercial value of broadband infrastructure and its overall social value as an optimal platform for accessing the Internet. With privately-owned broadband networks in particular, there is always the possibility of conflicts of interest between the platform owner and the platform users.389

Until recently, few on either side of the debate attempted to raise alternative institutional approaches, including possible involvement of standards and best practices promulgated by expert network engineers. I devoted a paper to this very question, and suggested a variety of "tinkering" measures that could be adopted as part of a national broadband policy.390 While national policymakers should be cautious, I still saw a legitimate place for government to ask questions, monitor markets, and where necessary impose a mix of structural and behavioral solutions to problems in the broadband market.391

One example of multistakeholder thinking is to develop a new type of body staffed by engineers representing the Internet's leading private and public stakeholders. In 2010, Google, Verizon, and other entities founded BITAG (Broadband Internet Technical Advisory Group).392

389 Werbach, Higher Standards, at 185. Van Schewick goes further to claim that broadband providers actually control the evolution of the Internet, and that "it is highly unlikely they will change course without government intervention." VAN SCHEWICK, INTERNET ARCHITECTURE, at Part II, Chapter 2. My broadband paper recognizes the concerns but comes to a slightly different conclusion. Whitt, Broadband Policy, at __.
390 Whitt, Broadband Policy, at 515-532. Rejecting what I called the "prescriptive" approach to broadband regulation, I suggested a number of "adaptive policy projects" focused on different implements, including: feeding the evolutionary algorithm by adding market inputs; fostering agent connectivity by harnessing infrastructure; shaping the fitness landscape by utilizing market incentives; and enhancing feedback mechanisms by creating more transparency and accountability. Id.
391 In this paper I urge outright deference to the Net's working institutions and organizations, due to what I see as the sizable "competency gap" between that world and current political systems. The Net community should be allowed to continue attaining a relative degree of success in minding its own affairs.
392 www.bitag.org.
The entities involved share the general conviction that the sometimes-thorny issues of commercial management of broadband networks are best considered, at least in the first instance, by technical experts working through open membership and transparent processes, rather than by the government. The premise is to address a perceived problem at the right place in the network (as determined by Internet engineers), using the right policy implement (best practices), wielded by the right type of entity (an MSO). For example, the end-to-end design attribute does not preclude making the broadband network more secure by moving certain security functions within the network "core."393

The FCC, to its credit, seems to understand the role that entities like BITAG can play in current broadband policy, and Chairman Genachowski has spoken favorably about it.394 Should the D.C. Circuit reverse the FCC's Open Internet order,395 it will be interesting to see whether and how such types of entities and institutions could help fill the resulting policy gaps.396 The jury is still out on the ultimate success of BITAG, and its working structure and output are akin to those of a typical MSO, rather than a more radically open and decentralized polycentric entity like the IETF. Nonetheless this type of body has the potential to avoid highly detailed technical mandates from regulators, relying instead on the more flexible, adaptable, and inclusive approaches that traditionally have been utilized in the standards world.

Some resist the notion of any significant government role in the broadband space. Jim Cicconi from AT&T has even called out the U.S. Government for what he considers hypocritical behavior: seeking to ward off the ITU from any involvement in the Internet while at the same time embracing FCC regulation of data roaming, open Internet/network neutrality policies, and IP-to-IP interconnection.397 He claims that "there is a principle at stake here. Either the government regulates the Internet, or it doesn't. And the U.S. can't have it both ways.... Either we believe in the principle or we don't."398

Of course, much as online companies often hide behind the "don't regulate the Internet" mantra to avoid government involvement in their commercial activities, the broadband companies have appropriated it for similar reasons. Cicconi's statement betrays some (intentionally?) fuzzy thinking. The broadband companies provide access to the Internet, it is true, but the basis for the FCC's (and ITU's) jurisdiction in this space comes from the Lower Layers functionalities -- the Layer 1 physical infrastructure and Layer 2 datalink protocols that allow networks and devices to connect in the first place. Here, the Layer 3 functionality provided by IP does not equate to the Internet; instead it serves as the ubiquitous bearer platform on top of the broadband network functions, one that can just as easily be used to transmit a proprietary voice service or private data capabilities.

393 FRISCHMANN, INFRASTRUCTURE, at 356; see VAN SCHEWICK, at 366-68.
394 [citation]
395 [citation]
396 Kevin Werbach for one does not believe it is optimal for the FCC to be deeply engaged in the details of broadband network management. Instead, a form of "regulation by standardization" may be a more effective means for the FCC to implement a positive agenda, where the Commission looks for and blesses open standards solutions. Kevin Werbach, Higher Standards: Regulation in the Network Age, Harvard Journal of Law and Technology, Vol. 23, No. 1 (Fall 2009), 179, 212-221.
397 Jim Cicconi, "The IP Platform: Consumers Are Making the Transition ... Can Regulators Do the Same?" Prepared Remarks at the Communication Exchange Committee, June 15, 2012.
398 Cicconi, "IP Platform," at 3.
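To make this layer distinction concrete, the toy sketch below is my own illustration, not drawn from the paper or from any standard; the addresses are placeholders taken from documentation ranges. It models how a Layer 2 access frame simply carries whatever Layer 3 payload is handed to it: the same last-mile facilities within the FCC's traditional reach can carry public Internet traffic, a proprietary voice service, or a private data offering, which is why jurisdiction over the access layers is not the same thing as regulation of "the Internet."

    from dataclasses import dataclass

    @dataclass
    class Datagram:
        """Layer 3: an IP datagram, the general-purpose bearer."""
        src_ip: str
        dst_ip: str
        payload: bytes  # public web traffic, managed VoIP, or a private data service

    @dataclass
    class Frame:
        """Layer 2: a datalink frame on the last-mile access network."""
        src_mac: str
        dst_mac: str
        payload: Datagram

    # The same lower-layer access link can carry very different upper-layer services.
    internet_traffic = Datagram("192.0.2.10", "198.51.100.7", b"HTTP request to a public website")
    managed_voice = Datagram("192.0.2.10", "192.0.2.1", b"proprietary VoIP signaling")

    for datagram in (internet_traffic, managed_voice):
        frame = Frame("aa:bb:cc:dd:ee:ff", "11:22:33:44:55:66", datagram)
        # Oversight of the frame (Layers 1 and 2) is oversight of the access facility;
        # the datagram inside may or may not be public Internet traffic at all.
        print(frame.dst_mac, "carries", datagram.payload.decode())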
Further, the concerns raised in the network neutrality debate about broadband access providers stem from users' necessary reliance on broadband companies, and their control over Layers 1 and 2 functionalities, to reach the Internet. All Internet-based activities uniquely require an unimpeded last-mile connection. Broadband providers deploy communications infrastructure that is relatively scarce, profoundly important, and reliant on public resources. They also occupy a unique role in the network, one that allows them to transport, inspect, manipulate, and apportion capacity for what I have called "other people's packets."399 Network neutrality regulation can be seen as one specific means of policing the Lower Layers communications interfaces to the Middle Layers and Upper Layers functions.400 However, other solutions are possible. For example, once an ISP of its own volition has agreed to join its network to the Internet, and seeks to offer users something called "Internet access," it could be held accountable to the terms of that voluntary agreement under the permissive bailment duty of care derived from the common law.401 Regardless of one's viewpoint on the policy diagnosis, we must understand the pertinent technology function if we have any hope of prescribing (or refraining from prescribing) a contextual solution.

Conclusion

"If the Internet stumbles, it will not be because we lack technology, vision, or motivation but because we cannot set a direction and march collectively into the future."402

Back in 1997, our Internet pioneers had it right all along.

This paper argues that one can successfully fashion a guiding three-dimensional public policy framework for the online age, founded on the basic principle of "respect the functional integrity of the Internet." In particular, this framework should inject considerable caution into the policy-making process, by urging political restraint and even deference. Absent a compelling regulatory interest, and careful consideration of tailored remedies that work in accord with the Internet's basic functions, policymakers should not adopt legal regulations that violate the integrity of the modular, end-to-end, interconnected, and agnostic nature of the Internet. Generally speaking, the more narrowly a regulation focuses on the Internet functions it is attempting to affect, the less it will impair other functions, reduce transparency, or harm substantial innocent uses. By seeking to preserve the functional integrity of the Internet, adaptive policymakers can help preserve the emergence of innovation, economic growth, and other salient Internet effects.

399 Whitt, Broadband Policy, at __.
400 As I have noted elsewhere, network neutrality regulation is not necessarily the best solution to address concerns about broadband provider behavior. Employing an open access model, which would limit potential regulation to the Layer 2/Layer 3 wholesale interface, arguably is a preferable functional solution (although politically untenable at present) and would constitute a better fit under the Solum layering principle. See Whitt, Broadband Policy, at 518-19.
401 Whitt, Broadband Policy, at 504-05. Nonetheless, just because the FCC may possess the legal authority to regulate broadband access providers does not mean that authority should be exercised in the first instance.
402 Leiner et al., The Past and Future History of the Internet, at 108.
Stakeholders also should look to employ the appropriate institutional and organizational overlays to address the perceived policy challenge. At its best, the government's role should be to experiment with the optimal background conditions for a dynamic, unpredictable, and evolving Internet environment -- and then stand back to let the process itself unfold.

Above it all, and returning again to John Kingdon, perhaps the "idea" of the Internet we need to protect is an actual idea: a fundamental agreement about a way of community and sharing and gaining information, one that is more open, more available, than any other medium we have seen before. The most basic idea about the Internet is its protocols -- literally, its agreed way of doing things. As we have seen, the essence of the Internet itself is in its logical layers functions, and they amount to a genuine public good. Policymakers should give way to the comparative wisdom and efficacy of polycentric processes and outcomes.

When advocates plead "don't regulate the Internet," they are glossing over the real points of significance. Regulation of Internet-based activities is more or less a given; regulation of its Middle Layers functions is not. This is not to say we should refrain completely from regulating Lower Layers functions -- the individual physical communications networks and standards -- or Upper Layers functions -- the applications and content and services and activities -- although there may be very sound reasons to be cautious about such regulation.403 However, that caution relates more to the workings of the tangible economic markets in those facilities and goods and services than to the logical essence of the Internet. The Middle Layers are where the design attributes work their unique magic. That is the GPT level, the CAS enabler, the CPR good. It is where multistakeholders can and do operate to best effect, where polycentrism hits its sweet spot.

So, while the corrective of "don't regulate the wrong parts of the Internet" might sound almost quaint these days, we should consider retaining its essence, at least as applied to the core functions of the Net's design architecture. My suggested substitute is to "respect the functional integrity of the Internet." We should give due deference to the Internet's design attributes because we share the unique values of its creators, we laud the (mostly) amazing outcomes, and we honor the half-crazy polycentric processes and players who somehow have kept it going, year after year.

Now, I am aware that this new formulation can be seen as the public policy equivalent of giving way to "rough consensus and running code." Defer to the relative mess of dedicated volunteers and sprawling organizations and multilayered processes? But that's probably as it should be. Obviously political and market realities will intrude into the public goods space of the Internet's logical layers. The U.S. Government still has significant connections to ICANN. Dominant industry players will play games with standards processes. The ITU will try to figure out a way to be relevant in the IP age. More Congressional hearings will be held on building copyright protection into the Net's design architecture.

403 See, e.g., Whitt and Schultze, Emergence Economics, at __; Whitt, Adaptive Policymaking, at __; Whitt, Broadband Policy, at __.
But if Kingdon is correct, ideas truly do mean something in public policy. And perhaps the Kingdonian policy window now has been opened just a crack, with a rare opportunity for all sides to rethink their respective positions. With a lot of effort, and some luck, "respect the functional integrity of the Internet" is an idea whose time in the venerable "garbage can" of politics finally has arrived. And then, as William James reminds us, a political approximation of truth will have happened to a pretty cool idea.