Last year – some 15 months ago – the Federal Communications Commission released its much-awaited National Broadband Plan, our nation's first attempt at a comprehensive blueprint to advance the deployment and uptake of broadband services. While the Plan has been touted in many corners as a major step in the right direction, in reality it is only the first of many steps that will be necessary to actually meet, let alone exceed, its stated aspirations. My talk this morning will focus on where we are now, and (more importantly) where we need to be, in order for broadband to truly become The Dialtone of the 21st Century. I will rely in part on an article I wrote in 2009 on what I call adaptive policymaking, and apply some of its principles to what the FCC has devised.
So before we plunge ahead, first we need to understand how we got to this point.
The FCC didn't come up with the concept of the National Broadband Plan on its own, or even have much say in what it should contain. Congress included the idea, the timing, and the basic outline for the Plan as part of the broadband economic stimulus package approved in early 2009. So while the FCC likely would have been consulted in some fashion, from the very outset the agency took a backseat to what members of Congress wanted to achieve in the midst of an economic crisis.
The NBP is especially challenging given various constraints on the FCC as an effective policymaker during 2009 and 2010. These boil down to: -- the rapid pace of change -- the unique economics of infrastructure -- the urgent need for data -- the lack of adequate time, and -- a steep learning curve for new senior leadership at the agency
Nonetheless, the FCC moved ahead and produced a 360-page report, which you can readily find on the Commission's website, www.fcc.gov. The key findings boil down to a single concept: we have a shortage. A shortage: -- of supply -- of demand -- of spectrum for future wireless broadband platforms -- of "stuff" to take advantage of broadband -- of coordination to support "national purposes"
So the FCC promulgated some key goals for its Plan to achieve. Probably the one objective that grabbed the most headlines was the 100 by 100 by 2020 proposal – 100 Mbps for 100 million homes in the next 10 years. With only 50 MHz of spectrum in the pipeline, the FCC also saw the need to free up far more in the coming years. Getting 300 MHz by 2015 will be especially daunting. Aside from these metrics-based goals, the FCC also pronounced a number of "shoulds" related to broadband deployment, uptake, and national purposes like public safety and energy. The Commission also indicated it would reform the universal service fund (USF) and intercarrier compensation (ICC) regimes in three steps over the next ten years.
Last summer the FCC released its proposed schedule of rulemaking proceedings and inquiries to actually implement the Plan – and it is quite impressive. Some 60 individual proceedings. Several of these already have been adopted. Others are still under consideration. But how should we judge the relative merits of the Plan? What are some ways we should be thinking about its possible success, or potential failure?
I've written several law journal papers which try to explain some ways that policymakers should approach the intersection of law, technology, and economics. One answer is that we need to see the market and government with fresh eyes, not as standalone or even antagonistic entities, but as linked, co-evolving agents in the larger ecosystem. Market and State are conjoined. The workings of the economy rely on the government, and vice versa. The question is not whether government necessarily is part of the market, but what that role should be. The policymaker (legislator, regulator, court) should act as an adaptive agent, sensitive to its own cognitive constraints and the dynamism and unpredictability of the market. Thus, the first principle should be that the policymaker must take great caution in his or her actions in the market.
Based on my review of the economic literature, and the potential roles for policymakers to play in the market, I developed for my paper a list of principles that should guide the way policymakers approach their duties, particularly in a technology-rich space like the Internet and telecommunications. -- Cautious -- Macroscopic -- Incremental -- Experimental -- Contextual -- Flexible -- Provisional -- Accountable -- Sustainable
My paper also draws on some recent work in New Institutional Economics dealing with institutions, which can be considered the "rules of the game." A key takeaway is that there is a wide range of potential institutions available to structure market relationships, typically broader than policymakers ordinarily expect. These range from laws and regulations to bully pulpits, self-regulation, and even social norms. These institutions involve inherent tradeoffs between values like flexibility and accountability, and they can be thought of as occupying a blend of public and private spaces, between governments and markets.
Along with institutions we have organizations: the players in the game. Again, as with institutions, there are a number of potential organizations that can be involved in market processes. For purposes of my paper, I focused largely on the Federal Communications Commission, but other entities, public and private, can be and are involved in the public policy design space.
A Beltway pundit called the National Broadband Plan “a plan to have a plan.” So, what can be said at this relatively early stage in the process? Here are a few ideas.
Communications policymakers face tremendous challenges as arbiters between various powerful factions, operating within the complexity, dynamism, and unpredictability of markets. The FCC's potential vulnerabilities include "regulatory capture," "theory capture," and "informational capture." -- Subject to "regulatory capitalism" (Kennard): incumbents tend to invest heavily in lobbying -- Failure to adapt (still sees the industries in individual silos) -- Use of inflexible and irreversible approaches (such as statutory definitions) -- Forcing new technologies into old paradigms (VoIP) -- Lack of a nuanced approach (still uses black-and-white assumptions about regulate/don't regulate, with over/under managing) -- Lack of accountability (no mechanism to re-assess the market impact of prior decisions) -- Processes: lack of data-driven decisionmaking, and subject to "informational capture" -- Structure of the agency: still reflects siloed policymaking, has insufficient technical and economic assets internally, and no place for timely case-by-case decisions
Now, we already do know some things, because we can appreciate the background against which the FCC was compelled to act. So here are a few early warning signs that the Commission needs to consider as its implementation process unfolds. These all suggest a heightened attention to the adaptive policymaking principles I described earlier. -- Shoot, ready, aim: be cautious and contextual, gathering the necessary data and resources before moving forward on policies -- Alpha or beta: be experimental and provisional, never resting long on any particular assumptions -- Top-down: be incremental and flexible in how the Plan is structured -- Back-loaded: be accountable and sustainable in how the Plan is implemented.
In its filings, Google had proposed to the FCC one way to approach the National Broadband Plan -- as an evolving, iterative process, rather than producing a definitive blueprint for all time. The process should flow naturally from setting objectives, to gathering data, to creating benchmarks, to analyzing available resources, to developing projects, to assessing the impact, and then starting all over again. To its credit, the Plan does adopt this tone in various places. But the real key will be whether this type of framework is actually utilized.
We also need to challenge some of the prevailing assumptions about broadband, several of which amount to misconceptions. First and foremost, broadband is infrastructure for both transportation (of bits) and communications (of people), for conveying content (information) and establishing relationships (interactivity). We value broadband for what it enables, not for what it is. Broadband is not: -- The Internet: to regulate broadband is not to "regulate the Internet." -- Internet access: broadband infrastructure provides access to the Internet as one of its functionalities. -- A content delivery system: one-way video streaming is but one possible functionality; two-way interactivity promises greater benefits. -- A box of widgets: the economics of broadband are different: high upfront fixed costs, network effects, reliance on public inputs and subsidies, and a unique role as communications infrastructure. -- Your vegetables: we all assume that consumers should want a ton of broadband, but that is only an assumption. We need to consider the demand side of the equation just as much.
I will briefly focus on three areas in the NBP: competition, spectrum, and jurisdiction. The FCC found that 96% of households have access to two or fewer wired broadband service providers – at best a duopoly. The FCC says it is "unclear" whether broadband competition exists today, but if so, "it is surely fragile." The agency also posited that under one plausible scenario, the cable companies would become the dominant, or even sole, wireline broadband providers. The FCC also found that wireless may not be an effective substitute in the foreseeable future. Nonetheless, the agency failed to follow through on the evidence it collected. On the video side, the FCC noted that Congress in 1996 directed the agency to foster a competitive market in cable set-top boxes (Section 629 of the Act); despite the AllVid proceeding, we're still waiting.
On the spectrum front, the FCC decided that the country needs 500 MHz of spectrum by 2020. A nice round number, bound to grab some headlines. But exactly where does it come from? Is this more shoot, ready, aim? We need data-driven decisions, not decisions that lack hard supporting information. A comprehensive spectrum inventory would at least give us the denominator of what we have available from current supply. And fixing mechanisms like secondary markets could help facilitate more market transactions. Also, unlicensed spectrum gets some mention in the Plan, but frankly not as much as it deserves. And recent events only confirm that the Commission and others seem much more focused on conducting incentive auctions for licenses than on setting aside anything useful for unlicensed use.
Finally comes the overarching question of whether the FCC even possesses the statutory authority under the Communications Act to carry out much of its Plan. After the Comcast decision was announced last spring, the FCC's General Counsel issued a memo explaining how some of the Plan could be jeopardized if the Commission can't rely on Title II of the Act as the basis for action. The FCC's network neutrality decision last December dodged the question by relying once again on Title I. But is the Plan fated to be picked apart, proceeding by proceeding, based on assertions that the FCC lacks the authority to act? We shall see.
Thank you very much.
The FCC's National Broadband Plan: A Preliminary Critique through the Lens of Adaptive Policymaking
Richard S. Whitt, Director/Managing Counsel, Telecom and Media Policy
Emerging Communications 2011, San Francisco, CA, June 28, 2011
Congress Speaks…
"The national broadband plan … shall seek to ensure that all people of the United States have access to broadband capability and shall establish benchmarks for meeting that goal." By March 17, 2010, the FCC was required to submit to the Senate Commerce Committee and the House Commerce Committee "a report containing a national broadband plan."
Nine Principles of an Adaptive Stance
-- Cautious: humility is essential.
-- Macroscopic: the big picture.
-- Incremental: evolutionary, not revolutionary.
-- Experimental: the necessity for experimentation.
-- Contextual: well-grounded and context-dependent.
-- Flexible: the need for flexibility.
-- Provisional: favor reversibility.
-- Accountable: test, monitor, and honor.
-- Sustainable: politically adoptable and achievable.
Policymakers are beset by powerful influences that favor the status quo over change and progress.
Institutions: Rules of the Game
Constitutions -- Laws -- Regulations -- Policies -- Co-Regulation -- Bully Pulpit -- Self-Regulation -- Codes of Conduct -- Standards -- Norms
These vary in their degree of formality, coercion, accountability, and enforceability.
Organizations: Players of the Game
-- Think of a group of people playing poker.
-- Each player is an entity (corporation, policymaker).
-- Interaction between players and rules shapes institutional change.
The Plan suffers from the "shoot, ready, aim" approach -- ideally we needed the data first, then the policy, and then the money spent. Thanks to Congress and the previous FCC, we got the reverse instead.
"Alpha or beta"?
The FCC needs to be serious about treating the Plan as in perpetual beta, always learning, iterating, and evolving.
The FCC runs the risk of sounding too much like the top-down specialist, rather than employing a bottom-up approach that relies on states and local communities.
The Plan remains largely aspirational; the heavy lift will be in the many proposed implementing rulemakings, which will take many months and even years to resolve.
Optimal Approach: An Evolving Plan
Stage 1: Baseline Assumptions
Stage 2: Overall Objective
Stage 3: Data "Mash-ups"
Stage 4: Metric Screens
Stage 5: Defined Benchmarks
Stage 6: Resource Analysis
Stage 7: Focused Projects
Stage 8: Interim Evaluation
Repeat.
We should aim to facilitate an environment that over time stimulates investment and innovation in -- and usage of -- broadband technologies and applications. The framework should engender a flexible, iterative, and comprehensive process.
Broadband Deconstructed
What it is: "communications/transportation/information/entertainment/interactivity" infrastructure.
What it is not: the Internet; Internet access; a content delivery system; a box of widgets; your vegetables.
Recommendation: allocating 500 MHz by 2020; 300 MHz by 2015
Talk of repacking the TV spectrum, less than a year after the DTV transition
Only 20 MHz (at best) of additional unlicensed spectrum
Reliance on auctions: only benefits the big incumbents?
Does repurposing broadcaster spectrum amount to replacing free TV with pay TV?
More shoot, ready, aim?
Why not first do the full inventory of currently allocated spectrum, so we know exactly what we have and how efficiently it is being used?
Tools like secondary markets and technological sharing measures through underlays/overlays could help alleviate presumed spectrum shortages.
Then we should take an inventory of the total potential available spectrum resources, including government spectrum, broadcaster spectrum, AWS III, etc., and determine how best to reallocate for other purposes.
Does unlicensed get short shrift in the Plan?
The TV White Spaces may be gone if the Rockefeller bill is adopted