Richard Bennett

You Get What You Measure: Internet Performance Measurement as a Policy Tool

  • While broadband network speeds have improved substantially over the last decade, the web’s performance has stagnated from the end user’s point of view.
  • The disconnect between broadband and web speeds suggests that the “virtuous circle” hypothesis advanced by the Federal Communications Commission to justify common carrier Internet regulation is false.
  • A system for capturing passive measurements and sharing them among Internet service providers, web developers, and other responsible parties may be useful for accelerating the web experience; a sketch of such a system follows below.
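
To make that last point concrete, here is a minimal sketch of the kind of passive measurement a browser could report today, assuming the W3C Navigation Timing API that modern browsers already expose. The collection endpoint (https://example.org/collect) and the particular fields chosen are illustrative assumptions, not part of the original proposal.

    // Minimal sketch: passively capture one page load's timing and share it.
    // Assumes a browser environment; the reporting endpoint is hypothetical.
    function reportPageTiming(): void {
      const [nav] = performance.getEntriesByType(
        "navigation"
      ) as PerformanceNavigationTiming[];
      if (!nav) return;

      const sample = {
        page: location.href,
        dnsMs: nav.domainLookupEnd - nav.domainLookupStart, // resolver time
        connectMs: nav.connectEnd - nav.connectStart,       // TCP/TLS setup
        serverWaitMs: nav.responseStart - nav.requestStart, // time to first byte
        transferMs: nav.responseEnd - nav.responseStart,    // network transfer
        totalMs: nav.loadEventEnd - nav.startTime,          // full page load
      };

      // sendBeacon queues the report without slowing the page down.
      navigator.sendBeacon("https://example.org/collect", JSON.stringify(sample));
    }

    // Wait one tick after the load event so loadEventEnd is populated.
    window.addEventListener("load", () => setTimeout(reportPageTiming, 0));

Aggregated across many users, samples like these would let ISPs and web developers see where page-load time actually goes, without running any active tests.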

The Internet, network neutrality and permission to innovate

[Commentary] The Internet champions “permissionless innovation,” the ability to develop new services without tedious negotiation and approval. As the Federal Communications Commission makes its third attempt to develop a fair, coherent, and lawful regulatory policy for the Internet’s broadband on-ramps, it can either apply this principle or it can adopt Title II -- a contrary rule that once limited the pace of innovation in the historic telephone network.

Much of the Internet establishment, many ordinary citizens, and even some cable network comedians urge implementing Title II without acknowledging the harm it’s likely to cause. The father of net neutrality, Columbia law professor Tim Wu, is an exception: he admits that “excessive regulation can stagnate an industry” even while preferring monopoly-style regulation for increasingly competitive broadband networks.

[Bennett is a Visiting Fellow at the American Enterprise Institute]

What the FCC’s broadband tests really measure

[Commentary] The data in the “Measuring Broadband America” report released by the FCC on June 18th shows that Americans get the broadband speeds they pay for.

The report plainly says (page 14), “This Report finds that ISPs now provide 101 percent of advertised speeds.” This couldn’t be any clearer. The FCC even places this finding in context by contrasting it with the results from their preceding report, “The February 2013 Report showed that the ISPs included in the report were, on average, delivering 97 percent of advertised download speeds during the peak usage hours.”

The FCC report also engages in a rather peculiar exercise of measuring web page loading speeds and attributing them to ISPs. The FCC’s web page loading time test actually says more about the web server than it does about the network. This is nice data for researchers to have, but it tells us very little about ISP networks. For this to be a meaningful measurement, the FCC would also need to account for CDNs, web server location, and web server response time.
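
As a rough illustration of why that matters, the sketch below splits a single page load into DNS, connection, server wait, and transfer components. The timing numbers are hypothetical, not FCC data; the point is that when the server wait dominates, a faster ISP link barely changes the total.

    // Rough illustration with hypothetical numbers: how much of a page-load
    // time budget belongs to the web server versus the access network.
    interface LoadBreakdown {
      dnsMs: number;        // resolver lookup
      connectMs: number;    // TCP/TLS setup, mostly round-trip latency
      serverWaitMs: number; // time the server spends producing the response
      transferMs: number;   // time to move the bytes across the network
    }

    function shareOfTotal(b: LoadBreakdown): Record<string, string> {
      const total = b.dnsMs + b.connectMs + b.serverWaitMs + b.transferMs;
      const pct = (ms: number) => `${((ms / total) * 100).toFixed(0)}%`;
      return {
        dns: pct(b.dnsMs),
        connect: pct(b.connectMs),
        serverWait: pct(b.serverWaitMs),
        transfer: pct(b.transferMs),
      };
    }

    // Hypothetical case: a distant, slow server reached over a fast access link.
    console.log(shareOfTotal({ dnsMs: 30, connectMs: 70, serverWaitMs: 600, transferMs: 100 }));
    // Server wait is 75% of the total, so the number mostly measures the server.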

It would be good for the FCC to clearly separate ADSL from VDSL and for it to measure speeds up to 1 Gbps. Once it’s done with that, it’s fine for it to try to get a handle on web servers and interconnections, but it appears that the FCC has a long way to go before it really understands what it’s measuring.

The Wireless Innovation Act: Rational spectrum policy

[Commentary] Senator Marco Rubio has made a strong move for spectrum policy leadership, announcing a plan to introduce three bills that substantially reform America’s spectrum policy.

The Wireless Innovation Act (WIA) directs the National Telecommunications and Information Administration (NTIA), the previously toothless agency that coordinates the use of spectrum by federal agencies, to compile recommendations for the transfer of 200 MHz from the government to the people.

The WIA is not the kind of sweeping reform of the government spectrum status quo that we’re going to need eventually, but it’s a strong step in the right direction that tests the resolve of two key constituencies: Senate Democrats and NTIA. Perhaps the best thing about the WIA is the absolute deadlines it applies to government spectrum users: three years from the delivery of the NTIA report, they’ll lose their authorization to use the 200 MHz of spectrum (even if they lose the memo).

This is exactly the kind of sensible, rational, and technologically efficient approach to spectrum that’s long overdue.

[Bennett is a visiting fellow at AEI and was vice-chair of an Institute of Electrical and Electronics Engineers (IEEE) task group]

How to improve federal spectrum systems

[Commentary] I’m developing the idea of creating a Federal Spectrum Service (FSS), a government-chartered for-profit corporation, to serve as the owner of all federal spectrum.

The FSS would control all federal spectrum use and manage it according to a ten-year plan for reducing the federal spectrum footprint in two stages. In the first stage, the FSS would be required to reduce the federal spectrum footprint by 50%, and in the second stage it would be required to reduce it by 50% once again. The spectrum thus liberated would be auctioned for public use.
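
As a back-of-the-envelope sketch of the two-stage target (the starting figure below is a hypothetical placeholder, not a real inventory number), cutting 50% and then 50% of the remainder leaves about a quarter of the original federal footprint.

    // Two successive 50% cuts leave 25% of the original footprint.
    // The starting figure is a hypothetical placeholder for illustration.
    function applyStagedCuts(initialMHz: number, cuts: number[]): number {
      // Each stage removes the given fraction of whatever remains.
      return cuts.reduce((remaining, cut) => remaining * (1 - cut), initialMHz);
    }

    const initialFootprint = 1000; // MHz, hypothetical
    const remaining = applyStagedCuts(initialFootprint, [0.5, 0.5]);
    console.log(`Remaining: ${remaining} MHz`);                    // 250 MHz
    console.log(`Liberated: ${initialFootprint - remaining} MHz`); // 750 MHz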

Once this mission is accomplished, the FSS would cease to exist unless Congress explicitly re-authorized it to continue in some form. The FSS would have the power to meet this mandate, as it would assume immediate ownership and control of all federal systems that use spectrum directly, either as transmitters or receivers. Therefore, the FSS would be able to replace current systems with new ones that would use spectrum more efficiently and to auction the spectrum it frees up for public uses.

As a single entity with control of federal spectrum use, the FSS would not be affected by agency infighting and the fragmentation of spectrum expertise across the panoply of agencies. If the FSS found the PCAST Report’s sharing recommendations sensible, it would be able to test them by having agencies share spectrum with each other.

While all of the liberated spectrum would be auctioned, it wouldn’t all necessarily go to the highest bidder. The proceeds from auctioning federal spectrum would easily pay for the equipment upgrades that would make even more spectrum available.

Net neutrality advocates need to get their facts straight

[Commentary] The Federal Communications Commission’s net neutrality rules are based on the false premise that American broadband services are sub-standard compared to those in other countries.

Advocates who buy this notion believe that network price and quality can only be improved by regulatory action that forces providers to make uneconomic investments. Before we can have a rational discussion about network policy, we need to get the facts straight.

Average broadband speeds in five of the top 10 countries are actually declining, while those in the US are improving. Chairman Wheeler’s Open Internet rules aim to preserve the goose that has laid these golden eggs while protecting America’s innovators and ordinary citizens from the hypothetical harms that can arise in markets with minimal competition.

In short, the proposed regulations permit a degree of experimentation with the pricing of technical services on the Internet provided that the common, baseline service continues to be adequate for the common, baseline set of applications.

The most common complaint emanating from the fainting couches occupied by the (mainly far-left) net neutrality advocates is that the proposed regulations don’t go far enough to preserve the Internet as it has always been. This is an odd standard to apply to a technical system notable for its disruption of traditional industries such as music, journalism, travel, and retail.

Net neutrality advocates also worry that Internet Service Providers have incentives to exploit customers and harm innovation, the sort of fears that any profit-maximizing business inspires. But these incentives are counter-balanced by conflicting incentives to sign up more subscribers and to provide richer services.

Wake up, FCC: The Internet Protocol transition is now

[Commentary] Some 45 years after design work started on the cellular network and the Internet, the Federal Communications Commission (FCC) issued an Internet Protocol (IP) Technology Transitions Order amounting to a reluctant invitation for trials on the decommissioning of the legacy telephone network.

While the telephone network is no longer the centerpiece of telecommunications in the United States or around the world, the FCC is clearly anxious about turning it off, probably because the FCC and telephone network grew up together. This reluctance is apparent in the many obstacles the FCC’s transition order places in the way of the decommissioning trials. While it is worthwhile to ensure that no essential capabilities are abandoned in the transition from the telephone network to its replacement (pervasive broadband networks running IP), it is important for the FCC to approach this transition sensibly.

Gigabit boondoggle unwinds in Chicago

[Commentary] The state of Illinois gave $2 million to Gigabit Squared to wire Chicago’s South Side for ultra-high-speed broadband, and now it wants its money back. This story is more common than you might think, and Gigabit Squared isn’t the only offender.

From Burlington (VT) to Provo (UT), it doesn’t matter whether a city is large or small, or whether it goes it alone or bands together with neighbors: designing, building, and operating a broadband network is harder than running a water system. Very few municipalities have the expertise to do these things successfully, and those who leap before they look are likely to end up wishing they’d examined all possible outcomes before committing their cash. Mooresville (NC) had to lay off police officers, firefighters, and teachers to cover its broadband debt. Outcomes like this are more likely than not.

The main reason we can expect municipal networks to fail is revealed in their goal-setting exercises. There is a perfectly reasonable case for some form of subsidized broadband program in an unserved or severely underserved area: people can gain great benefits from using a smartphone, an iPad, or a laptop to access the Internet, explore educational opportunities, or read the news. But you don’t need the world’s fastest network to do these things; all you really need is a garden-variety DSL, cable, or mobile network. And given those choices, most people will adopt mobile first. As history shows, the speed and capacity of broadband networks generally improve to meet and exceed customer demands.

[Bennett is a visiting fellow at AEI]

Hanging up the telephone network

[Commentary] The story of the transition from the telephone network to the pervasive broadband Internet as the primary means of electronic communication is one of conflict.

The principal means by which the Federal Communications Commission stands in the way of history is through its insistence that it has discovered a “Network Compact” consisting of “enduring values” embedded in the corpus of telecom law that magically pertain not just to the particular historical circumstances around the formation of America’s telephone network, but to all future networks as well.

Effectively, the FCC wants the terms of the Kingsbury Commitment of 1913 to constrain the growth of the Internet, lest something bad might happen. This posture necessarily prevents any number of good things from happening as well, or at least postpones them indefinitely.

The telephone network has been replaced by a multitude of options: the mobile network, wireline broadband networks, satellite services, and public Wi-Fi networks that are often free to use. While Americans have largely abandoned the telephone network, regulators who have invested careers in learning, interpreting, and applying telecommunications law are reluctant to let it go. Hidebound regulators and holdout citizens are the primary obstacles to the complete phase-out of the telephone network and the reallocation of its operational expense to more worthwhile alternatives.

While the Commission is clearly doing its best to serve the public interest as it sees it, it must do better. Preserving the technologies of the past has sentimental appeal, but it’s ultimately counter-productive to delay new technologies that are already broadly accepted and widely used. The expense of maintaining the old networks only retards the construction and use of new ones that are better in every dimension.

[Bennett is a visiting fellow at AEI and has a 30-year background in network engineering and standards]