FCC won’t force all of Title II’s “net neutrality” regulations on carriers, but “we don’t know where things go next”

This article may contain personal views and opinions from the author.

A lot of people and a lot of businesses were cheering the FCC’s predictable 3-to-2 vote to classify the internet as a utility, and thus apply Title II regulations which govern traditional telecommunications services. The dawn of net neutrality is at hand, or so say the proponents.

I can forgive a lot of the younger folks for not understanding what Title II enables the FCC to do. These regulations were written and implemented in the 1930s, signed into law as the Communications Act of 1934, when telephones were the leading edge of consumer technology. That was only 81 years ago; surely those provisions will apply easily to 21st century technology, where a lot of people do not know what a dial tone is, or a step switch, or a twisted pair.

The older folks though? Do you really need to be reminded of the absolute tangle traditional telecom services remain to this day thanks to the maze of federal and state regulations? You know, those rules that dictate how service is provided, where service may be provided, when service may be provided, what fees are attached, and how much the service itself will cost?

What could possibly go wrong? All data will be equal, that’s good right?

300 pages


I get it, it sounds good, it “feels” good. Joe Q. Public doesn’t have to worry about his YouTube feed or Netflix movies getting buffered or potentially blocked. The problem is that no one knows for sure how it will play out, not even the FCC. I can tell you this though: the internet has been open and unhindered by nearly all regulation in the free world, and it has ushered in an unprecedented era of economic growth and technological innovation, generated unheard-of amounts of personal wealth, and enabled unbridled levels of communication. It is a shining example of what could be called the ultimate open-source project.

The arguments for net neutrality aren't technical, and they aren't even business arguments. Yet, they espouse this notion of “free and open” for all. If that is really the case, it shouldn’t take 300 pages of regulations to “help” a system that isn’t broken. Of course, no one but the FCC knows what those regulations are, because they were secret before the vote, and given the modus operandi in Washington, DC, I would be surprised if all five of the voting commissioners actually read them. The rules will hopefully be published in the early part of this week.

After the vote, FCC Chairman Tom Wheeler said in a press conference that the regulatory body was looking at what he termed “bright line” issues (more on the non-bright-line stuff later) – blocking, throttling, and paid prioritization: “…we know about those issues. But we don’t know where things go next.” (emphasis added)

The good and bad from the past


I don’t need to know where “things go next” based on where things have been. Are you familiar with “plain old telephone service”? Otherwise known as POTS, this basic type of service falls under Title II regulations. These rules empower the FCC to dictate pricing, availability, and “just and reasonable practices” for this nation’s telephone system, and the agency still wields that power with a heavy hand to this day, 81 years later.

At the time Title II was implemented, there was one major phone company, Ma Bell (the old AT&T), and a few regional providers like GTE (General Telephone). In the lead-up to, and aftermath of, World War II, one of the major priorities was to ensure equal access to service as the country grew. It was not that there was a lack of demand for service, but in the face of a then-undissolved monopoly, the government laid down some ground rules to ultimately ensure that whenever and wherever POTS was needed, it would be provided. That is still the law of the land today, and it is not unreasonable given how the technology operates.

How the telephone is used by business is heavily regulated. Newer rules put a welcome clamp-down on telemarketers, and I am certain that very few people are upset about that. With every ultra-tight rule comes leakage though, and anyone who has a landline phone knows that auto-dialers spoof caller ID systems and call people, even if they are on the official “Do Not Call” list.

These rules for POTS do not change very often because use of facilities-based services has plateaued or is on the decline. Prices and features must be filed with the FCC and a state’s public utilities agency for review before they can be formally implemented. Those same rules go beyond consumer services; they also play a role in the back-end services businesses use. The vote casts what was once an industry quick to respond to market conditions onto rails where it must now pass scrutiny with another master, as interconnection agreements between providers will now be regulated, and that includes mobile.

Title II has not been rewritten


The FCC says it will not exert many of the restrictions Title II allows upon internet providers and wireless carriers, but therein lies the rub. The FCC did not, and cannot, rewrite Title II – only Congress can do that, and given the impact on interstate commerce this ruling is sure to have, Congress is sure to make a fuss since it is the deciding body on matters of interstate commerce. The fact that the FCC says it will not enforce certain rules or guidelines related to Title II should not give anyone a warm and fuzzy feeling – case in point, when was the last time you believed a politician?

I won’t go on that rant (though I’m dying to); instead I will point out what all this may mean for the consumer. None of the possibilities we face is guaranteed because, remember, “we don’t know where things go next.” First, one must ask: should certain types of data have priority? The answer is an absolute “yes,” and I will start with fundamental features that are just becoming available, or are fairly recent additions to wireless service.

Why data needs to be prioritized


Do you enjoy HD voice on your carrier thanks to Voice-over-LTE? Do you know why the audio is so clear and of overall higher quality? Because VoLTE is nothing more than Voice-over-Internet Protocol (VoIP). VoIP is a method of transporting voice calls over a data network, rather than via a circuit-switched call. In today’s reality, most, if not all, of our traditional calls have some type of IP backhaul, but even so this matter is still relevant.

In the past, VoIP was commonly associated with quality of service (QoS) problems. Part of the reason was that networks did not always give proper priority to the voice packets, so calls would suffer jitter, clipping, or lost bits of audio altogether. VoIP works so well nowadays because network operators overprovision capacity and ensure the voice traffic has priority. In other words, the voice traffic is given plenty of space and green lights along its path. The result is crystal-clear calls.
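
For the curious, that “priority” is usually implemented by marking voice packets in the IP header so routers along the path can queue them ahead of bulk traffic. Here is a minimal Python sketch of the idea using a DSCP Expedited Forwarding mark; the address, port, and payload are placeholders for illustration, not any carrier’s actual VoLTE implementation.

```python
import socket

# DSCP "Expedited Forwarding" (EF, decimal 46) is the marking commonly used
# for real-time voice. Shifted left two bits, it fills the IP TOS byte.
DSCP_EF = 46
TOS_VALUE = DSCP_EF << 2  # 0xB8

# Illustrative RTP-style voice socket; the address and port are placeholders.
# socket.IP_TOS is available on Linux/macOS builds of Python.
voice_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
voice_sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)

# A 20 ms G.711 frame carries 160 bytes of audio payload. Routers that honor
# the EF marking move this packet ahead of queued bulk data.
voice_frame = bytes(160)
voice_sock.sendto(voice_frame, ("192.0.2.10", 5004))
voice_sock.close()
```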

I’m not talking about massive amounts of bandwidth, but that doesn’t matter anymore. Under net neutrality, all data will simply get in line, and it will move as fast as it can based on the available bandwidth in a given pipeline. The network owners will not be permitted to throttle some packets for the sake of others based on content. So, if some of your favorite services degrade during what used to be easily managed congestion periods, don’t fret, you got equal data for all, bro!
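
To see why “everyone gets in line” matters during congestion, here is a toy Python comparison of a single first-come-first-served queue versus one that lets small voice packets jump ahead of bulk transfers. The traffic mix and per-packet transmit times are made up purely for illustration.

```python
# Toy congestion burst: 25 packets hit a busy link at once. Every fifth
# packet is a small voice frame; the rest are bulk data. The per-packet
# transmit times (in ms) are illustrative, not measured.
packets = [("voice", 0.2) if i % 5 == 0 else ("bulk", 12.0) for i in range(25)]

def avg_voice_delay(queue):
    """Average time a voice packet waits before it starts transmitting."""
    clock, delays = 0.0, []
    for kind, tx_ms in queue:
        if kind == "voice":
            delays.append(clock)
        clock += tx_ms
    return sum(delays) / len(delays)

fifo = packets                                         # everyone in one line
prio = sorted(packets, key=lambda p: p[0] != "voice")  # voice jumps the queue

print(f"FIFO     : voice waits ~{avg_voice_delay(fifo):.1f} ms on average")
print(f"Priority : voice waits ~{avg_voice_delay(prio):.1f} ms on average")
# Roughly 96 ms vs. under 1 ms in this toy burst, exactly the kind of delay
# and jitter that used to plague unprioritized VoIP.
```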

How about FaceTime or Skype video calls? You like those, don’t you? Well, giving those applications priority through traditional QoS markings such as DSCP (Differentiated Services Code Point) won’t mean as much, since the network owners cannot prioritize that traffic either, but don’t worry, at least that email didn’t have to wait an extra few milliseconds.

The question of a flood of ultra-high bandwidth content


In my previous example, I focused only on voice and peer-to-peer video, and I know there are limits to the ubiquity of those applications: not everyone uses Skype or FaceTime, and voice activity has been stagnant. Given that, one might argue that our networks have the capacity to make all this stuff work. That could be true for right now, but know this: streaming a 1440p YouTube video on your quad-HD smartphone requires about 6-10Mbps of throughput. Ultra-HD 4K streaming content requires about 15-20Mbps.

Since all data is to be “equal,” all that is needed is for a new service to start offering something extravagant, like a “4K-Only Video Flicks” type service. With proper marketing and a guaranteed low-information consumer base that just wants to see what 4K video is like, the data requests to stream this content en masse could choke a service node in no time (never mind the fact that very few people have gear capable of actually processing the stream), as the throughput requirements are roughly 10 times greater than for 1080p video. The opportunity is there, and someone is madly hammering out code to launch just such a portal, because all the data is equal.

Traditional 1080p needs only about 2Mbps when properly managed and prioritized. We are on the cusp of a lot of advanced content becoming widely available. The consumer hardware and content are walking hand in hand, but our networks are not, and “we don’t know where things go next.”
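
To put those numbers side by side, here is a quick back-of-the-envelope sketch. The 1 Gbps node capacity is an assumption for illustration only; the per-stream bitrates are the rough figures cited above.

```python
# How many simultaneous streams fit through a hypothetical 1 Gbps service
# node at the approximate bitrates cited above?
NODE_CAPACITY_MBPS = 1_000  # assumed aggregation-node capacity

stream_bitrates_mbps = {
    "1080p (managed)": 2.0,
    "1440p": 8.0,    # midpoint of the 6-10 Mbps range
    "4K UHD": 17.5,  # midpoint of the 15-20 Mbps range
}

for name, mbps in stream_bitrates_mbps.items():
    concurrent = int(NODE_CAPACITY_MBPS / mbps)
    print(f"{name:>16}: ~{concurrent} simultaneous streams")

# Roughly 500 managed 1080p streams versus about 57 4K streams on the same
# node: the ~10x jump in throughput discussed above.
```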

Implications for mobile – read between the lines


The FCC says it will not get into approving carrier pricing on new plans, something Title II empowers the regulator to do if it so chooses. Whatever is not deemed a “bright line” item, the FCC will hold to a “standard for future conduct,” whatever that means. To that point, let us look again at POTS. I still maintain a POTS line because I’m just old-school like that, and its reliability is still unmatched in my opinion. The price of traditional telephone service has not gone up or down by any appreciable amount in decades. Why is that? Because the telephone companies have to file tariffs with the state and the FCC before changes to prices or features can be made.

Julie Veach heads the FCC’s Wireline Competition Bureau, and addressing carrier concerns she says, “There is no requirement to come to the FCC to seek approval. There are opportunities to do so if a provider wants to.” Why might T-Mobile or Verizon want to talk to the FCC about a new price plan? Well, under the ambiguous “standard of future conduct” concept, the FCC could offer an “opinion” about a plan and whether it meets that conduct clause.

“There are opportunities to do so...” In any given scenario, these regulations will increase the cost of doing business, and that means either slowing the rate of growth to maintain capital expenditures, or cutting a business’s most expensive asset (or liability): the employee.

You are probably thinking that could be a good thing: a carrier arguably won’t go to the FCC with a rate-increase proposal. What you overlook, however, is that the carrier has no incentive to go to the FCC with a plan to lower rates either. Why would they? They can keep their margins where they are, slowly (or suddenly) bow out of two-year contracts, maybe tighten up the equipment financing options, and leave things where they are. If another provider tries to get into a low-ball price war, the competition can simply file a grievance declaring “unreasonable advantage” or whatever language is chosen to cite the “future conduct” standard – but hey, equal data.

Of course, none of that matters, because according to the FCC’s general counsel, Jonathan Sallet, the regulator could just “decide on its own” that a new plan fails to meet the future conduct standard. Future practices and standards will be judged on a case-by-case basis. So why would a carrier ever offer a new plan again?

The implications of this vote reach far further than anyone is giving them credit for, and frankly, I’m astonished at the lack of vision from the tech corridor. “Utilities” are the most regulated entities in the country. The “feel good” types that are cheering, “Yay! All data is equal, because, yeah man, data, it needs to be equal,” are not thinking things through. None of this has to do with your personal data use, but it may very well impact that use, because “we don’t know where things go next.”

All services are on the table now


Title II also regulates services, meaning that a company that makes commercial use of a “utility” is fair game to fall under scrutiny. Remember, the FCC cannot rewrite Title II, so it has a de facto open door to play a role in the services we use on these data networks, including Google, Facebook, Twitter, your favorite streaming radio service, your email, et al. Don’t believe me? Look up the “fairness doctrine” from the days of radio prior to the 1980s.

Cautious proponents of net neutrality, like Google, have a new headache to worry about now. Google Fiber is now a “utility” and will become a prime target of other utilities. All it will take is a formal complaint that the search giant is “unfairly” leveraging ad revenue from other commercialized services that ride over other parts of the internet (a utility) to suppress prices for services in select markets. Google has flourished in an open and unregulated internet, and while I know Larry Page and the gang are doing great things, they are better off with the Tier 1 providers on their side than against them. Facebook, Twitter, Yahoo!, TV programming delivered over the internet, et al, can all be shepherded under Title II in some way.

“Zero-rate” type services are not currently on the FCC’s radar, so AT&T's sponsored data program, and T-Mobile customers who like to stream music for free, do not have anything to be worried about, at least until someone changes their mind. Roger Sherman heads the FCC’s Wireless Telecommunications Bureau and says it “would be premature” to state whether “certain future plans” would be permitted or not. That means sponsored data may not be permitted, and perhaps other “zero-rate” feature initiatives will go under a bureaucrat’s microscope first. “Would be premature” – if that does not sound like a bureaucrat looking to sink his teeth into something in order to “help,” I don't know what does.

The reason why this matters to wireless, and to wireless consumers, is that mobile providers have fast become a person's primary access point to the internet and all the services available on a mobile device. Besides, once the signal hits a tower, it’s all riding on fiber and copper anyway. Those who argue American broadband initiatives have not kept pace with counterparts in parts of Europe and Japan are overlooking the size of the American continent, and what it takes to build and maintain connectivity on such a vast scale in an area with a fraction of the population density. A mile of fiber in the US serves far fewer people than a mile of fiber pretty much anywhere in Western Europe or Japan. Placing a mile of double-strand fiber costs roughly $30,000 to $50,000, and that still does not account for all variables. Take Manhattan: the borough is a little over 13 miles long. You can bet that there are more than a few hundred pairs of fiber running through it, but just 100 pairs would cost a conservative $50 million to install (maintenance is another issue). Scale that investment across the country, and it represents billions of dollars of infrastructure. Believe me, the providers want you using the net.
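
That Manhattan figure is easy to sanity-check. Here is a rough sketch that, like the estimate above, assumes roughly 13 route-miles, 100 fiber pairs, and an installed cost of $30,000 to $50,000 per mile of a single pair, scaling linearly with pair count.

```python
# Back-of-the-envelope fiber build cost using the figures cited above.
route_miles = 13
fiber_pairs = 100
cost_per_pair_mile_low = 30_000   # USD, low end of the install estimate
cost_per_pair_mile_high = 50_000  # USD, high end of the install estimate

pair_miles = route_miles * fiber_pairs
low_total = pair_miles * cost_per_pair_mile_low
high_total = pair_miles * cost_per_pair_mile_high

print(f"{pair_miles} pair-miles: ${low_total/1e6:.0f}M to ${high_total/1e6:.0f}M")
# 1300 pair-miles: $39M to $65M, which brackets the conservative $50 million
# figure above. Maintenance, permitting, and route specifics are not included.
```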

The 300 pages of rules are expected to be published early this week, but I say again, so what? This vote on Title II is an invocation of ad hoc law and ad hoc regulation of an industry that has flourished without such constraints.

As long as the carriers and the services we use abide by some ambiguous “future conduct” rule, I guess we’re good to go, right? Then again, to borrow the low-confidence-inspiring words so aptly uttered by FCC Chairman Tom Wheeler, “We don’t know where things go next.”

References: FierceWireless and The Wall Street Journal
