
EU urged to ignore net neutrality delusions, choose science instead

Establish facts before making broadband regulations? Might be an idea

By Martin Geddes, 27 Oct 2015

Guest Opinion There are real issues of power, fairness, justice and market transparency in today’s internet. There are real uncertainties over which market structures maximise social and economic benefits. There are real questions about the practicality of different traffic management and charging mechanisms.

But Europe's misguided "Save The Internet" campaign isn't the way forward. As a responsible professional and native European, I would like to summarise why it is imperative for EU regulators to ignore calls to “strengthen” net neutrality if they want to retain their legitimacy.

‘Neutral’ networks do not exist

No packet networks have ever been ‘neutral’, and none ever will be. The idea of ‘neutrality’ is not an objective and measurable phenomenon, as shown by the recent work published by Ofcom.

It is an invention of the legal classes attempting to force novel distributed computing services into a familiar carriage metaphor.

‘Neutrality’ has an evil twin, called ‘discrimination’. Asserting that internet packets are being “discriminated” against is a fundamental misunderstanding of the relationship between the intentional and operational semantics of broadband.

Neither concept is a term of art in performance engineering or computer science.

No scheduling algorithm is ‘(non-)discriminatory’. The assumed intentionality of random processes is false. The idea of ‘defending’ neutrality is thus a pure intellectual nonsense. Regulators who attempt to legislate ‘neutral’ networks into existence will find themselves in collision with mathematical reality.  

Disconnected from actual constraints

Networks have resource constraints. One is capacity, and another is ‘schedulability’. The proposals to prevent ‘class-based discrimination’ fatally ignore the scheduling constraints of broadband. They require a cornucopia of resources (that don’t exist) to resolve all scheduling issues (which can’t happen) via an unbounded self-optimisation of networks (that is beyond magical).
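The ‘schedulability’ constraint the author invokes is a real result from real-time scheduling theory. As one illustration (my own, not from the article), the classic Liu & Layland utilisation bound shows that even a simple fixed-priority scheduler cannot guarantee every flow’s deadlines once total demand crosses a threshold, no matter how it is tuned:

```python
# Illustrative sketch (not from the article): the Liu & Layland bound from
# real-time scheduling theory. Even a fixed-priority scheduler can only
# guarantee deadlines while total utilisation stays under a hard limit --
# there is no "cornucopia of resources" to schedule everything.

def utilisation(flows):
    """Total fraction of link time demanded by periodic flows.

    Each flow is (service_time, period): it needs service_time units of
    transmission every period units of time.
    """
    return sum(c / p for c, p in flows)

def rm_schedulable(flows):
    """Sufficient test for rate-monotonic schedulability (Liu & Layland 1973).

    n flows are guaranteed to meet deadlines if total utilisation does not
    exceed n * (2**(1/n) - 1), which tends to ln 2 (~0.693) as n grows.
    """
    n = len(flows)
    bound = n * (2 ** (1 / n) - 1)
    return utilisation(flows) <= bound

# Three flows that fit comfortably (utilisation 0.3)...
light = [(1, 10), (2, 20), (3, 30)]
# ...and three whose combined demand (utilisation 1.2) no scheduler can meet.
heavy = [(5, 10), (4, 10), (3, 10)]

print(rm_schedulable(light))  # deadlines can be guaranteed
print(rm_schedulable(heavy))  # no traffic-management rule can fix this
```

A regulation that mandates particular scheduling behaviour implicitly asserts that the second case can be made to look like the first, which no mechanism can deliver.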

Regulators who attempt to direct traffic management will find themselves sabotaging the customer experience and a sustainable cost structure. They will also be held accountable for the global QoE outcomes of interventions made at the level of local mechanisms. This won’t end well. There is no entitlement to performance.

Taking this issue further, discussions around ‘throttling’ or ‘slowing down’ implicitly assume that there is some kind of entitlement to good performance from ‘best effort’ broadband. Yet there is nothing ‘best’ or ‘effort’ about it.

The service’s performance is an emergent effect of stochastic processes. Performance is arbitrary, and potentially nondeterministic under load. Anything can happen, good or bad! That’s the ‘best effort’ deal with the performance devil.

That means that when disappointment happens (as it must), its effects are unmanaged. So how does unpredictable and arbitrary performance help the development of the market? It doesn’t. Given this dynamic, it seems perfectly reasonable for ISPs to bias the dice to ‘speed up’ apps whose performance lags, and ‘slow down’ ones that are being over-delivered resources.

Think of it as 'less arbitrary disappointment', rather than 'better effort'. Regulators who attempt to sustain the illusion of universal and perpetual entitlement to high quality at the price of low quality are in for a rough ride.  
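The claim that best-effort delay is arbitrary under load can be made concrete with a toy single-queue simulation (my construction, added for illustration, not the author’s model). As offered load approaches link capacity, both the mean and the tail of the waiting time blow up:

```python
# Toy M/M/1-style queue (an added illustration, not from the article):
# packets arrive at random and are served at unit rate. As offered load
# approaches capacity, waiting time becomes large and highly variable --
# the 'performance devil' of best-effort delivery.
import random

def simulate_waits(load, n_packets=20000, seed=1):
    """Return per-packet waiting times for a single queue at a given load."""
    rng = random.Random(seed)
    t = 0.0          # arrival clock
    free_at = 0.0    # time the server next becomes idle
    waits = []
    for _ in range(n_packets):
        t += rng.expovariate(load)        # inter-arrival time, rate = load
        start = max(t, free_at)           # wait if the server is busy
        waits.append(start - t)
        free_at = start + rng.expovariate(1.0)  # service time, rate 1
    return waits

for load in (0.5, 0.9):
    w = sorted(simulate_waits(load))
    print(f"load {load}: mean wait {sum(w) / len(w):.2f}, "
          f"99th percentile {w[int(0.99 * len(w))]:.2f}")
```

Queueing theory predicts the mean wait roughly scaling as load/(1 − load): modest at half load, an order of magnitude worse at 90 per cent. Nothing ‘discriminates’ against the packets that wait longest; the variance is intrinsic to statistical sharing.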

‘Specialised’ services are an illusion

Every application has a performance constraint in order to be useful. Any attempt to define (and possibly restrict) the availability of predictable performance will hit three barriers.

Firstly, there cannot be an objective definition of ‘specialised’. It’s in the eye of the beholder. All my applications are ‘special’. Aren’t your digital children ‘special’, too? Secondly, applications are a form of speech, so you need to regulate classes of privileged speech, which hits both constitutional and human rights problems.

Thirdly, you assume that there are no legitimate ‘editorial’ decisions over the allocation of performance that ISPs can undertake. This is like saying to a newspaper that it cannot choose where to position its classified ads versus its news stories.

Regulators who try to create aristocratic classes of application, or insist all must be equal serfs, are dooming their population to performance misery.  

‘Fast lanes’ already exist, and guess what? They’re just fine

Application developers already buy CDNs to achieve higher performance at lower cost. This is seen as being a core feature of a workable net. Paid peering agreements with performance SLAs also exist. Non-IP telecoms services compete for users and usage with IP-based ones (eg ATM, MPLS, TDM).

So-called ‘fast lanes’ also aim for predictable performance, just at lower cost than other telecoms services. (We also need ‘slow lanes’ for predictable low cost, which may compete with the postal service.) The purported disaster is contradicted by decades of experience.

Indeed, the first ISP ‘fast lane’ was built to service the needs of the deaf for reliable sign language. Banning the ordinary development of broadband technology will mean these people are left with a simple choice: go without, or buy an expensive non-IP telecoms service to get the timing characteristics they need. Banning ‘fast lanes’ visibly harms users.

Regulators risk ridicule if they strongly regulate pricing of services with assured timing characteristics based on which transport protocol they are using.

‘Congestion’ discussions are Not Even Wrong

The idea of ‘congestion’ (whether ‘imminent’ or not) profoundly misses the point and reality of packet networks. The raison d’être of packet networks is to statistically share a resource at the expense of allowing (instantaneous) contention. Networks safely run in saturation are a good thing.

In other words, we would ideally like to be able to have as much contention as possible, to lower costs, as long as we can schedule it to deliver good enough user experiences. The discussions offered around ‘congestion’ are beyond irrelevant, they are simply meaningless. Genuinely, they fall into the category of ‘not even wrong’.

It’s like playing chess against a pigeon. Regulators face a simple choice: either there is a rational market pricing for quality (that developers must participate in), or there is rationing of quality. Which one do you want?
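The statistical-sharing point can be put in numbers with a back-of-envelope model (my illustration, not from the article): if each of N bursty sources is transmitting at peak rate only a fraction of the time, the shared link needs capacity for far fewer than N simultaneous peaks to keep contention below any chosen target:

```python
# Statistical-multiplexing sketch (an added illustration): N bursty sources
# each transmit at peak rate with probability p at any instant. Sharing lets
# us provision far less than N x peak while keeping the probability of
# contention below a target -- contention is the design, not a failure.
from math import comb

def capacity_needed(n_sources, p_active, overflow_target):
    """Smallest k such that P(more than k of n sources are simultaneously
    active) <= overflow_target, via the binomial distribution."""
    cumulative = 0.0
    for k in range(n_sources + 1):
        cumulative += (comb(n_sources, k)
                       * p_active**k
                       * (1 - p_active)**(n_sources - k))
        if 1 - cumulative <= overflow_target:
            return k
    return n_sources

n, p = 100, 0.1
k = capacity_needed(n, p, 1e-3)
print(f"{n} sources, each active 10% of the time: provision for {k} "
      f"concurrent peaks, not {n}")
```

The cost saving comes precisely from accepting a small probability of contention; ‘eliminating congestion’ would mean provisioning for every peak at once, throwing the economics of packet networking away.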

A broken theory

The underlying theory of ‘net neutrality’ advocates a virtuous cycle of innovation. The more users there are, the more applications get written, which drives more users. The leap is then made to ‘neutrality’. This Utopian ideal (single class of service, ‘best effort’, users pay all performance costs) supposedly maximises the flywheel effect.

The presumptive basis is to minimise risk and cost to developers, and maximise choice for users. This theory is flawed in five key ways:

  • It assumes applications get the predictable performance they need. We can be sure that many applications don’t exist today because the performance of ‘best effort’ is unpredictable, so by definition they aren’t written and don’t get traction.
  • It assumes that all users and developers are internalising their costs. They are not. Many applications are effectively pollution of a shared resource, and protocols are aggressively fighting for finite resources.
  • It assumes there is no cost of association. A flat global address space where everything is reachable may sound attractive, but it comes with non-zero security and routing costs.
  • It assumes that developers are entitled to write distributed applications with no engineering costs for performance (eg issuing profiles to DPI vendors, marking traffic). This is delusional.
  • It assumes there is a mechanism for users to configure performance directly when needed. Today, that is absent.

Regulators that attempt to sustain today’s mispricing of performance will find their rules incentivise a mis-allocation of resources, open up market arbitrages, and repel capital from the telecoms industry.

What should Europe do? Ignore the lawgeneers, and be scientific

The FCC went ahead and made rules about ‘net neutrality’ without getting its technical house in order first. This was done at the behest of cohorts of well-funded lobbying lawyers.

As a result it has put at risk the FCC’s credibility, since those rules are in conflict with the technical and economic reality of broadband. The article cited here is merely an exemplar of a sizable body of academic literature on ‘net neutrality’.

This literature exists in a self-referential citation bubble disconnected from actual broadband network operation. A common failing is to call for ‘faster than math’ packet scheduling.

This does our industry and society a disservice, and harms the credibility of the institutions whose names are attached to these works. Their authors' misguided attempts to control the definition and direction of ISP services must be resisted.

I strongly urge European regulators to ignore these campaigning ‘lawgeneers’. They have no ‘skin in the game’, so suffer no consequences for their pronouncements based on false technical assumptions.

This is a form of ‘moral hazard’. At least ISPs have a stake in the long-term viability of their services. The way forward is for regulators to establish a solid body of scientific knowledge within which the necessary debates can occur.

This needs to be done by stochastics experts and computer scientists, not lawyers. The one (and only) thing that should be ‘neutral’ is the resulting framework in which a debate over justice and fairness is held.

In particular, broadband has performance and cost constraints. So what are they? We can then have a policy debate that sits within those constraints, just as spectrum policy respects the laws of physics and electromagnetism.

Ofcom has laudably made such a move to establish a basis of scientific fact from which to make broadband regulations. It has cleanly separated the science and policy issues. This process needs to continue and spread.

If you would like to join a movement for reality-based regulation, please do feel free to get in touch to discuss how this might be brought about.

A version of this article appeared at Martin's blog, and is reprinted with permission.

The Register - Independent news and views for the tech community. Part of Situation Publishing