EU urged to ignore net neutrality delusions, choose science instead

Establish facts before making broadband regulations? Might be an idea

Guest Opinion There are real issues of power, fairness, justice and market transparency in today’s internet. There are real uncertainties over which market structures maximise social and economic benefits. There are real questions about the practicality of different traffic management and charging mechanisms.

But Europe's misguided "Save The Internet" campaign isn't the way forward. As a responsible professional and native European, I would like to summarise why it is imperative for EU regulators to ignore calls to “strengthen” net neutrality if they want to retain their legitimacy.

‘Neutral’ networks do not exist

No packet networks have ever been ‘neutral’, and none ever will be. ‘Neutrality’ is not an objective, measurable phenomenon, as recent work published by Ofcom shows.

It is an invention of the legal classes attempting to force novel distributed computing services into a familiar carriage metaphor.

‘Neutrality’ has an evil twin, called ‘discrimination’. Asserting that internet packets are being “discriminated” against is a fundamental misunderstanding of the relationship between the intentional and operational semantics of broadband.

Neither concept is a term of art in performance engineering or computer science.

No scheduling algorithm is ‘(non-)discriminatory’. The assumed intentionality of random processes is false. The idea of ‘defending’ neutrality is thus pure intellectual nonsense. Regulators who attempt to legislate ‘neutral’ networks into existence will find themselves in collision with mathematical reality.
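To see the point concretely, consider deficit round robin, a classic scheduling algorithm (Shreedhar & Varghese, 1995). The Python sketch below is a minimal toy version of my own, with invented flow names for illustration. Which packet goes next falls out of simple arithmetic over queue state; there is no ‘intent’ anywhere for a regulator to point at.

```python
from collections import deque

class DeficitRoundRobin:
    """Toy deficit round-robin scheduler. Each flow earns a byte
    quantum per round and may send while its head packet fits the
    accumulated credit. The service order is pure arithmetic."""

    def __init__(self, quantum_bytes):
        self.quantum = quantum_bytes
        self.queues = {}   # flow_id -> deque of packet sizes (bytes)
        self.deficit = {}  # flow_id -> accumulated byte credit

    def enqueue(self, flow_id, packet_bytes):
        self.queues.setdefault(flow_id, deque()).append(packet_bytes)
        self.deficit.setdefault(flow_id, 0)

    def round(self):
        """One service round; returns (flow_id, bytes) in send order."""
        sent = []
        for flow_id, q in self.queues.items():
            if not q:
                continue
            self.deficit[flow_id] += self.quantum
            while q and q[0] <= self.deficit[flow_id]:
                sent.append((flow_id, q[0]))
                self.deficit[flow_id] -= q.popleft()
            if not q:
                self.deficit[flow_id] = 0  # empty queues keep no credit
        return sent

# Hypothetical flows, purely for illustration
sched = DeficitRoundRobin(quantum_bytes=1500)
sched.enqueue("voip", 200)
sched.enqueue("video", 1400)
sched.enqueue("video", 1400)
print(sched.round())  # [('voip', 200), ('video', 1400)]
```

Call the outcome ‘fair’ or ‘discriminatory’ as you please; the mechanism itself is indifferent.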

Disconnected from actual constraints

Networks have resource constraints. One is capacity, and another is ‘schedulability’. The proposals to prevent ‘class-based discrimination’ fatally ignore the scheduling constraints of broadband. They require a cornucopia of resources (that don’t exist) to resolve all scheduling issues (which can’t happen) via an unbounded self-optimisation of networks (that is beyond magical).
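To make ‘schedulability’ concrete, here is a toy admission test in Python. It is an illustrative assumption of my own, not any operator’s real algorithm: a flow set can fit comfortably within raw capacity and still be unschedulable, because a timing bound cannot be met.

```python
def schedulable(flows, link_bps):
    """Toy admission test: total rate must fit the link AND each
    flow's delay bound must exceed a crude worst-case FIFO wait
    (one maximum-size packet from every other flow drains first)."""
    if sum(f["rate_bps"] for f in flows) > link_bps:
        return False  # the capacity constraint fails outright
    for f in flows:
        backlog_bits = sum(g["max_pkt_bits"] for g in flows if g is not f)
        if backlog_bits / link_bps > f["delay_bound_s"]:
            return False  # capacity is fine, but timing cannot be met
    return True

# Invented numbers: one 5 ms voice flow sharing with five bulk flows
voice = {"rate_bps": 100_000, "max_pkt_bits": 1_600, "delay_bound_s": 0.005}
bulk = [{"rate_bps": 1_000_000, "max_pkt_bits": 12_000, "delay_bound_s": 1.0}
        for _ in range(5)]
print(schedulable([voice] + bulk, link_bps=10_000_000))
# False: only 5.1 Mbps of demand on a 10 Mbps link, yet the voice
# flow can wait 6 ms behind full-size packets, past its 5 ms bound
```

Capacity was never the binding constraint here; schedulability was.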

Regulators who attempt to direct traffic management will find themselves sabotaging the customer experience and a sustainable cost structure. They will also be held accountable for the global quality-of-experience (QoE) outcomes of interventions made at the level of local mechanisms. This won’t end well.

There is no entitlement to performance

Taking this issue further, discussions around ‘throttling’ or ‘slowing down’ implicitly assume that there is some kind of entitlement to good performance from ‘best effort’ broadband. Yet there is nothing ‘best’ or ‘effort’ about it.

The service’s performance is an emergent effect of stochastic processes. Performance is arbitrary, and potentially nondeterministic under load. Anything can happen, good or bad! That’s the ‘best effort’ deal with the performance devil.
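To make the performance devil less abstract, the sketch below simulates an M/M/1 queue, the textbook toy model of a statistically shared link (my choice of model for illustration, not anything a real ISP publishes). As load approaches saturation, the tail of the delay distribution detaches from the median: same network, wilder dice.

```python
import random

def mm1_delays(load, n=50_000, seed=1):
    """Simulate an M/M/1 queue at the given utilisation (service
    rate fixed at 1.0) and return per-packet sojourn times."""
    rng = random.Random(seed)
    t = depart = 0.0
    delays = []
    for _ in range(n):
        t += rng.expovariate(load)   # Poisson arrivals
        start = max(t, depart)       # wait if the server is busy
        depart = start + rng.expovariate(1.0)
        delays.append(depart - t)
    return delays

for load in (0.5, 0.9, 0.99):
    d = sorted(mm1_delays(load))
    print(f"load {load}: median {d[len(d) // 2]:.1f}, "
          f"p99 {d[int(len(d) * 0.99)]:.1f}")
# Medians stay modest; the 99th percentile explodes near saturation
```

The same stochastic machinery that makes the service cheap makes its disappointments unpredictable.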

That means that when disappointment happens (as it must), its effects are unmanaged. So how does unpredictable and arbitrary performance help the development of the market? It doesn’t. Given this dynamic, it seems perfectly reasonable for ISPs to bias the dice to ‘speed up’ apps whose performance lags, and ‘slow down’ ones that are being over-delivered resources.

Think of it as 'less arbitrary disappointment', rather than 'better effort'. Regulators who attempt to sustain the illusion of universal and perpetual entitlement to high quality at the price of low quality are in for a rough ride.  

‘Specialised’ services are an illusion

Every application has a performance constraint that must be met for it to be useful. Any attempt to define (and possibly restrict) the availability of predictable performance will hit three barriers.

Firstly, there cannot be an objective definition of ‘specialised’. It’s in the eye of the beholder. All my applications are ‘special’. Aren’t your digital children ‘special’, too? Secondly, applications are a form of speech, so restricting ‘specialised’ services means regulating classes of privileged speech, which runs into both constitutional and human rights problems.

Thirdly, you assume that there are no legitimate ‘editorial’ decisions over the allocation of performance that ISPs can undertake. This is like telling a newspaper that it cannot choose where to position its classified ads versus its news stories.

Regulators who try to create aristocratic classes of application, or insist all must be equal serfs, are dooming their population to performance misery.  

‘Fast lanes’ already exist, and guess what? They’re just fine

Application developers already buy CDN capacity to achieve higher performance at lower cost. This is seen as a core feature of a workable internet. Paid peering agreements with performance SLAs also exist. Non-IP telecoms services (eg ATM, MPLS, TDM) compete for users and usage with IP-based ones.

So-called ‘fast lanes’ also aim for predictable performance, just at lower cost than other telecoms services. (We also need ‘slow lanes’ for predictably low cost, which may compete with the postal service.) The disaster that is promised is contradicted by decades of experience.

Indeed, the first ISP ‘fast lane’ was built to serve the needs of deaf users for reliable sign language communication. Banning the ordinary development of broadband technology would leave these people with a simple choice: go without, or buy an expensive non-IP telecoms service to get the timing characteristics they need. Banning ‘fast lanes’ visibly harms users.

Regulators risk ridicule if they strongly regulate pricing of services with assured timing characteristics based on which transport protocol they are using.

‘Congestion’ discussions are Not Even Wrong

The idea of ‘congestion’ (whether ‘imminent’ or not) profoundly misses the point and reality of packet networks. The raison d’être of packet networks is to statistically share a resource at the expense of allowing (instantaneous) contention. A network safely run in saturation is a good thing.

In other words, we would ideally like as much contention as possible, to lower costs, as long as we can schedule it to deliver good enough user experiences. The discussions offered around ‘congestion’ are beyond irrelevant; they are simply meaningless. Genuinely, they fall into the category of ‘not even wrong’.
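For the arithmetic behind that trade-off, here is a back-of-envelope sketch of statistical multiplexing gain, using independent on/off sources — a textbook simplification with invented numbers, not a model of any real network.

```python
import math

def capacity_needed(n_flows, p_active, peak_rate, target=0.999):
    """Smallest capacity covering aggregate demand `target` of the
    time, for n independent flows each active with probability
    p_active at peak_rate (binomial model)."""
    cdf = 0.0
    for k in range(n_flows + 1):
        cdf += (math.comb(n_flows, k)
                * p_active**k * (1 - p_active)**(n_flows - k))
        if cdf >= target:
            return k * peak_rate
    return n_flows * peak_rate

n, p, r = 1000, 0.1, 5.0  # 1,000 flows, 10% duty cycle, 5 Mbps peaks
print(capacity_needed(n, p, r))  # ~650 Mbps covers 99.9% of instants
print(n * r)                     # vs 5,000 Mbps to banish contention
```

Provisioning to eliminate contention outright costs roughly eight times more than scheduling around it; that difference is the economic point of packet networks.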

It’s like playing chess against a pigeon. Regulators face a simple choice: either there is rational market pricing for quality (in which developers must participate), or there is rationing of quality. Which one do you want?
