Nick Carr's Big Switch
A computer revolution - or the next Fail?
Review Nick Carr's weblog is one of the rarest things on the web: intelligent technology criticism that you'd actually want to read for pleasure. He's an elegant writer with a waspish wit, and I've a special reason for seeing him prosper.
Back in 2002 I was living in San Francisco, a city then in the depths of recession, when I first noticed the stirrings of the next wave of hype. Hope springs eternal, they say, and the Bay Area's unemployed web monkeys, technology prophets, and a gaggle of marketing consultants - who had all been having a jolly good time until quite recently - began to figure out how to construct the next bandwagon.
The result is another web mania gripping the media. This one isn't quite like its predecessor, however.
For a start, it's much more limited in scope. It's rhetorical, rather than economic. While the original dot.com bubble will always be remembered as one of the biggest losses of wealth in human history, prompting ordinary investors to plunge their life savings into worthless stocks, the new web hype has been a much more modest affair. This time the asset bubble is property, not technology, and most internet users have simply carried on as before, happy to swap dial-up for broadband in the quest for idle chatter, free music and porn.
The "Web 2.0" affliction has so far only infected the media and political classes, with isolated outbreaks in marketing and the social sciences. (Naturally, you'd expect something created by ad consultants to hit ad consultants hard, but I didn't expect the London media to fall for it the hardest.) But where it strikes, it seems to take over the unfortunate victim's entire brain; and that's still a lot of people with public policy influence. The zombie symptoms of the virus we all know today: gibbering about "new democracy", "wise crowds", and the rational faculties of a three-year-old.
For three years I found myself the only journalist chronicling such phenomena as the new democracy that wasn't, or the paradigm-shifting business revolution that couldn't make money, or the global intelligence that was easily outwitted by trinket salesmen, or the encyclopedia that destroyed universities. This was the Dawn of a New Punditocracy.
Fortunately I had lots of help from readers, who've coined many of the pithiest descriptions of the web bubble. Lots and lots of help. The Reg readership includes a lot of people who implement technology, and then have to keep the systems running - and the distaste is quite visceral. (Most of you rumbled quite early on that this web hype was presentation layer people trying to solve system level problems, all the while hiding behind a lot of New Age marketing guff).
Pointing this out made me hugely unpopular with a small number of people (who'd figured out that these tools and processes could so easily game the media and promote their agenda), and who naturally resented the lid being lifted. But this all-sweeping utopianism needed many more hands to pry apart. For the past two years, Nick Carr's RoughType blog has done that job with style to spare.
The Big Switch is Carr's second book, and is the first serious examination of "Web 2.0" in book form. It's a good companion to Steve Talbott and Clifford Stoll, lone critics of the first dot.com hype, and beautifully written. Its target market is people who don't read the blog (or El Reg, probably), and so are in particular need of a sanity check.
The Big Switch is deeply frustrating, however, because it's really two parts of two larger books.
The first part takes the form of a literary conceit: it's an extended essay comparing the build-out of the "world wide computer" to the commercialization of electricity, and the creation of the electric grid. The rest of the book looks at the social and economic consequences of this ubiquitous computing facility - and that's where much of the best insight lies.
The Electric Carr
The first thing you'll notice is that the two part-books don't really need each other. By Chapter 7, when the second book gets underway, you'll be wishing it was much longer and more involved. It's in this part of the book that Carr discusses the consequences of the web that are generally overlooked. We know from RoughType that Carr has an impish eye for the revealing detail - with other authors you wish they'd done the research, or drawn the connections; with Carr, we already know he has.
But getting to Part Two first involves negotiating Nick's Edison analogy. How well does this hold up? Carr reminds us that the electric grid gave rise to new industries such as sound recording; it lit cities; it made refrigeration possible; whole new cities grew from the inventions electricity enabled. It was a dramatic chapter in the industrial revolution. Could the ubiquitous computer grid do the same, asks Carr, before predicting:
"The future of computing belongs to the new utilitarians," he writes.
But the metaphor doesn't work if you view digitization as simply the latest chapter in the "electricity revolution", just as electricity was part of the industrial revolution. We expect too much from a metaphor if we ask it to stand alone. For example, we already think it almost quaint that music came in physical containers - rather than on tap. It's possible to argue that this was a historical blip. But the mass production of sound recordings, and the low cost sound reproduction devices, began with electricity. You can argue that there's a continuum here, rather than a history punctuated by revolutions.
Then there are more practical objections to this all-sweeping metaphor.
One of the consequences of ubiquitous computing is that there's simply so much more stuff to do things with, and for the first time in a decade I've found myself wishing for greater local CPU horsepower and bus bandwidth. I'm no musician or film maker, but a mere civilian who finds processing digital media cumbersome and slow. There's a limit to what we can do with the service bureau model of computing, even at its most optimal, because there's so much we want to do that's better performed locally - even if it makes less sense to do so economically.
Carr is right when he points out that twenty-somethings view web services such as Gmail, Flickr and Facebook as utterly natural; and Salesforce is a great success story of a bureau service being adopted in the workplace. But this low-hanging fruit may be all the harvest basket ever sees. It makes lots of sense to put computing in the cloud, just as it makes lots of sense to put people into mass transit - yet the demand for cars simply continues to rise - they offer real value. So while Carr writes that "the Switch" will take many years, the book is short on suggestions on what the cloud will absorb next, and the sweeping historical metaphor is left to do the donkey work.
More recently, it's also become apparent that this, the "2.0" version of the cloud, has severe limitations - and here we're back to those presentation layer people trying to do system level tasks, again.
The website Uncov gives regular updates on the sheer ineptness of this, the Web 2.0/AJAX version of client/server computing - and it's hard not to conclude that we're many years away from solving some of the basic technical problems.
"A 1GHz Athlon can barely handle text editing in this new Web 2.0 thingy," writes Ted Dziuba in this piece. "Maybe I need a faster computer?"
We have no shortage of web-based office suites today, but the gap between reality and aspiration is huge, and it needs some explaining. When asked about the progress of his artificial intelligence (AI) work, MIT's Marvin Minsky used to suggest that only a few tweaks - a nut tightened here, or there - were required before success could be declared. Web 2.0 is in a similarly pitiful state, and needs a bit more than new tools or languages - a cultural change is required in the class of people who tend to the stoves. (My suggestion would be to shoot the web jockeys, and employ some people with physics or computer science backgrounds.)
This isn't addressed either - and the author must fall back on the Big Switch metaphor once more.
In fact, Google's huge datacenter build-out may have more to do with creating and controlling the middle tier, becoming an uber-Akamai (it's cannily thrown its weight behind the "Net Neutrality" campaign, which seeks to hamper its telecoms rivals from doing the same thing), rather than software as a service.
Then there are the things we naturally recoil from - security and privacy fears raised by ubiquitous data sharing. Carr describes these very ably in Part Two - for example, noting the widespread revulsion at AOL's search data being distributed as research material. But because he sees the cloud as the future, this leads him to fatalistic conclusions that (fortunately) aren't always the most logical or likely outcome. If there were a Society for the Prevention of Cruelty to Metaphors, it might have intervened with a petition.
Fortunately that's only a third of the book, and after hearing from Nick the Futurist (Nick1), Nick the Journalist (Nick2) picks up the baton.
Evangelists and the net deity
Jeff Jarvis is a former soap critic for a TV listings magazine who has exploited the business of striking fear into media companies better than most. Thanks to "Web 2.0", Jarvis is now considered a media guru, and the BBC and Sky pay handsomely for his advice. For Carr, he's simply the "blogosphere's resident philistine". 'Nuff said.
Carr identifies what these utopians have in common quite clearly - and it's a pseudo-religion: the final chapter is called iGod. He's excellent at pointing out some of the consequences of technology the utopians ignore, such as the body count. Self-styled "revolutionary" utopians always brush aside the consequences of their advice: the ends justify the means.
The web prophets invariably ignore the sheer hopelessness of today's internet for sustaining creative business. This is a deep structural problem: because everyone can get hold of anything in this anarchy, there's none of the scarcity provided by a limited choice of TV channels or movie theatres - and scarcity creates economic incentives for both distributors and creators. Yet for the utopians, some business "model" will pop up and, in an act of deus ex machina, save the day.
For Carr, correctly, this just isn't good enough. He eschews bluster and his cool analytical approach pays dividends in these passages. He also points to "The Great Unbundling" that's the result of "me media".
A recent scoop here came too late for inclusion, but adds weight to this argument. A cross-section of the music business commissioned Capgemini to examine why the value of music had fallen - when music had never been so popular. The analysts suggested that the largest single factor in the shortfall was unbundling (along with the entry of the supermarkets) - rather than P2P file sharing.
Of course, this doesn't make uncompensated music consumption right or fair - because of P2P, one never has to pay musicians for their music again - but it does show how the destruction of value is more subtle than many people appreciate.
And from here Carr joins the dots, pointing out that when making money is hard, market consolidation results, with power ending up in far fewer hands than it did before. That's a real heresy that the childlike minds of the web evangelists simply can't handle. They're really only in it to take part in a simulation of a revolution.
(Paradoxically, the example above suggests that music can only really be "sold" as a service - strengthening the book's view that this is a new kind of service industry).
The future hasn't been written
Robert Kahn, the "father of the internet", likes to point out that packet networking is still in its infancy, and there's nothing inevitable about the direction of its development. Unfortunately, in making a teleological argument in Part One, Carr rather ties his own hands when it comes to looking at the consequences. The result is two Nicks battling to steer the argument their way. Nick1 has seen the future, but Nick2 begs to differ.
For example, Nick1 writes:
"The internet is not simply a passive machine, it's a thinking machine of a sort - a rudimentary one, that actively collects and analyzes our thoughts and desires."
"Figuring out new ways for people - and machines - to tap into the storehouse of intelligence is likely to be the central enterprise of the future [our emphasis]".
But as Nick2 has already suggested, this is not intelligence, merely the detritus of what's left behind.
"Surfing the web captures the essential superficiality of our relationship with the information we find in such quantities on the internet," he writes.
Carr knows that the patterns created by the semi-conscious web surfer aren't worth a diddle. On RoughType, he's explained that it's the boundaries between things that give them value. He's been a devastating critic of Kevin Kelly's enthusiasm for the web's "liquid fabric", as opposed to plain old books.
Similarly, Nick1 suggests:
"In Google search and [Amazon's] Mechanical Turk - we are beginning to see the human mind merging into the artificial mind of the world wide computer."
We know that's a far-fetched claim, because Nick2 has already pointed out that this is neither "mind" nor "mindful". The superficial relationship with the internet means that Wikipedia - a simulation of an encyclopaedia - is only really harmful when we believe it. So the data mining will pay dividends only when you're selling superficiality - Chuck Norris jokes, for example - and is unlikely to be of much use to anyone else. Marketeers, beware.
A fatalism also creeps into some of the conclusions. Carr suggests the internet is an instrument of control, and reaches a quite dystopian conclusion: as we build a machine that watches us, he argues, we become more mechanical.
Again, that's a fascinating argument to pursue, but the idea isn't developed. History suggests we need not be so pessimistic. Systems of control tend to be undone by our desire not to play a mechanical or reductive part in them. We always tear up the script in the end.
Religions have succeeded because they return something meaningful to the believer. But if technology utopianism is a quasi-religion, as Carr shrewdly suggests, then it's a pretty lousy one. Like a real religion, this iGod demands endless sacrifice (constant attention to one's "reputation" or Facebook profile, or giving up the right to be paid if you're a creative artist). But in place of salvation or karma, iGod merely offers enlightenment through selfishness, and comes with a really unattractive eschatology - the "singularity". Without strong bonds, people can slip away from this iGod very easily.
(What a pity that the great academic computing folly of our age, Tim Berners-Lee's Semantic Web, receives no critical commentary at all. And in their war on copyright, the utopians have created, in their imaginations, an all-embracing demon as powerful as anything in Medieval literature).
It might be harsh, but not entirely unfair, to say that Carr has allowed himself to be swayed by something he is usually so good at skewering, the historical inevitability argument. It suits the Web evangelists to argue that we're on the brink of a revolution - it drives up their fees. There's no need for a critic to feel quite so constrained.
So of the two mini-books in The Big Switch, one leaves many unanswered questions, while you wish the other went on for much longer. That's not bad, though, and it certainly isn't fatal.
In a few years, academics will be able to look back and compare The Big Switch with, say, Chris "Long Tail" Anderson's next book. I'm sure they'll be tickled by the comparison. The WiReD editor's forthcoming offering is a manual for desperados called Free. "Round down!" urges Anderson: "if the unitary cost of something is approaching zero, treat it as zero and sell something else."
It isn't hard to tell which of these two authors is trying to sell you a failure. With their economics broken, and their revolution liberating no one, the old WiReD-era evangelists have nothing left to say, except to rhapsodize failure. If only lemmings could read, then Anderson would be the King of Norway.
And anyway - how many other technology books can you think of that leave the audience wanting more?
(I have one suggestion for Carr and his publisher. There's a precedent, and Nick's already written the material: a 'best of' selection from RoughType would make an excellent companion volume). ®