Singapore invests in TIA snake oil
Poindexter's dream comes true
Comment Retired US Admiral and convicted felon John Poindexter has been a busy man since Congress scrapped his Total Information Awareness (TIA) system and punctured his Orwellian dream of linking every government database imaginable in pursuit of evildoers, as Wired News reports.
Indeed, the Iran-Contra scandal alum has taken a seat on the board of BrightPlanet, a provider of data extraction and mining tools, and has just been speaking in Singapore, where officials welcomed him to the unveiling of his TIA counterterrorist brainchild, finally sold to an unsuspecting government.
The Singaporean version of TIA, currently a working prototype scheduled for deployment in the Fall, is called Risk Assessment and Horizon Scanning (RAHS). Its development has been influenced by two former consultants for Poindy's TIA system, John Peterson of the Arlington Institute, and Dave Snowden of Cognitive Edge.
Peterson bears the marks of a techno-utopian who might be harmless if he weren't involved in something this serious. His mission statement says it all, in words we have heard a thousand times, from a thousand tech-enthused lotus-eaters. He speaks of an "exponential increase of human knowledge, and the acceleration of its application through technology", which is "propelling humanity towards a new era of thought and endeavour". He believes that "we are living in an era of global transition, to a degree that our species has never seen before."
An "exponential increase of human knowledge" indeed. Noise, not knowledge, is the thing that information technology is helping to increase exponentially. Instant messaging, email, social networking, BlackBerries, mobile phones, cable television, satellite radio...noise, noise, noise. Everyone's got something to say, and everyone is saying it. But still no cure for cancer, as they like to point out at Fark.com.
Snowden appears more pragmatic, displaying prominently on his website the model Poindexter used to impress the masses at the RAHS Symposium, and taking time to disagree with it to some extent. Still, like any good pragmatist, Snowden reckons there might be some useful bits in what Poindy had to say on the subject of data mining, and he intends to evaluate those bits with greater care.
Point-and-drool national security
We have heard a technologically illiterate mainstream press agonising over the notion that major terrorist attacks in the West could have been prevented if only our tech-savvy national security geeks had "connected the dots" in time. If only they had been able to separate the signal from all that noise.
Of course, all the data needed to detect the impending attacks in New York, Madrid, and London existed, but there has been no end to speculation by bureaucrats, legislators, and naive journalists that these atrocities could have been prevented if only the right sort of information technology had been in place at the time. If only some magic filter could have sifted through the noise and saved those people's lives.
Such a push-button solution is what everyone wants, so it, or rather an illusion of it, is what companies like BrightPlanet, Cognitive Edge, and scores of others are selling. "Instead of having analysts trawl through huge amounts of data to decide what it means, the data is tagged very quickly, then they decide what the patterns in the metadata mean," Cognitive Edge's Snowden explains to Wired.
The idea of some massively complex "system" silently beavering away, sifting and gathering and analysing disparate bits of data from the vast ocean of noise surrounding us has an appeal that is universal and everlasting. A silent guardian, a tireless mechanical brain; a slave intelligence without distractions like hunger and thirst and sexual urges, immune to fear, anger, laziness, or pettiness. What a wonderful world it would be.
But it's Peterson, not surprisingly, who provides the money quote, feeding the dream with words that every tech believer longs to hear. "Essentially, [RAHS is] a strategic tool that ties together every one of the agencies in a government into a large network that is constantly scanning the horizon looking for weak signals that point toward the possibility of a significant event that would have important implications for Singapore," Wired quotes him as saying.
There are other, more meaningful ways to describe what Peterson means by his exceptionally vague phrase, "weak signals". Vague phrases are useful whenever a more accurate, more precise one would exude an unfortunate air of truth, and here one is deployed with care. "False positives" would give us some precision in place of "weak signals", and it invokes one of the defining features of data mining, a moderately useful marketing tool now promoted to the status of a national security crystal ball.
Only, it's never going to work. For example, in the past five years we've seen our airports become hubs of data mining and analysis. Not surprisingly, we've seen many thousands of innocent passengers detained, questioned, bullied, inconvenienced, and embarrassed, while not one terrorist has ever been caught. The rate of false positives appears to be one hundred per cent.
And as for false negatives, surely, in the past five years, at least a few terrorists have flown commercially, and perhaps quite a few. They're not being caught because, unlike the dumb technological tools deployed against them, human adversaries learn. When one thinks of data mining as a threat, one takes steps to avoid detection. Innocent people don't take steps to protect themselves so they get "caught" every day. Meanwhile, the terrorists run rings around the national security agencies and their magic machinery.
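The arithmetic behind this failure is the base-rate problem: when genuine threats are vanishingly rare, even an impressively accurate screening system drowns its handful of true hits in a flood of false alarms. A minimal sketch, using entirely hypothetical figures (the passenger volume, detector accuracy, and threat count below are illustrative assumptions, not numbers from any real system):

```python
# Base-rate arithmetic for a screening system (all figures hypothetical).
# Assume: 10 genuine threats among 700 million passenger journeys a year,
# a detector that flags 99% of real threats (sensitivity) and wrongly
# flags 0.1% of innocent passengers (false-positive rate).

passengers = 700_000_000
threats = 10
sensitivity = 0.99
false_positive_rate = 0.001

true_alarms = threats * sensitivity
false_alarms = (passengers - threats) * false_positive_rate

# Chance that any given flagged passenger is actually a threat.
precision = true_alarms / (true_alarms + false_alarms)

print(f"False alarms per year: {false_alarms:,.0f}")
print(f"Chance a flagged passenger is a real threat: {precision:.4%}")
```

Under these assumptions the system raises roughly 700,000 false alarms a year, and the odds that any individual flagged passenger is a genuine threat are a small fraction of one per cent, which is why, in practice, nearly every "detection" an analyst sees is an innocent traveller.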
The pony in the manure pile
Data mining is good for feeding targeted advertisements to likely punters. It can improve returns on an advertising investment by increasing the likelihood that a consumer will actually find a particular product or service interesting, although it is still an incredibly blunt instrument. Still, if it increases the response rate to an advertisement from, say, two per 1,000 to six per 1,000, it's a real money saver.
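The advertising arithmetic is easy to check. A quick sketch, taking the article's two-per-1,000 and six-per-1,000 response rates and adding hypothetical cost figures (the per-mailing cost and campaign target below are assumptions for illustration only):

```python
# Why tripling an ad response rate saves money.
# Response rates are from the article; cost figures are hypothetical.

cost_per_mailing = 0.50      # assumed cost to reach one punter
target_responses = 600       # assumed campaign goal

untargeted_rate = 2 / 1000   # response rate without targeting
targeted_rate = 6 / 1000     # response rate with data-mined targeting

untargeted_cost = target_responses / untargeted_rate * cost_per_mailing
targeted_cost = target_responses / targeted_rate * cost_per_mailing

print(f"Untargeted campaign cost: ${untargeted_cost:,.0f}")
print(f"Targeted campaign cost:   ${targeted_cost:,.0f}")
```

With these assumed figures the targeted campaign reaches the same 600 respondents for a third of the outlay, which is exactly why advertisers tolerate data mining's bluntness: a 99.4 per cent miss rate is still profitable when a missed prospect costs pennies rather than lives.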
But is it a real life saver? We have seen data mining in action in airports, and it appears that every single "detection" has been a false positive. Meanwhile, an unknown number of undesirables continue to move about via commercial air travel. We can't know how many times this has happened, but it doesn't matter. Even if it's happened only once, the rate of false negatives, too, is 100 per cent.
Terrorists and criminals are caught when they make mistakes. They confide in the wrong person and are ratted, or their communications are intercepted, or they arouse suspicion in the real world because of their behaviour. The so-called Millennium bomber, Ahmed Ressam, was not picked out from any data set; he acted strangely in front of a border guard and was investigated based on a gut feeling which turned out to be spot on.
So when Snowden talks about "analysts trawling through huge amounts of data" as if this were some discredited, archaic practice, we should bear in mind that people, unlike machines, have instincts and gut feelings sharpened by years of experience, which make dealing with data a far more productive, if less convenient, business. People recognise patterns; machines stink at it. And people imagine patterns, which is where the intuitive process comes in. We need it, because that human intuitive process, however flawed, is the best protection we've got, and ever will have, against human adversaries who imagine, and think, and learn.
But Singapore will give this gimmick a whirl, and so long as it can afford the thing, and doesn't actually rely on it, little harm will be done. The mere fact of its being in use, however, makes it more likely that other countries will adopt it, so it will likely spread. And eventually it will be relied on, though the worst will happen nevertheless. Instead of securocrats hauled before Congress to answer the question "how did your people fail to connect the dots?", government CIOs and CTOs can answer the question, "how did your multi-billion-dollar miracle system fail to connect the dots?"
Unfortunately, no one will accept the only true answer: "we do our best, but in spite of it all, bad things inevitably happen". Rather, there will be a call for easier access to more data and more legal power to demand it. "Give us the tools we need, Senator, and we will deliver."
Governments across the globe are already engaged in data mining and analysis to a degree unimaginable a decade ago. But much of it is confined to single agencies. The next logical step, in keeping with Poindexter's ambition, is to unite the databases. It's not going to work, and it can well be criticised on grounds of wasted money and resources -- but from a privacy point of view, really, who cares at this point? If the FBI is already reading my email and listening to my phone calls without a warrant - if the TSA is already scouring my credit history every time I book a flight - why should I care if the DOD can as well?
We might as well invite everyone to the privacy-invasion party. If one loses one's virginity to a single rapist, one doesn't retain 'more virginity' than one would if one had lost it in a gang rape. ®