
Data ethics in IoT? Pff, you and your silly notions of privacy

Children will die, companies will shout 'sue me then,' and you'll still be using Facebook

By Gareth Corfield, 26 Oct 2016

IoT World Congress The future of personal data sharing is that “everything will become as-a-service” and nobody will own any property outright ever again, a gloomy lawyer told a wide-ranging data ethics discussion at IoT Solutions World Congress this afternoon in Barcelona.

Painting this cheery picture was Giulio Coraggio of international law firm DLA Piper. He was sitting on a panel discussion about data ethics, along with half a dozen other speakers who all disagreed about the ethics of data use and privacy within the Internet of Things.

“With the digital innovation we will not own anything. We will not own our car, there will be car sharing; we will not own our house. Everything will become as-a-service,” cried Coraggio. “People who now don’t care much about their privacy, they will see their privacy as their main asset.”

Uplifting stuff, for sure. But he makes a good point: the old adage that the user of a free-to-use service is the real product being sold still holds true today, as a glance at any social media network confirms.

“We should think about data ethics as an industry-wide obligation,” countered David Blaszkowski, a former regulator and the MD of the Financial Services Collaborative. “The IoT industry has the chance from the beginning to do the right thing.”

Whether or not it has the chance to do the right thing, in some cases there are only different shades of doing the wrong thing. Prith Banerjee, CTO of Schneider Electric, cut in with an extended and slightly confused metaphor about a driverless car deciding to plunge into a bus full of schoolkids: “40 children may die!”

Alarming stuff, but not quite the "data ethics" discussion the audience had been led to expect. Some got up and left.

“A machine collects so much info that it’s almost impossible to understand why it took the decision to go off the road or to crash into the bus,” offered Coraggio. “If we tackle these issues you’ll have to structure your technology to make some ethical decisions. Otherwise it’ll be potential litigation, and litigation certainly will be triggered if your car goes into a bus with 40 children; you will lose.”

Edy Liongosari, chief research scientist and global MD of Accenture, the discussion moderator, valiantly tried to get things back on track: “How should we be thinking about the ethics part of this? What can people do?”

Sven Schrecker, Intel’s chief architect for IoT security solutions, said that lawyers should definitely not be left in charge of making the IoT industry pay attention to data ethics. “A bad way to solve this problem is with litigation. Put liability on and those business drivers will go the wrong way: ‘maybe I can afford to be sued?’ One way to do it, but not a good way.”

As for the data being quietly slurped up about you and me and stored for later analysis, Schrecker hit the nail on the head when he said: “The privacy violations are not a single exposure or release of info, but an aggregation of little, unrelated pieces of information where, when you put them together, you see a completely different picture. Over time, without knowing it, we have given up these little morsels of information. We’ve actually given up a great deal of privacy.”

Panellist Derek O’Halloran contemplated the scale of the task facing the minority of the IoT industry that actually takes data privacy seriously: “If all we had to do was solve [the problem] for privacy, our jobs would be a lot easier. If all we had to do was solve for security, everything would be fine. If all we had to do was solve for economic activity, we’d open it all up. It’s the balance between these that makes it so difficult.”

A wise observation, and one that put the extended robocar-bus-mass-child-killing metaphor into perspective. Scaremongering helps nobody and has already caused large chunks of the non-technical world to tune out security warnings almost completely. Considering the real problems that can – and need to – be solved today is the key.

O’Halloran concluded: “This balance between what’s good for the individual and what’s good for society, how do we resolve that? We’re not going to resolve it because it’ll be an ongoing discussion, as it always has been.”

Back to normal jogging, then. All hail the inevitable rise of Red Dwarf’s Talkie Toaster as humanity's new overlord. ®

The Register - Independent news and views for the tech community. Part of Situation Publishing