2016: The Rise of the Intelligent (cloud) Machines

Only smart survives the cloud consolidation

Review of 2016 Blame Mark Zuckerberg. Not for the election of Donald Trump as US president, but for Artificial Intelligence becoming the trend du jour in enterprise tech circles in 2016.

Back in those now forgotten days of January, before The Great Inversion of 2016, Zuckerberg was surely kicking his heels when he set himself that “personal challenge” to build a simple AI, which he said would be “[l]ike Jarvis from Iron Man - to help run my home and help me with work.”

Since then, Zuckerberg’s challenge has taken on a different, rather more pressing dimension, one which even the delivery of his actual Jarvis couldn't distract us from: lifting Facebook out of the mire of fake news.

Whether Zuckerberg was first to the punch or simply the cipher for a trend, he succeeded in putting the stamp of AI on 2016 - at least in enterprise circles - and the year broadly became one when anything AI-related could generate headlines.

But if your idea for December was a piece of AI more sophisticated than a voice-activated version of Nest controlling all Zuck’s domestic appliances - and not just his central heating - sorry. Like British supporters of membership of the European Union and those Americans who voted for Hillary Clinton, the year didn’t quite go your way.

Turns out Zuckerberg's AI project was a journey, not a destination - a learning “experience”. Dare I say, a machine-learning experience. And that was the point.

What 2016 represented was a small step on a very long road to – well, somewhere towards an AI future. Of a kind.

For a subject that boils down to two letters, AI is complicated.

What we got this year wasn’t AI – not even the building blocks of AI. In fact, it was entirely possible that when people talked AI or read about AI, not everybody was on the same page or talking about the same thing.

Like intelligence in general, Artificial Intelligence isn’t born - it must be learned. Hence 2016 was more like the year of Machine Learning (ML).

And it wasn’t just about ML pure and simple, because beneath ML and AI are sub-technologies such as neural networks and natural language processing, and beneath these are parallel processing, data and advanced algorithms. You also need the frameworks to run them and tools to build them. In that respect, what 2016 was actually about was the tools and frameworks needed to achieve ML.

After several years of Siri and Cortana, 2016 was about handing the tools for building AI to others – to those outside the techies in Apple's Cupertino or Microsoft's Redmond HQ responsible for the personal assistants that primed the pump of AI.

Only the big names of enterprise tech couldn’t call what they were doing something so prosaic as “developer tools” - they just couldn’t. Too in the weeds. That would have marginalized the pitch. For West Coast tech, a really big vision was needed, and depending on who you talked to that really big vision was the “democratization of AI,” “mainstream machine learning” or “easy” deep learning. Different words, but a common theme, from Microsoft, Google and Amazon’s AWS.

What they meant was AI and ML that was accessible, affordable and usable.

AI and ML were a convenient play for the cloud providers, too. ML requires masses of parallel compute power and huge volumes of data. That says cloud.

But beyond sexy soundbites and throwaway headlines, what did “democratization” look like? In a word: Tools.

Vendor AI plays

Google drew on its history of creating and publishing big architectures to seed the market and embed its technologies.

It updated TensorFlow, the open-source framework for ML you use to feed computers large amounts of information, which the search giant uses in its speech recognition, Gmail and Google Photos. Released to beta was Google Cloud Machine Learning, a managed service for building machine-learning models using TensorFlow. Google ML uses Google assets such as Dataflow, Storage and BigQuery, thus helping serve its late-to-the-game cloud to customers.
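For the uninitiated, “feeding computers large amounts of information” looks rather less mystical up close. Here is a minimal, hypothetical sketch - toy numbers, nothing Google ships - of the kind of job TensorFlow gets put to, fitting a straight line to a handful of points using the Session-style Python API of the day:

    # Toy linear regression in 2016-era TensorFlow: learn y = Wx + b from
    # four made-up points. All names and numbers here are illustrative.
    import tensorflow as tf

    xs = [0.0, 1.0, 2.0, 3.0]          # inputs
    ys = [1.0, 3.0, 5.0, 7.0]          # targets (the line y = 2x + 1)

    W = tf.Variable(0.0)               # weight the model must learn
    b = tf.Variable(0.0)               # bias the model must learn
    x = tf.placeholder(tf.float32)
    y = tf.placeholder(tf.float32)

    loss = tf.reduce_mean(tf.square(W * x + b - y))               # mean squared error
    train = tf.train.GradientDescentOptimizer(0.01).minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for _ in range(1000):                                     # "feed" the data, repeatedly
            sess.run(train, feed_dict={x: xs, y: ys})
        print(sess.run([W, b]))                                   # heads towards [2.0, 1.0]

The Cloud ML pitch was, in essence, running this sort of thing at scale on Google's infrastructure rather than your own.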

Google provided data to stuff into your ML framework of choice, too, releasing eight million video URLs from YouTube as YouTube-8M, along with URLs to a further nine million images under Open Images, forcing even more, er, learning on it.

Microsoft drew on its own historical pull among developers, established during the reign of Windows.

Redmond pushed Azure Machine Learning for you to configure pre-built models, recommendation engines and decision trees, alongside software for the statistical computing and graphics language R from Revolution Analytics, which it bought in 2015.

The Cortana Analytics Suite was rebranded the Cortana Intelligence Suite, with APIs added for natural methods of communication.

With an eye on headlines, Microsoft tackled bots. It released a Bot Framework for developers to build NodeJS and C#-based conversation bots for text/SMS, Office 365, Skype, Slack and the web; there was a second beta of the Microsoft Cognitive Toolkit for speech and image recognition; and a new Cortana SDK.

Take a bot called Tay

But it was a Microsoft bot named Tay that really got the headlines – for the wrong reasons.

Described as an experiment in “conversational understanding”, Tay was supposed to learn the more you talked to “her”, but what Tay demonstrated was the benefit of receiving a good education - and that, given the “wrong” data, even a well-intentioned bot can turn bad.

Tay was released on Twitter but, within 24 hours of appearing, Microsoft’s cool bot had evolved from saying that humans are cool to reciting a stream of racist, misogynist and Donald Trump (covering, but not necessarily limited to, both of the former) remarks. Microsoft pulled Tay but the firm was back in December, albeit chastened, with Zo - an invitation-only bot available through messaging app Kik. Zo was to receive a private education.

But where was AWS, the prime cloud mover and the one responsible for pulling along the other two?

AWS released a Deep Learning AMI to make distributed deep learning “easy”, while the Alexa-based Echo personal assistant arrived in the UK in September, but by year’s end all AWS could offer was a bashful: we’ve been doing ML for years on Amazon.com’s listings and recommendation engines. We just don’t talk about it.

Amazon selected MXNet from half a dozen frameworks as its official TensorFlow - the framework on which it would build its future ML and AI efforts. Deemed scalable enough by the giant, MXNet comes courtesy of the University of Washington and Carnegie Mellon University, and Amazon called on open-sourcers to adopt it too.
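For a flavour of what the king-makers actually get, here is a minimal, hypothetical MXNet snippet - nothing to do with Amazon's internal use of it - showing the NDArray building block the deep learning layers sit on:

    # A few lines of MXNet's Python NDArray API; shapes and values are made up.
    import mxnet as mx

    a = mx.nd.ones((2, 3))             # 2x3 array of ones on the CPU
    b = mx.nd.full((2, 3), 2.0)        # 2x3 array filled with 2.0
    c = (a + b) * 2                    # element-wise maths, scheduled asynchronously
    print(c.asnumpy())                 # forces evaluation and copies to NumPy

Swap the default CPU context for a GPU - or several - and the same code runs on accelerators, which is the scalability Amazon was talking up.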

That was it? No, AWS did announce the text-to-speech conversion service Amazon Polly, image analysis tool Amazon Rekognition and Amazon Lex for building “conversational user experiences” into applications. That was it? AWS promised “more” in 2017. That was it? Yes, Dorothy, that was it.
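To give a sense of what those services look like from the developer's side, here is a hedged sketch of calling Polly from Python with boto3 - it assumes an AWS account, credentials already configured, and a late-2016 boto3 release that knows about the new service; the region, voice and filename are arbitrary choices for illustration:

    # Ask Amazon Polly to read a line aloud and save the MP3 it returns.
    import boto3

    polly = boto3.client("polly", region_name="us-east-1")

    response = polly.synthesize_speech(
        Text="Yes, Dorothy, that was it.",
        OutputFormat="mp3",
        VoiceId="Joanna",              # one of Polly's stock voices
    )

    with open("that_was_it.mp3", "wb") as f:
        f.write(response["AudioStream"].read())    # streaming body -> file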

It took 10 years for cloud to reach where it is: going from a coders' platform for dev and test to a fundamental part of an organisation's IT infrastructure - something the CIO hadn't just heard of but cared about.

Arguably, 2016 will be looked back on as the start of a similar 10-year cycle for ML. The big vendors in cloud IT infrastructure recognised that and have started small by targeting developers.

RedMonk analyst Steven O’Grady encapsulated the power developers have over IT adoption some time ago, calling them “king-makers”. They are the gatekeepers of technology in an enterprise. Developers turned Windows into a repeatable success for Microsoft, sucked open-source into corporate IT, and established AWS’s momentum when everyone else was looking the wrong way.

Only there was one more thing. Cloud by 2016 was becoming an established story – yes, still growing, with more to come. But what became clear is that to win the king-makers you had to stay fresh.

2016 saw two traditional names of enterprise IT surrender on cloud.

Fluffy white smarts

Hewlett Packard Enterprise (HPE) offloaded its entire software business to Micro Focus, with development of its OpenStack Helion cloud going to the tiny, Micro Focus-owned Linux spinner SUSE. The Reg revealed Cisco would in 2017 kill its $1bn Intercloud public service. Best not to mention Oracle, another big name from traditional enterprise IT, late on infrastructure cloud and vying for position a long way behind.

Late, lagging and lacking momentum are nowhere to be in cloud - as HPE and Cisco accepted.

Microsoft has spent the best part of a decade trying to close the gap on AWS, adding the same or similar features only for AWS to move the target, moving up into the application layer or adding new types of compute instance for HPC.

After nearly a decade, Microsoft has proximity. Moreover, it has developers, developers, developers. And, thanks to copious open-sourcing, Google also has developers.

It was therefore a telling sign of the arrival of AI and ML that these rivals should take that most traditional of industry steps and form an alliance - the Partnership on AI to Benefit People and Society - with IBM, DeepMind and Facebook. The Partnership will conduct and publish research on AI and develop best practices.

As with web services, identity, high-performance computing, open-source and data centers in years before, those competing in a new and emerging field have contained their commercial differences to create a big market. And, as before, expect peeking over shoulders, sharing of notes, new ideas and standards.

So, how is this outbreak of democratisation faring in the real world?

The king-makers are eating up the code, judging by the number of forks of TensorFlow on GitHub. But beware early signs of victory. The true test of a technology’s genuine uptake is not measured by forks or downloads, but by long and sustained adoption and a critical mass. For every Java there is a Ruby on Rails: a gravitational giant versus a star that burned brightly but briefly, seemingly ready to change everything but didn’t.

And for the real, real world?

I spoke to corporations on both sides of the ML fence in 2016: online retail giant Ocado, a Google customer, took on TensorFlow to streamline its call-center email handling and speed up responses. Success. The head of digital for a rail firm, who shall remain nameless, reckoned in the spring that he was talking to different people about using bots, and said he expected a bot to be part of his company's customer support - alongside video and support articles - within a year. When I checked back, things sounded a little more equivocal: the talk had shifted to “assessing the maturity of bot technology” and the role bots “may” play in customer support.

2016 was certainly the year ML crossed into enterprise tech - at least conceptually. And in hindsight, 2016 may be seen as the equivalent of 2006 - the year Amazon launched AWS - a year that marked the start of a journey for a technology platform from beginnings, through maturity, to mass uptake, one that didn't just change the fortunes of a single firm but reshaped an industry.

Much will depend on how the cloud providers serve their rulers, the king-makers, in 2017.®
