Get a GRIP! Robolution ain't happening until TOUCH is cracked

Forget computer vision and AI – why Ocado's on the money

Predictions are rife about the millions of repetitive, administrative and operative roles set to be decimated by automation over the coming years.

Robots could undoubtedly make a positive contribution to the UK's pitiful levels of productivity and make up for the dearth of people willing to take on the many tedious roles most likely to be affected.

Robots are more commonly associated with manufacturing but for British retailers, they present a new opportunity as companies try to keep pace with consumer demand, increased competition and lower prices.

They are a way to speed delivery while controlling costs.

It's one of the reasons why Ocado, the world's largest online food retailer, is eyeing robots that could pick and pack products as part of its vision for highly automated warehouses.

The online retailer has three warehouses in the UK, serving 50,000 lines to just over half a million customers.

At Ocado's oldest warehouses, in Hatfield and Dordon, human packers pick items for customers' orders into grey Ocado bags that are preloaded inside Ocado's red delivery crates that move down a conveyor-belt system.

The individual goods are retrieved from storage further back and delivered to packers' stations via crates travelling on tracks spanning buildings more than one million square feet in size.

An unlikely innovator

Ocado's newest warehouse, in Andover, has introduced a grid of network-controlled robots that retrieve and drop goods to human packers in a facility equal in size to a football pitch over several levels.

But Ocado's technology division has begun piggybacking off a collaborative £7m EU-funded research project across several universities to create a robust, cost-effective and safe robotic arm that could take on packing.

The focus of that project? A soft robot hand suitable for handling fragile objects as well as hard ones, without much detailed knowledge of an item's shape – in other words, more like a human hand.

Sounds simple, but the problem – and it's one that will apply to robots in wider society – is how to deal with an unlimited variety of products or scenarios without the need for constant reprogramming.

Because for all the talk about robots in the workplace, mechanical hands – as we know them – aren't very good at distinguishing objects or changing their grip to accommodate them.

In the retail setting, just like a human hand, the robot's appendage must be able to pick up and transport easily damaged and unpredictably shaped objects, such as a piece of fruit, without crushing them in a vice-like grip, while also managing something as substantial and solid as, say, a litre bottle of water.
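To make the problem concrete, here's a deliberately naive sketch in Python – the product names and force values are invented for illustration, not anything Ocado uses. A gripper driven by a hard-coded, per-item force table works right up until the first item nobody anticipated, which is exactly the constant-reprogramming trap described above.

```python
# Illustrative only: hypothetical products and grip forces, not real data.
GRIP_FORCE_NEWTONS = {
    "strawberry": 0.5,        # fragile: barely more than its own weight
    "water_bottle_1l": 12.0,  # rigid and heavy: needs a firm hold
}

def grip_force(product: str) -> float:
    """Look up a pre-programmed grip force for a known product."""
    try:
        return GRIP_FORCE_NEWTONS[product]
    except KeyError:
        # Every new or oddly shaped item needs a human to add an entry --
        # the "constant reprogramming" problem the article describes.
        raise ValueError(f"no grip profile for {product!r}")
```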

Oh yes, and if robots could replicate the quality control aspects of the picking job, that would also be good.

This isn't a problem limited to retail. If robots are to succeed in the wider workplace we need multi-function machines capable of working in different settings, rather than specific robots built for specific jobs, or ones shut out of certain areas of the economy because they can't pick up and handle objects.

Enter the Soft Manipulation project

In many ways, Ocado Technology is picking up the baton for the business community as a whole, and it's working with researchers at a group of European universities through the Soft Manipulation (SoMa) project.

SoMa is using a compliant, hand-like gripper with spring-like properties that almost envelops the object rather than grasping it at a few contact points, thereby reducing the pressure applied to its surface.

One such compliant gripper that has already been put to the test in an Ocado test facility is the RBO Hand 2 developed by the Technische Universität Berlin (TUB), which uses flexible rubber materials and pressurised air to passively adapt grasps and enable safe, damage-free picking of objects.

Testing using artificial fruit in special storage trays appears to be yielding some success, with the hand "able to successfully grasp a variety of shapes" using the bottom and walls of the tray to help it pick up the objects.
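The appeal of this design is that the rubber does much of the thinking: the control loop can stay simple because the hand's compliance moulds the fingers to whatever it touches. A minimal sketch of that control idea, assuming a hypothetical pneumatic Gripper interface – this is not the RBO Hand 2's actual control code:

```python
import time

SAFE_CONTACT_KPA = 15.0  # assumed pressure at which fragile items are held
STEP_KPA = 1.0           # inflate the air chambers in small increments
TIMEOUT_S = 5.0

def envelop_grasp(gripper) -> bool:
    """Inflate until contact, letting passive compliance do the shaping.

    `gripper` is a hypothetical device object with inflate(), deflate()
    and contact_pressure() methods -- stand-ins for a real pneumatic API.
    """
    deadline = time.monotonic() + TIMEOUT_S
    while time.monotonic() < deadline:
        gripper.inflate(STEP_KPA)
        if gripper.contact_pressure() >= SAFE_CONTACT_KPA:
            return True  # the fingers have wrapped the object; stop here
        time.sleep(0.05)
    gripper.deflate()    # nothing enveloped within the timeout: release
    return False
```

Note that there is no model of the object's geometry anywhere in the loop – that is precisely what the compliant hand buys you.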

From a retail perspective alone, Alex Harvey, head of automation and robotic systems at Ocado, admits the opportunities are massive – not just in slashing the cost of warehousing but also in improving the customer proposition on price and quality of pick. Robots, unlike employees, can't blame poor judgement on simply having a bad day.

"This is a natural extension of our desire to continuously improve our proposition," he says. Retail giant Amazon is also investing heavily in this area, undoubtedly driven by the same bottom-line benefits.

The longer-term vision is not only to automate the process of picking and packing any of Ocado's own 50,000 products but also to improve the machines' sensory capabilities, which could identify whether a fruit or vegetable is ripe enough for shipping. A fully automated picking and packing robot would by all accounts be a killer proposition as Ocado looks to further develop its Ocado Smart Platform outsourced supply chain offering for the retail market.

But it's still early days, and Ocado's thousands of warehouse employees don't need to worry about losing their jobs any time soon. Academics admit that the challenge of creating a robot capable of carefully picking up any object, regardless of its shape, size and texture, is a long way off being solved.

Getting touchy-feely

Ocado's Harvey also accepts there are tough challenges ahead, not least working out how robots can repack shopping when something gets damaged or a box is too full. "This is a very long-term goal and isn't likely to be something we solve in the next 12 months. Adding more sensing is something we think is likely to be necessary to complete this successfully."

There's a broader imperative for robots to get touchy-feely – with the practical applications of bots that can sense objects in a more human way potentially extending way beyond the confines of retail warehousing to other areas. There is industry-wide demand for robots that can manipulate objects that are not neatly organised in rows, for example picking fruit from trees.

But non-industrial applications – for example the use of cobots, robots that coexist and cooperate with humans in a service environment – will also rely on hands that are soft, comfortable and safe. "This is an important robotics problem," says Antonio Bicchi, professor of robotics at the University of Pisa and a senior researcher at the Italian Institute of Technology in Genoa, one of the academic collaborators on the SoMa project.

Bicchi should know. He has spent much of his academic career – since 1984 in fact – pondering the robotic challenge of grasping and manipulating objects. Bicchi says the mechanical complexity of the problem should not be underestimated. "You need to be able to move every joint independently so there are many mechanical parts that can break and it's difficult to programme and can take many hundreds of thousands of lines of code."

Robotics has got very good at shadowing and repeating a move with great delicacy and precision, and machine vision and other sensing are improving robots' ability to make slight adjustments, "but being able to interact and respond to a dynamic environment is still incredibly difficult," says Ian Hughes, a robotics expert at 451 Research.

But not everyone agrees that processing power – the capacity of a robot's brain – is the limiting factor.

The sheer amount of data needed to replicate soft touch isn't a particular problem, Bicchi says. "Big data and machine learning approaches are having incredible success. But when it comes to manipulating objects you'd need to have robots that manipulate millions of objects millions of times.

No mean feat

"Learning to act physically is not the same as manipulating digital data. You have to learn the ropes by providing a physical substrate and practice. There's nothing you can do but have experience of touching and feeling."

The crux of the problem, Bicchi says, lies with the fact that we still don't properly understand the hugely complicated but largely instinctive human process of manipulating objects. The human hand is packed with nerves that act as massively complex sensors, allowing us to seamlessly modulate the way we touch and handle objects. Replicating that with robotics is no easy task, he warns.

"Embodied intelligence refers to the fact that human intelligence is really made by the coexistence of brain and body. Some behaviour is due to the way the body is realised and it contradicts the classical approach to AI such as robots playing chess, for example. Humans are intelligent because they can operate in the environment and that influences the way the brain works," Bicchi explains. "It means we have to understand how the human brain controls the human body."

Ocado's Harvey agrees that recreating everything the human hand can do is incredibly complicated, and the challenge of developing very general control algorithms that can be adapted simply across the whole product range, using a limited amount of hardware, is exacerbated by the fact that robots and humans work side by side. "It would be easier to work out and test the algorithms if all the picking were done by robots. Having robots and humans working side by side increases the challenge."

Dr Nathan Lepora is a senior lecturer in engineering mathematics and programme director for the MSc in Robotics at the University of Bristol. He leads a lab of about ten people that makes 3D-printed tactile sensors and hands, and develops algorithms for their use. That lab was recently awarded a £1m grant from the Leverhulme Trust for research.

Engineering millions of years

Lepora's research focuses on having soft hands that can feel what they're holding, thus further extending their potential application. "Because all of this is new, it isn't known how important a sense of touch is for robots, just that present, non-tactile robot hands are far poorer than our own hands, and that humans can't handle objects if our sense of touch is compromised, for example by local anaesthetic.

"It's about bringing together tactile hands and algorithms based on how your brain works. Why is it so difficult? The human hand has evolved over tens of millions of years of evolution. So replicating that in a robotic device is a challenge. There have been major advances in making soft compliant hands in the last few years but if your hand can't feel you can't pick up so our research is adding that sense of touch".

The 3D-printed sensors developed at Bristol cost approximately £500 per fingertip, a fraction of the cost of other similar sensors on the market, "but that's only part of the battle. The hard bit is having the algorithms that can interpret information from that hand to manipulate it. This is a software issue."
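To make that "software issue" concrete, here is a toy Python sketch of one tiny piece of it: deciding, from two successive fingertip pressure maps, whether a grasp is stable or the object is slipping. The array layout, thresholds and slip heuristic are all invented for illustration – Bristol's actual algorithms are far more sophisticated.

```python
import numpy as np

def grasp_assessment(frame: np.ndarray, prev: np.ndarray) -> str:
    """Classify one tactile-array reading: no contact, slipping or stable."""
    if frame.sum() < 1.0:  # almost no pressure anywhere on the fingertip
        return "no_contact"
    if prev.sum() < 1.0:   # first contact this frame; nothing to compare yet
        return "stable"

    def centre_of_pressure(f: np.ndarray) -> np.ndarray:
        ys, xs = np.indices(f.shape)
        return np.array([(ys * f).sum(), (xs * f).sum()]) / f.sum()

    # A fast shift in where the pressure sits is a crude proxy for slip.
    drift = np.linalg.norm(centre_of_pressure(frame) - centre_of_pressure(prev))
    return "slipping" if drift > 0.5 else "stable"
```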

Although practical solutions to grasping objects will be available in just a few years, translating the more advanced capabilities of human hands into robots is some way off, Bicchi admits.

"A newborn kid has a reflex of grasping then gradually develops more sophisticated synergies. At the moment we are at the stage of what a one-year-old can do. In most industrial applications, the grasping ability of a one-year-old is enough. Robots don't need to be able to play the piano," he says.

Unlike the computer vision that underpins driverless cars and security systems that use fixed cameras to read information about the world, a sense of touch requires interaction with the world. "It's that aspect that makes it so hard," Lepora says.

"People see a robotics revolution happening but many things won't be achieved without giving robots a sense of touch. I think it’s achievable but at the moment we don’t know how to crack it." ®
