Future of robotics vision is a praying mantis (wearing tiny red glasses)

Accurately judging distance is one of the biggest challenges facing modern robotics.

Now, however, British scientists claim to have made a breakthrough thanks to the example of an unlikely candidate: a praying mantis wearing some fashionably questionable red glasses.

Currently, robot “vision” systems, such as those used by drones to pick up packages or navigate around objects, tend to mimic human stereo vision.

Each eye sees a marginally different view of the world and the brain merges the two views to create a single image, while using the differences between the images to calculate how far away things are.
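The geometry behind this is standard stereo triangulation: the further away an object is, the smaller the offset (disparity) between its position in the two views. A minimal sketch of that relationship, with illustrative camera numbers not taken from the study:

```python
# Sketch of the disparity-to-depth rule behind conventional stereo vision.
# The focal length, camera baseline, and disparity below are invented
# example values, not figures from the research.

def depth_from_disparity(focal_length_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Depth (metres) of a point from its pixel offset between two views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# An object shifted 40 px between two cameras 0.1 m apart, imaged with a
# 700 px focal length, sits 700 * 0.1 / 40 = 1.75 m away.
print(depth_from_disparity(700.0, 0.1, 40.0))  # → 1.75
```

The computational cost in real systems comes not from this formula but from finding the matching point in each image in the first place, which is where the mantis takes a shortcut.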

While it works well for humans, artificial systems based on the same concept require significant computing power, which both slows and weighs down robots.

Researchers at Newcastle University therefore looked for other examples of stereo vision in the animal kingdom and focused on the praying mantis, the only insect known to possess it.

The insect was fitted with miniature 3D glasses

Credit: Mike Urwin

To investigate the bug’s sight system, the team created bespoke 3D red glasses which they temporarily glued on with beeswax.

They showed the insect moving images of prey, as well as complex dot patterns which are also used to investigate human 3D vision.

The experiments found that, because mantises only attack moving prey, their neurological processes do not bother to compare the details of still pictures in each eye.

Instead, they judge distance by simply looking for places where the picture is changing, and they do it better than humans.
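A toy one-dimensional sketch of that idea, in the spirit of the change-based matching described above: each “eye” keeps only a binary map of where its view changed between frames, and disparity is found by aligning those sparse change maps rather than by comparing raw image detail. All the signals and parameters here are invented for illustration.

```python
# Change-based stereo matching sketch (hypothetical illustration):
# match maps of "where did the picture change?" instead of pixel detail.

def change_map(prev, curr, threshold=0.1):
    """Binary map of where the view changed between two frames."""
    return [1 if abs(c - p) > threshold else 0 for p, c in zip(prev, curr)]

def best_disparity(left_change, right_change, max_shift=5):
    """Shift of the left-eye change map that best overlaps the right's."""
    best_shift, best_score = 0, -1
    for shift in range(max_shift + 1):
        score = sum(l and r
                    for l, r in zip(left_change[shift:], right_change))
        if score > best_score:
            best_shift, best_score = shift, score
    return best_shift

# A moving target appears at position 6 in the left eye and position 4
# in the right eye, so the change maps align at a disparity of 2.
left_prev  = [0.0] * 10
left_curr  = [0.0] * 6 + [1.0] + [0.0] * 3
right_prev = [0.0] * 10
right_curr = [0.0] * 4 + [1.0] + [0.0] * 5
lc = change_map(left_prev, left_curr)
rc = change_map(right_prev, right_curr)
print(best_disparity(lc, rc))  # → 2
```

Because the change maps are mostly zeros, there is far less to compare than in full-detail stereo matching, which is the efficiency the researchers highlight.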

Published in the journal Current Biology, the study concluded that the praying mantis vision system is “very robust” and simpler than that used by humans, meaning it could provide a far better template for robots.

Dr Ghaith Tarawneh, who worked on the research, said: “Many robots use stereo vision to help them navigate, but this is usually based on complex human stereo.

“Since insect brains are so tiny, their form of stereo vision can’t require much computer processing.

“This means it could find useful applications in low-power autonomous robots.”

A significant proportion of current robotic research is dedicated towards drone technology, where simplicity and lightness of operating systems is at a premium.

In December 2016, Amazon successfully trialled its 30-minute, fully autonomous drone package delivery system in Cambridgeshire for the first time, a model expected to lead to a dramatic increase in the number of robots in the sky.

The Newcastle scientists suggested roboticists working on vision systems have so far been asking themselves the wrong question.

“This is a completely new form of 3D vision as it is based on change over time instead of static images,” said Dr Vivek Nityananda, a behavioural ecologist.

“In mantises it is probably designed to answer the question ‘is there prey at the right distance for me to catch?’”

While it is widely assumed robots will play an increasingly central role in everyday life, early attempts to replace humans performing more complex tasks have had mixed results.

Last month Britain’s first cyborg shop assistant was sacked within a week of beginning work because it was confusing customers.

Although “charming”, Fabio the ShopBot lost his position with Scottish supermarket Margiotta for giving unhelpful answers in response to customer enquiries.
