Learning to see through an insect’s eyes

TWAS Fellow and Medal Lecturer Mandyam Srinivasan bridges biology and robotics through research on the fascinating ways that bees and other animals perceive and navigate their environments

To a bee, the world looks very different. Its large compound eyes give it a panoramic view of its surroundings. But those eyes also sit much closer together than human eyes do, so bees have more difficulty triangulating the distance of objects the way humans do.

So, bees need other ways to judge how close objects are in order to perform precise flight manoeuvres as they go about their lives. And, as it turns out, these methods are also useful for flying robots, which must likewise react automatically to the world around them.

This is the subject of research by TWAS Fellow and 2021 Medal Lecturer Mandyam Srinivasan, Emeritus Professor of Visual and Sensory Neuroscience at the University of Queensland in St. Lucia, Australia. Based on his accomplishments in this field, he was awarded a TWAS Medal and invited to deliver a Medal Lecture on 3 November, during the Academy's Fifteenth General Conference. TWAS Medal Lectures were established in 1996, and each Conference typically features two or three Medal winners, leading scholars invited to present their work.

Srinivasan called the award a complete surprise. “I don’t even know who nominated me,” he said. “I’m humbled and delighted.”

In his presentation “Small Brains, Smart Minds: From Bees and Birds to UAVs”, Srinivasan discussed the visual tricks that small flying creatures have evolved to navigate their environments, and how these insights feed into the interdisciplinary field of biorobotics.

Honeybees are some of nature’s most impressive navigators. After a worker bee finds a source of food, she will return to her hive and perform a dance that conveys detailed information about the source of the food and where to find it. Then the other bees will know exactly where to go. “All this behaviour is being orchestrated by a brain that weighs less than a milligram, and carries far fewer neurons than our own brains,” said Srinivasan. “So, one of the missions of our lab is to find out what makes these wonderful creatures tick, and tick so well.”

The distance between a bee’s eyes is also very small, which means bees don’t have very good depth perception, at least in the sense that humans might understand it. Human beings can determine how far away an object is by triangulating with both eyes, but bees need to be in motion themselves to judge how far away something is. They can do this because nearby objects appear to move more quickly than distant ones, and their tiny brains are capable of turning this apparent speed of motion into an estimate of distance.
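
The underlying geometry is simple: if a bee, or a camera, travels at speed v and a stationary object off to its side sweeps across the eye at angular speed ω, the object’s distance is roughly v/ω. Below is a minimal sketch of that relationship in Python, with purely illustrative names and numbers rather than anything from Srinivasan’s own work:

```python
def distance_from_image_motion(own_speed_m_s: float,
                               angular_speed_rad_s: float) -> float:
    """Estimate the range of an object seen off to the side, from self-motion.

    For an object roughly abeam of a moving observer, the apparent angular
    speed of its image is about own_speed / distance, so
    distance ~ own_speed / angular_speed.
    """
    if angular_speed_rad_s <= 0:
        raise ValueError("the image must be moving to estimate distance")
    return own_speed_m_s / angular_speed_rad_s


# Flying at 2 m/s while a flower's image sweeps past at 0.5 rad/s puts the
# flower about 4 m away; a faster-sweeping image means a closer object.
print(distance_from_image_motion(2.0, 0.5))  # 4.0
```

A fast-sweeping image therefore signals a nearby object and a slow one a distant object, with no triangulation required.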

Srinivasan and his colleagues tested this idea after a serendipitous observation: bees sometimes flew into their lab through a hole in the wall, and they did so perfectly, right down the middle. The researchers then built an experimental tunnel for bees to fly through, with both walls carrying vertical stripes. When the walls were stationary, the bees flew precisely down the middle of the tunnel. But when one wall was moved against the bees’ flight direction, it created the illusion of being closer, and the bees veered away from it. When the wall was moved in the same direction as the flight, it appeared to be farther away, and the bees veered toward it. This demonstrated that bees navigate narrow passages safely by balancing the speeds of image motion they sense on either side.
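
That balancing rule translates naturally into a control law, and variants of it appear in bio-inspired robotics: steer away from whichever side shows the faster image motion. The sketch below uses invented function and parameter names to illustrate the idea, not any particular implementation:

```python
def centering_steer(left_flow: float, right_flow: float, gain: float = 0.5) -> float:
    """Lateral steering command from left/right optic-flow magnitudes.

    A positive result steers right, away from a left wall whose stripes
    appear to move faster (a wall that is closer); a negative result steers
    left. Equal flow on both sides gives zero command, so the flier holds
    the middle of the corridor.
    """
    return gain * (left_flow - right_flow)


# The left wall's image moves faster (it looks closer), so the command is
# positive and the flier veers right, rebalancing the flow on both sides.
print(centering_steer(left_flow=1.5, right_flow=0.5))  # 0.5
```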

Srinivasan and his colleagues refined these findings by placing food at the end of the striped tunnels and later observing what each worker bee communicated to her hive mates through the language of dance. The longer a bee waggles in her ‘dance’, for example, the farther away the food source is. And the bees gauge this distance by measuring the total amount of image motion they have experienced on the journey from the nest to the food source.
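
In effect, the bee carries a visual odometer: it integrates the image motion it experiences over the whole flight, and the duration of the waggle reports that running total. Here is a toy version of such an integrator, illustrative only and not a model from the research described here:

```python
def visual_odometer(flow_samples, dt: float) -> float:
    """Accumulate image motion over a flight, frame by frame.

    flow_samples: per-frame image speeds (e.g. average angular speed across
    the eye, in rad/s); dt: time between frames in seconds. The result is
    total image motion, the quantity the waggle duration appears to encode,
    rather than metric distance itself.
    """
    return sum(abs(f) * dt for f in flow_samples)


# A flight through a narrow, cluttered tunnel generates more image motion
# per metre than a flight over open ground, so it registers as a "longer"
# journey in the bee's report to its hive mates.
print(visual_odometer([1.0, 1.2, 0.9, 1.1], dt=0.1))
```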

“It turns out that the bee dance is a wonderful window into the bee’s mind, because it allows us to tap into the bee’s perception of how far it thinks it has travelled under various experimental conditions that we set up in the lab,” said Srinivasan. 

Srinivasan and his team have also studied the flying skills of common parakeets. They found, for example, that parakeets can tell when a gap they have to fly through is narrower than their wingspan, forcing them to tuck their wings in mid-flight to get through safely. The team also repeated the striped-corridor experiment with parakeets and showed that the birds, too, use image motion to judge the distance of objects while flying, evidence that this useful ability has evolved in a wide range of animals.

The research on bee and bird vision has novel applications for automation. An insect navigates entirely on its own, without the external aids that many modern robotic systems depend on, such as satellite-based global positioning or radar. So, potentially, a flying robot with similar visual capabilities could use a bee’s motion-to-distance calculations to navigate its environment without any external help.

During the TWAS General Conference, Srinivasan presented a video of such a robot successfully taking off, cruising for a while, turning around and landing again using only visual cues from the motion of objects in its environment, navigating much like a bee does.
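
One widely cited bee-inspired strategy for such visual landings, not necessarily the one used in this particular demonstration, is to hold the image motion of the ground constant during descent: as height drops, forward speed must shrink in proportion, so the craft decelerates automatically and touches down gently without any range sensor. A minimal sketch of that idea, with illustrative parameter values:

```python
def landing_forward_speed(height_m: float, target_flow_rad_s: float = 2.0) -> float:
    """Forward speed that keeps the ground's image motion at a set-point.

    Ground optic flow is roughly forward_speed / height, so holding the flow
    at target_flow forces speed to fall in proportion to height, giving a
    smooth, automatically decelerating approach.
    """
    return target_flow_rad_s * max(height_m, 0.0)


# Descending from 5 m to 0.5 m, the commanded speed falls from 10 m/s to 1 m/s.
for height in (5.0, 2.0, 0.5):
    print(height, landing_forward_speed(height))
```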

Applications exist for the developing world as well, he noted. In regions where satellite positioning is likely to be unavailable, robots need internal guidance systems that are cheap and lightweight and can navigate on their own. That way, even technologically remote parts of the world could some day have access to technology enabling important services such as crop inspection or medicine delivery.

Sean Treacy