It’s difficult to imagine an organism with a brain smaller than the period at the end of this sentence successfully maneuvering around obstacles while following fast-moving prey on the wing, especially for those of us who periodically trip over a curb or bump into a door frame.
Researchers from the University of Minnesota and Imperial College London published a new study in the Journal of Experimental Biology that demonstrates how a tiny fly can accomplish just that, providing vital insights for efforts to create robots, drones, and other devices.
Paloma Gonzalez-Bellido, Mary Sumner, and Trevor Wardill of the University of Minnesota’s College of Biological Sciences, along with Sam Fabian of Imperial College London’s Department of Bioengineering, investigated the aerial feats of a miniature robber fly known as a gnat ogre; adults average just 7 mm in length.
The gnat ogre, which is native to North and South America, is recognized for its extraordinary precision in pursuing and capturing other insects in flight. It’s impressive enough that this insect’s tiny brain can direct it to catch a moving object.
Even more impressive is its ability to avoid colliding with obstacles at the same time. The scientists wanted to know how the small fly mixes the two sets of brain-to-muscle commands.
“Predatory lifestyles put a premium on neural performance to move quickly and precisely, and this pressure is exacerbated in miniature animals because they have fewer neurons,” said Gonzalez-Bellido, who leads the Fly Systems Laboratory (FLYSY) at the University of Minnesota.
“Still, gnat ogres intercept their prey, similar to catching an over-the-shoulder pass in football, so we wanted to know how flexible their strategy is, and whether these flies could cope with additional challenges during the interception, such as obstacles in their path.”
They looked for an explanation by using plastic bait, fishing line, and high-speed video to record gnat ogres chasing a moving target. By comparing recordings of flies chasing the bait in the presence of obstacles against flight trajectories predicted by models of obstacle-avoiding flight and moving-target pursuit, the researchers discovered that gnat ogres continuously adjusted their path based on a mix of visual stimuli.
If the obstruction was substantial enough to obscure the prey for more than 70 milliseconds, the fly was likely to abandon the chase. If the line of sight was broken only briefly, the chase resumed once the fly cleared the obstruction.
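The behavior described above can be caricatured in a few lines of code. The sketch below is purely illustrative and is not the authors’ model: a simulated pursuer steers toward the target’s current position whenever it can see it (reacting, not predicting), holds its last heading while the target is hidden, and gives up once the line of sight has been broken for more than 70 ms. The timestep, speeds, and capture radius are invented for the demonstration.

```python
import math

DT_MS = 5         # simulation timestep in ms (hypothetical value)
GIVE_UP_MS = 70   # occlusion tolerance reported in the study
CATCH_DIST = 1.0  # arbitrary capture radius for this sketch

def line_of_sight_blocked(fly, prey, center, radius):
    """True if the segment from fly to prey passes within `radius` of `center`."""
    fx, fy = fly
    px, py = prey
    ox, oy = center
    dx, dy = px - fx, py - fy
    seg = dx * dx + dy * dy
    if seg == 0:
        return math.hypot(ox - fx, oy - fy) < radius
    # Closest point on the segment to the obstacle center, clamped to endpoints.
    t = max(0.0, min(1.0, ((ox - fx) * dx + (oy - fy) * dy) / seg))
    cx, cy = fx + t * dx, fy + t * dy
    return math.hypot(ox - cx, oy - cy) < radius

def reactive_chase(fly, prey, prey_vel, obstacle, radius,
                   fly_speed=0.1, max_ms=2000):
    """Purely reactive pursuit: aim at the prey's *current* position whenever
    it is visible, keep the last heading while it is hidden, and abandon the
    chase if the line of sight stays broken for more than GIVE_UP_MS."""
    d = math.hypot(prey[0] - fly[0], prey[1] - fly[1])
    heading = ((prey[0] - fly[0]) / d, (prey[1] - fly[1]) / d)
    occluded_ms = 0
    for step in range(max_ms // DT_MS):
        if line_of_sight_blocked(fly, prey, obstacle, radius):
            occluded_ms += DT_MS
            if occluded_ms > GIVE_UP_MS:
                return "abandoned", step * DT_MS
        else:
            occluded_ms = 0
            d = math.hypot(prey[0] - fly[0], prey[1] - fly[1])
            if d < CATCH_DIST:
                return "caught", step * DT_MS
            heading = ((prey[0] - fly[0]) / d, (prey[1] - fly[1]) / d)
        # Advance both animals one timestep.
        fly = (fly[0] + fly_speed * DT_MS * heading[0],
               fly[1] + fly_speed * DT_MS * heading[1])
        prey = (prey[0] + prey_vel[0] * DT_MS,
                prey[1] + prey_vel[1] * DT_MS)
    return "abandoned", max_ms
```

With an obstacle far off the pursuit line, the chase succeeds; with a large obstacle parked between fly and prey, the 70 ms occlusion budget runs out and the chase is abandoned, mirroring the threshold behavior reported in the study.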
“We discovered that simple visual feedback alone, reacting to things rather than predicting ahead, can be used to quickly solve complex navigation challenges,” says Fabian, who completed his Ph.D. in the FLYSY Lab.
“This work shows that even creatures with comparatively tiny brains are quite capable of performing extreme and precise behavior at speeds we can barely see, let alone appreciate.”
The fly’s ability to change direction so quickly is attributed to its small size, which allows signals to travel rapidly from the eye to the brain to the flight muscles. Future studies will examine how small animals gather information about their target before taking off and how they decide what to attack. The findings could also have implications for other disciplines pursuing nature-inspired innovation.
“Current robotics technology tends to use extra, expensive sensors to conduct tasks like obstacle avoidance (e.g. LIDAR or RADAR). However, animals, like our robber flies, manage to conduct multiple tasks simultaneously using information only from their visual system (i.e. tracking the motion of a distant target and processing the position and expansion of potential obstacles), and on a tiny energy budget,” says Fabian.
“Getting a clearer understanding of how they combine this sensory information to generate accurate and rapid behavioral responses to complex navigational challenges could help inspire future innovation in robotic sensing capabilities.”
This research was supported by the United States Air Force Office of Scientific Research, the Isaac Newton Trust, the Wellcome Trust, the University of Cambridge, the Biotechnology and Biological Sciences Research Council, and Imperial College London.