From insect-inspired autonomous vehicles to classification algorithms.
In a not so far away place, there is a complex buzzing society. These almost alien creatures communicate through dance, traveling from their central base of operations to search for food. Flying at high speed, these diligent workers dodge predators while homing in on food supplies. Without external maps or GPS, they navigate on their own using the smells and sounds of their environment. These stimuli activate specific pathways in the brain, allowing the forager to determine the nearest field for foraging. Often, they must decide which foraging area has more flowers. They land on their target, collect a payload of nectar, and fly back home.
The mighty honeybee conducts these complex calculations with fewer than 1 million brain cells; its brain is a measly 1 mm³.
Despite their small brains, insects have the remarkable capacity to communicate, behave within an organized social community, and quickly make decisions. With modern computing and machine learning algorithms, we want to solve similar problems. What took evolution billions of years to generate may provide us with computational shortcuts. After all, we’ve barely entered the modern computer age.
Insights from these brains allow us to design robots that navigate the world accurately through vision. Additionally, rather than building neural networks from scratch, researchers are turning to neuromorphic networks modeled on these circuits. We could save incredible amounts of resources and energy if we ever unlock the secrets of these simple circuits. I'll explore some of the modern use cases where these ancient brains provide value.
Insect-inspired visual processing
Insects like dragonflies, honeybees, and fruit flies are excellent at navigating their environment, and their visual processing capabilities are impressive. In the blink of an eye, they change direction to capture prey or avoid collisions. Since autonomous vehicles will need to perform similar tasks, the visual processing ability of insects is of particular interest: insects are extremely efficient at extracting information about their environment from low-resolution data.
Engineers build compact eyes and motion sensors inspired by insects. Artificial eyes mimicking fire ants and other insects have been used to create small drones. These devices have very few parts, but they still need to process the incoming data. Algorithms allow these tiny autonomous robots to detect speed, edges, and features in their environment, and even estimate travel times.
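One classic model of how insects detect motion from paired photoreceptors is the Hassenstein-Reichardt correlator: delay one receptor's signal and correlate it with its neighbor's, and the imbalance between the two mirrored correlations signals motion direction. A minimal sketch (the delay and stimulus are illustrative, not measured values):

```python
import numpy as np

def reichardt_emd(left, right, delay=3):
    """Hassenstein-Reichardt elementary motion detector (a sketch).

    left, right: 1-D intensity signals from two adjacent photoreceptors.
    delay: temporal delay in samples (a stand-in for the biological low-pass).
    A net positive output indicates motion from left to right.
    """
    # Delay each photoreceptor signal by shifting it in time.
    d_left = np.concatenate([np.zeros(delay), left[:-delay]])
    d_right = np.concatenate([np.zeros(delay), right[:-delay]])
    # Correlate each delayed signal with the opposite undelayed one
    # and subtract: the imbalance encodes motion direction.
    return d_left * right - d_right * left

# A bright bar drifting left-to-right passes the left receptor first.
t = np.arange(100)
left = np.exp(-0.5 * ((t - 40) / 5) ** 2)   # bar reaches the left eye at t=40
right = np.exp(-0.5 * ((t - 43) / 5) ** 2)  # ...and the right eye at t=43
response = reichardt_emd(left, right)
print(response.sum() > 0)  # net positive -> left-to-right motion detected
```

Pairs of such detectors tiled across an eye give a cheap, low-resolution motion field with no frame buffer at all.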
Many insects also detect direction with a built-in compass. At the edge of their eyes are cells capable of detecting polarized light from the sky. Based on these polarization patterns, different combinations of electrical signals allow insects to determine their current heading.
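The combination step can be sketched in code. Assume (hypothetically) three polarization analyzers at different orientations, each responding by Malus's law with intensity I = A + B·cos(2(φ − θ)) for sky e-vector angle φ; the three readings then pin down φ directly:

```python
import numpy as np

def evector_angle(intensities, analyzer_angles):
    """Recover the sky's polarization (e-vector) angle from a few
    polarization-sensitive photodiodes (a hypothetical sensor layout).

    Each analyzer at angle theta reports I = A + B*cos(2*(phi - theta)),
    so the readings trace a sinusoid in 2*theta whose phase is 2*phi.
    """
    two_theta = 2 * np.asarray(analyzer_angles)
    s1 = np.sum(intensities * np.cos(two_theta))
    s2 = np.sum(intensities * np.sin(two_theta))
    return 0.5 * np.arctan2(s2, s1)  # phi, in radians

# Simulate readings for a true e-vector angle of 30 degrees.
phi_true = np.deg2rad(30)
angles = np.deg2rad([0, 60, 120])             # three analyzer orientations
readings = 1.0 + 0.8 * np.cos(2 * (phi_true - angles))
phi_est = evector_angle(readings, angles)
print(round(np.rad2deg(phi_est), 1))  # recovers ~30.0 degrees
```

Because the e-vector pattern is tied to the sun's position, this angle serves as an absolute heading reference, no magnetometer or GPS required.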
A bio-robotics group in Marseille developed ultraviolet light sensor units mimicking these insect compasses. This is important for autonomous vehicles that will need to navigate cities based on heading. It may also lower the barrier to entry for new groups and companies seeking to develop such vehicles.
Insect-inspired visual guidance
Clumsy isn’t an adjective often attributed to insects. They glide through the air with grace. Insects like dragonflies even lock onto potential prey, detect their speed, and change direction to catch them. Incredibly, dragonflies can also fly upside down and rotate quickly in flight. Understanding how insects accomplish these feats will help develop efficient methods for autonomous vehicles to avoid collisions.
While the eye itself is a powerful sensor, fast retinal vibrations increase its accuracy. Flies and other insects can differentiate between a background and a small moving object or insect. A small flying robot with a vibrating retinal sensor can identify moving hands against a textured background. Rather than layering on more cells and biological machinery, this elaborate mechanism efficiently increases acuity. These mechanisms likely help predatory insects accurately locate a target.
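The principle behind the vibration trick, often called hyperacuity, is that scanning a coarse receptor across a contrast edge in fine steps localizes the edge far more precisely than the receptor spacing alone allows. A toy one-dimensional illustration (all positions, widths, and step sizes here are made up for the demo):

```python
import numpy as np

def scene(x, edge):
    """Step edge: dark (0) left of `edge`, bright (1) to the right."""
    return (x >= edge).astype(float)

def receptor_output(center, edge, sigma=1.0):
    # One photoreceptor with a Gaussian acceptance profile: its output
    # is the scene blurred by that profile, evaluated at `center`.
    x = np.linspace(center - 4, center + 4, 801)
    w = np.exp(-0.5 * ((x - center) / sigma) ** 2)
    return np.sum(w * scene(x, edge)) / np.sum(w)

# Receptors spaced 1 unit apart would localize the edge only to ~0.5 units;
# vibrating the sensor in fine steps and finding where the blurred output
# crosses 0.5 (half brightness) recovers the edge far more precisely.
offsets = np.linspace(-0.5, 0.5, 51)   # micro-scan positions of one receptor
outputs = np.array([receptor_output(2.0 + d, edge=2.37) for d in offsets])
best = 2.0 + offsets[np.argmin(np.abs(outputs - 0.5))]
print(round(best, 2))  # lands close to the true edge at 2.37
```

The resolution gain comes from temporal sampling rather than extra hardware, which is exactly the appeal for gram-scale robots.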
Additionally, insects exploit the apparent motion of the scene across their retina as they fly, a signal called optic flow, to simultaneously gauge distance and speed. Researchers are hard at work trying to determine exactly how insects extract this information from vision alone. Nonetheless, algorithms inspired by this idea show its feasibility for future autonomous vehicles.
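A bare-bones version of the idea: compare two successive one-dimensional "retina" snapshots and estimate how far the scene slid between them via cross-correlation. This is a crude stand-in for whatever insects actually compute, included only to show how little data the estimate needs:

```python
import numpy as np

def image_shift(prev, curr):
    """Estimate how many pixels the scene moved between two 1-D
    snapshots by finding the peak of their cross-correlation."""
    a = prev - prev.mean()
    b = curr - curr.mean()
    corr = np.correlate(a, b, mode="full")
    return np.argmax(corr) - (len(prev) - 1)

# Ground texture sliding past the eye: the second frame shows the same
# scene four pixels further along.
rng = np.random.default_rng(0)
texture = rng.random(200)
frame_1 = texture[50:150]
frame_2 = texture[54:154]
shift = image_shift(frame_1, frame_2)
print(shift)  # recovers the 4-pixel shift per frame
```

Dividing such a pixel shift by altitude gives an angular flow rate; holding that rate constant during descent is one proposed explanation for how bees manage smooth, automatic landings.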
When we can no longer manufacture smaller components for computers, we will need to improve computing architecture instead. Neuromorphic networks take their inspiration from the small size of insect brains: newer nanoscale devices simulate the nervous system by emulating features of brain cells. While insects will never learn to fill out taxes or solve complex math equations, they integrate enormous amounts of environmental data efficiently.
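The basic unit many neuromorphic chips emulate is the leaky integrate-and-fire neuron: membrane voltage leaks toward rest, integrates input, and emits a discrete spike on crossing threshold. A minimal sketch with illustrative (not hardware-accurate) parameters:

```python
import numpy as np

def lif_spikes(current, dt=1e-3, tau=0.02, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron (a sketch; parameter values are
    illustrative). Returns the time steps at which spikes occur."""
    v, spikes = 0.0, []
    for i, I in enumerate(current):
        v += dt * (-v + I) / tau      # leak toward rest + integrate input
        if v >= v_thresh:             # threshold crossing: fire and reset
            spikes.append(i)
            v = v_reset
    return spikes

# A constant drive above threshold yields a regular spike train. Crucially,
# only these sparse events -- not continuous values -- need to be routed,
# which is where the energy savings of neuromorphic hardware come from.
spike_times = lif_spikes(np.full(500, 1.5))
print(len(spike_times))  # a steady train of spikes over 500 steps
```

Information is carried in spike timing and rate, so the circuit is silent (and consumes little power) whenever its input is uninteresting.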
Insects also sense odorants and other chemicals through their sense of smell. This sensitivity is combined with specialized combinations of excitatory and inhibitory neural pathways that encode this abstract data. Insects also compress the data so it can be stored within their small brains. This information helps them classify different types of odors and respond accordingly. A neuromorphic version of this process would be invaluable as a classifier that ingests a lot of data without using a lot of power.
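One well-studied computational reading of this circuit is a sparse random expansion followed by winner-take-all inhibition: receptor activations are projected onto many cells through sparse random wiring, and global inhibition silences all but the strongest few, yielding a compact binary "tag" in which similar odors overlap. A sketch, with all sizes and wiring probabilities chosen purely for illustration:

```python
import numpy as np

def sparse_odor_tag(odor, projection, top_k=5):
    """Fly-inspired odor code (a sketch): random expansion, then
    winner-take-all inhibition keeps only the top_k strongest cells."""
    activity = projection @ odor             # expand to many 'cells'
    tag = np.zeros(len(activity))
    tag[np.argsort(activity)[-top_k:]] = 1   # global inhibition: top-k fire
    return tag

rng = np.random.default_rng(1)
proj = (rng.random((200, 20)) < 0.1).astype(float)  # sparse random wiring

odor_a = rng.uniform(-1, 1, 20)                  # receptor activations
odor_a_noisy = odor_a + 1e-6 * rng.random(20)    # nearly identical smell
odor_b = rng.uniform(-1, 1, 20)                  # unrelated smell

tag_a = sparse_odor_tag(odor_a, proj)
sim = tag_a @ sparse_odor_tag(odor_a_noisy, proj)
dif = tag_a @ sparse_odor_tag(odor_b, proj)
print(sim, dif)  # similar odors share more of the tag than unrelated ones
```

Because each tag is mostly zeros, it is cheap to store and compare, which is the property that makes the scheme attractive as a low-power classifier front end.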
Crickets direct their movements toward sounds in the environment. Remarkably, this behavior needs only four neurons to steer a robot: two sensor neurons placed far apart detect the noise, while two downstream motor neurons drive movement toward the sound. The difference in when a sound arrives at the two sensors indicates the direction of its source. More complex robots also filter the phase of the sound to navigate uneven terrains or environments.
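The steering rule can be sketched in a few lines. This is a toy model of the four-neuron scheme described above, not the measured cricket circuit: two "sensor neurons" report when sound at each ear first crosses threshold, and two "motor neurons" turn toward whichever ear heard it first.

```python
def steer_towards_sound(left_signal, right_signal, threshold=0.5):
    """Toy four-'neuron' phonotaxis: turn toward the ear that hears
    the sound first (signals are lists of sound amplitudes over time)."""
    def onset(signal):
        # Sensor neuron: time step of the first sample above threshold.
        for i, s in enumerate(signal):
            if s > threshold:
                return i
        return None  # this ear never heard the sound

    left_on, right_on = onset(left_signal), onset(right_signal)
    if left_on is None and right_on is None:
        return "straight"
    if right_on is None or (left_on is not None and left_on < right_on):
        return "turn left"   # motor neuron on the earlier (left) side wins
    return "turn right"

# A chirp from the left reaches the left ear two time steps sooner.
left_ear = [0.0, 0.0, 0.9, 0.9, 0.9]
right_ear = [0.0, 0.0, 0.0, 0.0, 0.9]
print(steer_towards_sound(left_ear, right_ear))  # turn left
```

Repeating this decision every few chirps produces a zig-zag path that converges on the source, with no map, memory, or explicit distance estimate.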
We are in the early stages of understanding the applications of insect neurobiology. Nonetheless, it's clear that their visual prowess is invaluable for developing autonomous vehicles that safely navigate our world. Vehicles may integrate visual sensors inspired by insect eyes to determine heading and speed, to park in a target spot, and even to avoid collisions. In parallel, researchers are slowly unraveling how insects integrate sound and smell.
By mimicking their brains, researchers generate sensors and classifiers that integrate a lot of data without expending a lot of energy. I am excited to see how these technologies develop in the coming decades.