Bats and dolphins emit sound waves to sense their surroundings; electric fish generate weak electric fields to help them detect motion while burrowed in their refuges; and humans use tiny movements of the eyes to perceive objects in their field of vision.
Each is an example of “active sensing” — a process found across the animal kingdom, which involves the production of motion, sound or other signals to gather sensory feedback about the external environment. Until now, however, researchers have struggled to understand how the brain controls active sensing, partly because active sensing behavior is so tightly linked to the sensory feedback it creates.
In a new study, NJIT and Johns Hopkins researchers have used augmented reality technology to alter this link and unravel the mysterious dynamic between active sensing movement and sensory feedback. The team reports that subtle active sensing movements of a weakly electric fish — the glass knifefish (Eigenmannia virescens) — are under sensory feedback control and serve to enhance the sensory information the fish receives. The study proposes that the fish use a dual-control system for processing feedback from active sensing movements, a feature that may be ubiquitous in animals.
Researchers say the findings, published in the journal Current Biology, could have implications in the field of neuroscience as well as in the engineering of new artificial systems — from self-driving cars to cooperative robotics.
“What is most exciting is that this study has allowed us to explore feedback in ways that we have been dreaming about for over 10 years,” said Eric Fortune, associate professor of biology, who led the study at NJIT. “This is perhaps the first study where augmented reality has been used to probe, in real time, this fundamental process of movement-based active sensing, which nearly all animals use to perceive the environment around them.”
Eigenmannia virescens is a species of electric fish found in the Amazon river basin that is known to hide in refuges to avoid the threat of predators in their environment. As part of their defenses, Fortune says that the species and its relatives can display a magnet-like ability to maintain a fixed position within their refuge, known as station-keeping. Fortune’s team sought to learn how the fish control this sensing behavior by disrupting the way the fish perceives its movement relative to its refuge.
Caption: A species closely related to the glass knifefish, the brown ghost knifefish (Apteronotus leptorhynchus), displays its station-keeping ability. Credit: NJIT/Johns Hopkins.
“We’ve known for a long time that these fish will follow the position of their refuge, but more recently we discovered that they generate small movements that reminded us of the tiny movements that are seen in human eyes,” said Fortune. “That led us to devise our augmented reality system and see if we could experimentally perturb the relationship between the sensory and motor systems of these fish without completely unlinking them. Until now, this was very hard to do.”
To investigate, the researchers placed weakly electric fish inside an experimental tank with an artificial refuge enclosure, capable of automatically shuttling back and forth based on real-time video tracking of the fish’s movement. The team studied how the fish’s behavior and movement in the refuge would be altered in two categories of experiments: “closed-loop” experiments, in which the fish’s movement is synced to the shuttle motion of the refuge; and “open-loop” experiments, in which motion of the refuge is “replayed” to the fish as if from a tape recorder. Notably, the researchers observed that the fish swam the farthest to gain sensory information during closed-loop experiments when the augmented reality system’s positive “feedback gain” was turned up — that is, when the refuge position was made to mirror the movement of the fish.
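The distinction between the two conditions can be sketched in a few lines of code. The following is a hypothetical simplification, not the authors’ actual rig: the function names, the unit feedback gain, and the discrete position updates are all illustrative assumptions. In the closed-loop case the refuge position at each time step is computed from live tracking of the fish (a positive gain makes the refuge mirror the fish’s movement); in the open-loop case a prerecorded refuge trajectory is simply replayed.

```python
# Hypothetical sketch of the closed- vs. open-loop refuge protocol.
# Names and dynamics are illustrative, not the actual experimental setup.

def run_closed_loop(fish_positions, feedback_gain):
    """Refuge position is updated from live tracking of the fish.

    With a positive feedback_gain, the refuge mirrors the fish's own
    movement, amplifying the consequences of its active sensing motion.
    """
    return [feedback_gain * x for x in fish_positions]

def run_open_loop(recorded_refuge_positions):
    """Refuge motion is 'replayed' from a recording, unlinked from
    the fish's current behavior."""
    return list(recorded_refuge_positions)

# The same stimulus trajectory can arise either way; only in the
# closed-loop case is it coupled to the fish's movement.
fish_track = [0.0, 0.5, 1.0, 0.5]
closed = run_closed_loop(fish_track, feedback_gain=1.0)
replayed = run_open_loop(closed)  # identical stimulus, no coupling
```

As the study emphasizes, the stimulus the fish experiences can be identical in both conditions; what differs is whether the fish’s own behavior is driving it.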
Caption: Fortune’s lab uses real-time video tracking of Eigenmannia virescens in an artificial refuge environment to learn how the fish control sensing behavior used for station-keeping. Credit: NJIT/Johns Hopkins.
“From the perspective of the fish, the stimulus in closed- and open-loop experiments is exactly the same, but from the perspective of control, in one test the stimulus is linked to the behavior and in the other it is unlinked,” said Noah Cowan, professor at Johns Hopkins University and co-author of the study. “It is similar to the way visual information of a room might change as a person is walking through it, as opposed to the person watching a video of walking through a room.”
“It turns out the fish behave differently when the stimulus is controlled by the individual versus when the stimulus is played back to them,” added Fortune. “This experiment demonstrates that the phenomenon that we are observing is due to feedback the fish receives from its own movement. Essentially, the animal seems to know that it is controlling the sensory world around it.”
According to Fortune, the study’s results indicate that the fish use two control loops — one that manages the flow of information from active sensing movements, and another that uses that information to inform motor function — an arrangement that could be a common feature in how other animals perceive their surroundings.
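The two proposed loops can be illustrated with a toy sketch. This is an assumption-laden caricature, not the model from the paper: the controller functions, the inverse scaling of movement with sensory salience, and the proportional station-keeping gain are all hypothetical choices made for illustration.

```python
# Hypothetical two-loop control sketch (illustrative only): one loop
# regulates active sensing movements to keep sensory feedback
# informative; the other uses that feedback for station-keeping.

def active_sensing_controller(sensory_salience, base_amplitude=1.0):
    """Loop 1: when sensory feedback is weak, move more to generate
    additional feedback; when it is strong, small movements suffice."""
    return base_amplitude / max(sensory_salience, 0.1)

def station_keeping_controller(fish_pos, refuge_pos, gain=0.5):
    """Loop 2: a simple proportional controller that steers the fish
    back toward the refuge position."""
    return gain * (refuge_pos - fish_pos)

# Weak feedback -> larger exploratory movements than strong feedback,
# consistent with the fish swimming farther when information is scarce.
weak, strong = active_sensing_controller(0.2), active_sensing_controller(2.0)
```

The inverse relationship in loop 1 captures the study’s observation that the fish swam farther to gain sensory information when feedback conditions changed, while loop 2 stands in for the familiar station-keeping behavior.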
Fortune says his team is now seeking to investigate the neurons responsible for each control loop in the fish. He also says that the study and its findings may be applied to research exploring active sensing behavior in humans, or by engineers in developing advanced robotics.
“Our hope is that researchers will conduct similar experiments to learn more about vision in humans, which could give us valuable knowledge about our own neurobiology,” said Fortune. “At the same time, because animals continue to be so much better at vision and control of movement than any artificial system that has been devised, we think that engineers could take the data we’ve published and translate that into more powerful feedback control systems.”
Other authors include graduate student Debojyoti Biswas, undergraduate researcher Luke A. Arend, postdoctoral fellow Sarah A. Stamper, and associate research engineer Balázs P. Vágvölgyi, all from Johns Hopkins University.
This work was supported by James McDonnell Foundation Complex Systems Scholar Award grant 112836; collaborative National Science Foundation awards 1557895 and 1557858; and National Science Foundation Research Experiences for Undergraduates grant 1460674.