Moving objects with your mind has always been an element of science fiction, popularized as an iconic feature of the Force in the pop culture-defining series “Star Wars.” But recent advancements in brain-machine interfaces, or BMIs, are working to bring that fantasy to fruition. With the power of deep learning algorithms, even lab rats are becoming tiny Jedis, able to move virtual objects with only their brains. Their telepathic abilities offer us a promising glimpse into the future of computerized prosthetics and potentially restoring lost motor ability to individuals in need.
“With the power of deep learning algorithms, even lab rats are becoming tiny Jedis, able to move virtual objects with only their brains.”
Dr. Albert Lee and his team of researchers at the Howard Hughes Medical Institute were interested in hippocampal activity and its role in imagining and recollecting movements. They connected lab rats to BMIs that monitored the firing activity of neurons in the hippocampus. The hippocampus, located deep in the temporal lobe behind the ears, is an important center for memory and learning. It is largely responsible for building our cognitive maps: mental models of the environments we have experienced. These maps are constructed by “place cells,” neurons that fire selectively when an animal occupies a particular location in its environment. The firing of these place cells can be recorded as electrical signals using implanted electrodes and an external computer.
Three rats were harnessed into an immersive virtual reality (VR) environment made up of a surrounding screen with a VR projection and a spherical treadmill that allowed movement in any direction. As the rats moved around on the treadmill, their virtual location was updated accordingly. Hippocampal neural activity was measured as the rats acclimated to their environment, and a decoder learned to associate specific neural firing patterns with the rats’ locations in virtual space. Then the treadmill was deactivated so that the rats’ physical movement no longer affected their virtual movement. As the rodents adjusted, they recalled past locations in their virtual environment, and their BMIs interpreted the brain signals from these recollections and projected an updated virtual location onto the surrounding screens.
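The team’s actual pipeline is far more sophisticated, but the core decoding idea can be sketched with synthetic data: simulate place cells with Gaussian tuning curves, fit a decoder that maps population firing rates to virtual position during the “treadmill on” phase, then predict position from firing alone, as in the “treadmill off” phase. Everything below (cell counts, tuning widths, the ridge-regression decoder) is an illustrative assumption, not the study’s method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for hippocampal recordings: each "place cell" fires
# most strongly near a preferred location in a 2D virtual arena.
n_cells, n_train = 50, 2000
centers = rng.uniform(0, 1, size=(n_cells, 2))       # each cell's preferred place
sigma = 0.1                                          # tuning-curve width

def firing_rates(positions):
    """Gaussian place-field tuning plus recording noise."""
    d2 = ((positions[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2 * sigma**2)) + 0.1 * rng.standard_normal((len(positions), n_cells))

# "Treadmill on": the rat's true location is known, so we can fit a
# ridge-regression decoder from population firing rates to position.
train_pos = rng.uniform(0, 1, size=(n_train, 2))
X = np.hstack([firing_rates(train_pos), np.ones((n_train, 1))])   # add bias column
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ train_pos)

# "Treadmill off": decode virtual location from neural activity alone.
test_pos = rng.uniform(0, 1, size=(200, 2))
Xt = np.hstack([firing_rates(test_pos), np.ones((200, 1))])
decoded = Xt @ W

err = np.mean(np.linalg.norm(decoded - test_pos, axis=1))
print(f"mean decoding error: {err:.3f} arena units")
```

Real BMI decoders are typically probabilistic and run in real time; a linear sketch simply makes concrete how a pattern of place-cell firing can be read out as a location on the screen.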
Once Dr. Lee and his team confirmed the rats’ abilities to recall their past locations, they tested the rats’ ability to recall object locations in what the researchers called the “Jedi task.” The rats could again navigate freely using the treadmill and were rewarded for moving an object to a desired goal on the virtual screen. Their neural activity was recorded while they moved the objects. Afterward, the treadmill was once again disabled. Across multiple trials, the rats successfully envisioned the goal location of the virtual object, and the BMI decoder converted their brain signals into virtual object movement on the surrounding screen.
The rats were able to envision specific locations in their cognitive maps by selectively activating hippocampal neurons. Because rats are model animals, these results give insight into how humans are able to vividly recall physical environments and movements. As BMI technology advances to better interpret brain signals, there is potential for finer mental control of motor aids and prosthetic limbs.
“Because rats are model animals, these results give insight into how humans are able to vividly recall physical environments and movements.”
Motor disabilities due to limb loss or paralysis greatly impact quality of life, leaving individuals struggling to carry out daily tasks. Currently, prosthetics and mobility aids rely on signals from mechanical sensors or electrical signals from muscles. Advancements in BMIs could allow neuronal motor signals to be converted directly into commands for computerized motor aids such as prosthetic limbs, electric wheelchairs, exoskeletons, or virtual keyboards. With finer motor control of these aids, there is potential for increased quality of life for amputees and individuals with other motor disabilities.
“‘Jedi rats’ sounds like an epic sequel concept for a beloved franchise, but these rodents hold serious significance in an increasingly computerized world.”
“Jedi rats” sounds like an epic sequel concept for a beloved franchise, but these rodents hold serious significance in an increasingly computerized world. As deep learning and AI become more intertwined with our daily technology, they’re on track to play a major role in medicine. Combining the traditional practice of animal models with novel advancements in algorithmic modeling has the potential to bring about a new hope of restoring motor function to individuals with disabilities across the galaxy.