
Technology Based on Rodent Neurons May Point the Way to the Tactile Internet


Imagine you are playing an immersive game in which you are dropped into an unknown landscape with a directive to find a certain location. To advance in the game, you must also map the terrain so that you can share your initial location and your map with another remote player. You have now been given a problem that, within the world of robotics, is called SLAM: you have been asked to simultaneously localize yourself within, and map, an unknown environment.

Various algorithms are used to solve SLAM for robotics, autonomous control, and VR/AR navigation, each tailored to a different balance of accuracy and computational resources. Bio-inspired solutions to SLAM were proposed after research on rodents revealed hippocampal place fields, patterns of neural activity that correspond to locations in space. Place fields are activated both by visual stimuli and by the rodent's own locomotion as it moves about. "This corresponds with the problem of correlating odometric and range or vision sensors in a mobile robot; the problem at the heart of SLAM." [1] Further experiments using this model allowed robotic agents to create representative maps without external input from the user.
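To make that odometry-plus-vision correlation concrete, here is a minimal illustrative sketch in Python. It is not taken from the RatSLAM or NeuroSLAM work; the cell count, parameters, and helper functions are hypothetical. A ring of place-cell-like units holds a belief about position: locomotion (odometry) shifts the activity bump, and a recognized view injects activity at the cell associated with that view.

```python
import numpy as np

# Illustrative sketch only (not the NeuroSLAM or RatSLAM implementation):
# a 1-D ring of "place cell" activities is shifted by odometry and
# corrected by visual cues. Cell count and parameters are arbitrary.

N_CELLS = 100                     # place cells tiling a circular track
activity = np.zeros(N_CELLS)
activity[0] = 1.0                 # start with all belief at cell 0

def path_integrate(activity, move):
    """Shift the activity bump by 'move' cells (odometry / locomotion)."""
    return np.roll(activity, move)

def visual_correction(activity, seen_cell, strength=0.5):
    """Inject activity at the cell associated with a recognized view."""
    activity = activity.copy()
    activity[seen_cell] += strength
    return activity / activity.sum()   # renormalize the belief

# The robot moves 3 cells forward, then recognizes a view previously
# associated with cell 2; vision pulls the position estimate back.
activity = path_integrate(activity, 3)
activity = visual_correction(activity, seen_cell=2)
print("estimated position:", int(np.argmax(activity)))
```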

With the ever-increasing demand for autonomous robots that can operate in highly visually ambiguous environments with the fewest resources possible, a team at Georgia Tech has developed the NeuroSLAM accelerator IC to support ultra-low-power visual SLAM applications in edge robotics. In the proposed IC, the SLAM algorithms are not implemented digitally; rather, the constant interactions between the neuronal cells that are activated by visual stimuli and responsible for solving SLAM are emulated using the natural dynamics of a system of coupled mixed-signal oscillators. The NeuroSLAM IC is the first bio-inspired SLAM IC to employ the principles of rodent spatial cognition in a hardware-efficient design. Future applications of such robotics include search and rescue in remote or harsh environments, exploration of unknown environments for defense and security, enhanced autonomous vehicles, and enhanced VR and human-robot interaction.
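As a toy analogue of computing with oscillator dynamics rather than digital logic, the sketch below simulates a Kuramoto-style network of coupled phase oscillators in Python. This is an assumed illustration of the general principle that a coupled-oscillator system settles into a stable collective state; it is not the mixed-signal circuit described in the research, and the network size, coupling strength, and time step are arbitrary.

```python
import numpy as np

# Toy illustration (not the NeuroSLAM circuit): a Kuramoto-style network of
# coupled phase oscillators. The idea mirrored here is that the answer
# emerges from the oscillators' natural dynamics rather than from a digital
# algorithm: with enough coupling, the phases settle into a coherent pattern.

rng = np.random.default_rng(0)
n = 16                               # number of oscillators (arbitrary)
phases = rng.uniform(0, 2 * np.pi, n)
freqs = rng.normal(1.0, 0.05, n)     # slightly mismatched natural frequencies
K, dt = 2.0, 0.01                    # coupling strength and time step

for _ in range(5000):
    # each oscillator is pulled toward the phases of the others
    coupling = np.sin(phases[None, :] - phases[:, None]).mean(axis=1)
    phases += dt * (freqs + K * coupling)

# An order parameter near 1.0 indicates the network has synchronized.
order = np.abs(np.exp(1j * phases).mean())
print(f"synchronization order parameter: {order:.3f}")
```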

Now imagine a future in which you are playing an immersive VR game, controlling a tactilely sensitive robot to explore an unknown landscape. To advance to that point in the game, we may need to look to rat neurons and NeuroSLAM architectures.
 
Read the Research on NeuroSLAM Here
 
[1] "RatSLAM: a hippocampal model for simultaneous localization and mapping," https://ieeexplore.ieee.org/document/1307183

- Christa M. Ernst
