
Ph.D. Proposal Oral Exam - Ashwin Lele


Title:  Event-based Neuromorphic Systems for Energy-efficient Edge-Robotic Applications

Committee: 

Dr. Raychowdhury, Advisor    

Dr. Datta, Chair

Dr. Romberg

Abstract: The objective of the proposed research is to develop event-based and spiking neural network (SNN) enabled systems for energy-efficient robotic applications. The SNN-assisted algorithms and circuits exploit asynchronous, low-power operation to mitigate the limited power and compute available for processing at the edge. We first explore the rhythmic leg movements of insect locomotion and map them onto an SNN to demonstrate a hexapod that learns its gait autonomously. The robot learns the gait pattern from vision and vestibular sensory data, much as an animal learns to walk without supervision. Next, the hexapod is connected to an event-based vision sensor as the sensory front-end, and an SNN filter is designed to demonstrate the first spike-only closed-loop robotic platform. The second part of the work explores the limitations of SNNs in reaching high accuracy. The literature shows that SNNs with event-based cameras capture the fine temporal resolution required in high-speed applications, whereas convolutional neural networks (CNNs) with frame-based cameras capture the spatial detail required for high accuracy. We propose to fuse the complementary speed of SNNs and accuracy of CNNs to mitigate the fundamental accuracy vs. latency trade-off of frame-based systems. We show a target-tracking application intended for drones, validated through simulations in virtual environments, real-world demonstrations, and a hybrid RRAM + SRAM test chip. We also apply hybrid vision to high-speed optical flow computation, achieving a 4x speedup with only minor accuracy degradation. For ongoing and future research, we propose to report measurement results from the current heterogeneous test chip. We plan to build a size-constrained, centimeter-scale, vision-enabled microrobot with onboard visual intelligence. Finally, we plan to build an SoC optimized with a CNN front-end and SNN back-end for neuromorphic 3D simultaneous localization and mapping (SLAM). In summary, this work aims to substitute and augment compute-constrained platforms with SNNs for energy savings and performance improvement.
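To illustrate the hybrid idea in the abstract, the sketch below shows one way a slow but accurate frame-based (CNN-style) detection could be fused with fast, event-driven updates between frames. This is a minimal, hypothetical sketch and not the author's implementation; all names (HybridTracker, FRAME_PERIOD, the 1-D state) and the noise figures are illustrative assumptions.

```python
# Hypothetical sketch of hybrid frame + event tracking:
# a slow, accurate detection re-anchors the estimate at frame times,
# while a fast, coarser event-driven cue propagates it in between.
import random

FRAME_PERIOD = 33   # assumed: frame-based detection every 33 steps (~30 fps)
EVENT_PERIOD = 1    # assumed: event-driven update every step (~kHz rate)

class HybridTracker:
    def __init__(self):
        self.position = 0.0  # 1-D target position estimate (toy state)

    def frame_detection(self, true_pos):
        # Accurate but infrequent: small noise, available only at frame times.
        return true_pos + random.gauss(0.0, 0.5)

    def event_update(self, true_vel):
        # Fast but coarse: noisy velocity cue standing in for the
        # spike/event-based displacement estimate between frames.
        return true_vel + random.gauss(0.0, 0.2)

    def step(self, t, true_pos, true_vel):
        if t % FRAME_PERIOD == 0:
            # Re-anchor the estimate when a frame-based detection arrives.
            self.position = self.frame_detection(true_pos)
        else:
            # Propagate the estimate at event rate between frames.
            self.position += self.event_update(true_vel) * EVENT_PERIOD
        return self.position

if __name__ == "__main__":
    tracker, true_pos, true_vel = HybridTracker(), 0.0, 0.1
    for t in range(100):
        true_pos += true_vel
        est = tracker.step(t, true_pos, true_vel)
    print(f"final estimate {est:.2f} vs. true {true_pos:.2f}")
```

The design point the sketch is meant to convey is that the event path keeps latency low between frames, while the frame path bounds drift, which is the accuracy vs. latency fusion the abstract describes.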

