{"689109":{"#nid":"689109","#data":{"type":"event","title":"Ph.D. Dissertation Defense - Nealson Li","body":[{"value":"\u003Cp\u003E\u003Cstrong\u003ETitle\u003C\/strong\u003E\u003Cem\u003E:\u0026nbsp; Eye Tracking and Gaze Estimation with Event Camera for Extended Reality\u003C\/em\u003E\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003ECommittee:\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003EDr.\u0026nbsp;Arijit Raychowdhury, ECE, Chair, Advisor\u003C\/p\u003E\u003Cp\u003EDr.\u0026nbsp;Visvesh Sathe, ECE\u003C\/p\u003E\u003Cp\u003EDr.\u0026nbsp;Tushar Krishna, ECE\u003C\/p\u003E\u003Cp\u003EDr.\u0026nbsp;Souvik Kundu, Intel\u003C\/p\u003E\u003Cp\u003EDr.\u0026nbsp;Celine Lin, CoC\u003C\/p\u003E","summary":"","format":"limited_html"}],"field_subtitle":"","field_summary":[{"value":"\u003Cp\u003EExtended Reality (XR) headsets are the next-generation mobile platform, with applications that range from education to healthcare. To build a lightweight system that provides a real-time immersive experience all day, improvements in performance and power efficiency are required. One of the crucial functional blocks in an XR system is eye tracking, which enables gaze estimation for foveated rendering and other applications. An emerging alternative to the RGB camera is the event camera, whose movement-triggered operation, high temporal resolution, and sparse data format better suit the eye-tracking task. However, exploiting these advantages while maintaining accuracy is a challenge. We analyzed eye features in event-based data and developed methodologies for real-time gaze estimation that operate exclusively on such data. A system that utilizes recurrent neural networks (RNNs) and an angular loss function is introduced, achieving a precision of 0.46\u00b0 and a frequency of 950 Hz. Additionally, an event-to-frame conversion technique and a U-Net Convolutional Neural Network (CNN) are proposed for pupil classification. 
A region of interest (RoI) mechanism is integrated to robustly track the pupil\u2019s location, optimizing computational efficiency. The resulting eye-tracking pipeline locates the pupil with an error of 3.68 pixels while operating at a low system power of 160 mW. Building on these algorithmic insights, we introduce Event-Gaze, an eye-tracking and gaze-estimation SoC that natively processes asynchronous, sparse event-camera data. Using an algorithm-hardware co-design approach, the system performs graph construction from the event stream and leverages dedicated Graph Neural Network (GNN) and Transformer accelerators to enable eye tracking and gaze estimation, respectively. Fabricated in 28 nm CMOS, the SoC achieves 1.12\u00b0 accuracy with 441 \u03bcs latency while consuming 12.88 mW. Compared to baseline solutions, Event-Gaze demonstrates a 1.8\u00d7 latency reduction and a 15% improvement in system power efficiency, facilitating low-latency, low-power event-camera-based gaze prediction for next-generation XR applications.\u003C\/p\u003E","format":"limited_html"}],"field_summary_sentence":[{"value":"Eye Tracking and Gaze Estimation with Event Camera for Extended Reality "}],"uid":"28475","created_gmt":"2026-03-20 20:18:31","changed_gmt":"2026-03-20 20:19:49","author":"Daniela Staiculescu","boilerplate_text":"","field_publication":"","field_article_url":"","field_event_time":{"event_time_start":"2026-04-01T13:00:00-04:00","event_time_end":"2026-04-01T15:00:00-04:00","event_time_end_last":"2026-04-01T15:00:00-04:00","gmt_time_start":"2026-04-01 17:00:00","gmt_time_end":"2026-04-01 19:00:00","gmt_time_end_last":"2026-04-01 
19:00:00","rrule":null,"timezone":"America\/New_York"},"location":"Online","extras":[],"related_links":[{"url":"https:\/\/teams.microsoft.com\/l\/meetup-join\/19%3ameeting_OWE0ZGRhNDItZjQ5OC00MzlkLWJjYmEtNzQ1NjcxZDlhY2E5%40thread.v2\/0?context=%7b%22Tid%22%3a%22482198bb-ae7b-4b25-8b7a-6d7f32faa083%22%2c%22Oid%22%3a%22bd7e1212-2d0e-43f1-a241-e796ac426fff%22%7d","title":"Microsoft Teams Meeting link"}],"groups":[{"id":"434381","name":"ECE Ph.D. Dissertation Defenses"}],"categories":[],"keywords":[{"id":"100811","name":"Phd Defense"},{"id":"1808","name":"graduate students"}],"core_research_areas":[],"news_room_topics":[],"event_categories":[{"id":"1788","name":"Other\/Miscellaneous"}],"invited_audience":[{"id":"78771","name":"Public"}],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[],"email":[],"slides":[],"orientation":[],"userdata":""}}}