{"545371":{"#nid":"545371","#data":{"type":"event","title":"PhD Proposal by Yin Li","body":[{"value":"\u003Cp\u003E\u003Cstrong\u003ETitle: Learning Embodied Models of Actions from First Person Video\u003C\/strong\u003E\u003Cbr \/\u003E \u003Cbr \/\u003E \u003Cstrong\u003EYin Li\u003C\/strong\u003E\u003Cbr \/\u003E Computer Science Ph.D. Student\u003Cbr \/\u003E School of Interactive Computing\u003Cbr \/\u003E College of Computing\u003Cbr \/\u003E Georgia Institute of Technology\u003Cbr \/\u003E \u003Cbr \/\u003E Date: Monday, June 20th, 2016\u003Cbr \/\u003E Time:\u0026nbsp;1:00pm to 3:00pm (EST)\u003Cbr \/\u003E LocationTSRB GVU Cafe\u003Cbr \/\u003E\u0026nbsp;\u003Cbr \/\u003E\u003Cstrong\u003E Committee:\u003C\/strong\u003E\u003Cbr \/\u003E ---------------\u003Cbr \/\u003E Dr. James M. Rehg\u0026nbsp;(Advisor), School of Interactive Computing, Georgia Institute of Technology\u0026nbsp;\u003C\/p\u003E\u003Cp\u003EDr. Irfan Essa, School of Interactive Computing, Georgia Institute of Technology\u0026nbsp;\u003C\/p\u003E\u003Cp\u003EDr. James Hays, School of Interactive Computing, Georgia Institute of Technology\u0026nbsp;\u003C\/p\u003E\u003Cp\u003EDr. Kristen Grauman, Department of Computer Science, University of Texas at Austin\u003Cbr \/\u003E \u003Cbr \/\u003E Abstract:\u003Cbr \/\u003E -----------\u003C\/p\u003E\u003Cp\u003EThe development of wearable cameras and the advancement of computer vision make it possible for the first time in history to collect and analyze a large scale record of our daily visual experiences, in the form of first person videos. My thesis work focuses on the automatic analysis of these first person videos, known as First Person Vision (FPV). My goal is to develop novel embodied representations for understanding the camera wearer\u0027s actions, by leveraging first person visual cues derived from first person videos, including body motion, hand locations and gaze. 
This \u201cembodied\u201d representation differs from traditional visual representations in that it derives from the purposive body movements of the first person and captures the concept of objects within the context of actions.\u003C\/p\u003E\u003Cp\u003EBy considering actions as intentional body movements, I propose to investigate three key components of first person actions. First, I present a method to estimate egocentric gaze, which reveals the visual trajectory of an action. Our work demonstrates for the first time that egocentric gaze can be reliably estimated using only head motion and hand locations derived from first person video, without the need for object or action information. Second, I develop a method for first person action recognition. Our work demonstrates that an embodied representation combining egocentric cues and visual cues can inform the location of actions and significantly improve recognition accuracy. Finally, I propose a novel task of object interaction prediction to uncover the plan behind a future object manipulation and thus explain purposive motions. 
I will develop novel learning schemes for this task and use them to learn an embodied object representation.\u003C\/p\u003E","summary":null,"format":"limited_html"}],"field_subtitle":"","field_summary":"","field_summary_sentence":[{"value":"Learning Embodied Models of Actions from First Person Video"}],"uid":"27707","created_gmt":"2016-06-16 12:57:21","changed_gmt":"2016-10-08 02:18:07","author":"Tatianna Richardson","boilerplate_text":"","field_publication":"","field_article_url":"","field_event_time":{"event_time_start":"2016-06-20T14:00:00-04:00","event_time_end":"2016-06-20T16:00:00-04:00","event_time_end_last":"2016-06-20T16:00:00-04:00","gmt_time_start":"2016-06-20 18:00:00","gmt_time_end":"2016-06-20 20:00:00","gmt_time_end_last":"2016-06-20 20:00:00","rrule":null,"timezone":"America\/New_York"},"extras":[],"groups":[{"id":"221981","name":"Graduate Studies"}],"categories":[],"keywords":[{"id":"102851","name":"Phd proposal"}],"core_research_areas":[],"news_room_topics":[],"event_categories":[{"id":"1788","name":"Other\/Miscellaneous"}],"invited_audience":[{"id":"78771","name":"Public"}],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[],"email":[],"slides":[],"orientation":[],"userdata":""}}}