{"537791":{"#nid":"537791","#data":{"type":"event","title":"PhD Defense by Aman Parnami","body":[{"value":"\u003Cp\u003ETitle: Enabling Motion-Based Gestural Interaction Design\u003C\/p\u003E\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\u003Cp\u003EAman Parnami\u003C\/p\u003E\u003Cp\u003EPh.D. Student\u003C\/p\u003E\u003Cp\u003ESchool of Interactive Computing\u003C\/p\u003E\u003Cp\u003ECollege of Computing\u003C\/p\u003E\u003Cp\u003EGeorgia Institute of Technology\u003C\/p\u003E\u003Cp\u003E\u003Ca href=\u0022http:\/\/amanparnami.com\/\u0022\u003Ehttp:\/\/amanparnami.com\u003C\/a\u003E\u003C\/p\u003E\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\u003Cp\u003EDate: Thursday, May 26th, 2016\u003C\/p\u003E\u003Cp\u003ETime: 2:00PM - 4:00PM EDT\u003C\/p\u003E\u003Cp\u003ELocation: TSRB 223\u003C\/p\u003E\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\u003Cp\u003ECommittee:\u003C\/p\u003E\u003Cp\u003E-------------------\u003C\/p\u003E\u003Cp\u003EDr. Gregory D. Abowd, School of Interactive Computing, Georgia Tech\u003C\/p\u003E\u003Cp\u003EDr. Betsy DiSalvo, School of Interactive Computing, Georgia Tech\u003C\/p\u003E\u003Cp\u003EDr. Thad Starner, School of Interactive Computing, Georgia Tech\u003C\/p\u003E\u003Cp\u003EDr. W. Keith Edwards, School of Interactive Computing, Georgia Tech\u003C\/p\u003E\u003Cp\u003EDr. Bj\u00f6rn Hartmann, Electrical Engineering \u0026amp; Computer Science, UC Berkeley\u003C\/p\u003E\u003Cp\u003EDr. Yang Li, Google Research\u003C\/p\u003E\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\u003Cp\u003EAbstract:\u003C\/p\u003E\u003Cp\u003E------------------\u003C\/p\u003E\u003Cp\u003EGestural input is inherently quick and expressive since a single motion can indicate the operation, the operand, and additional parameters. For instance, in the case of a smart watch, a quick turn of the wrist (gesture) towards the eyes (parameters) expresses the wearer\u0027s intent to check the time (operation). 
As a result, the screen of the smart watch (operand) turns on to reveal the time. Motion-based gestures are particularly useful in mobile situations where visual attention is limited, or when brief interactions with wearable devices are needed. Smartphones and wearable devices can support motion-based gestures through now-ubiquitous inertial sensors (e.g., accelerometers, gyroscopes). Despite this ubiquitous support and their proposed advantages, motion-based gestures are not as common as touchscreen input, particularly among interaction designers.\u003C\/p\u003E\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\u003Cp\u003EInteraction designers face three primary challenges in gestural interaction design. First, they lack the pattern-recognition expertise required to create gesture recognizers. Second, no existing tools support an end-to-end design process for gestural interaction. Third, traditional tools limit prototyping to controlled environments, creating a divide between the development and usage environments.\u003C\/p\u003E\u003Cp\u003E\u0026nbsp;\u003C\/p\u003E\u003Cp\u003EIn this proposal, I describe a mobile motion-based gestural interaction design tool that supports in situ, context-aware prototyping. The proposed tool will enable rapid creation of novel, accurate gestures and novel gestural interfaces for wearables. Furthermore, I present a plan for iteratively designing, developing, and evaluating this tool. 
With this tool, my hope is to empower interaction designers in their explorations of novel interfaces for emerging wearable and mobile platforms.\u003C\/p\u003E\u003Cp\u003E \u003C\/p\u003E","summary":null,"format":"limited_html"}],"field_subtitle":"","field_summary":"","field_summary_sentence":[{"value":"Enabling Motion-Based Gestural Interaction Design"}],"uid":"27707","created_gmt":"2016-05-19 15:36:58","changed_gmt":"2016-10-08 02:17:51","author":"Tatianna Richardson","boilerplate_text":"","field_publication":"","field_article_url":"","field_event_time":{"event_time_start":"2016-05-26T15:00:00-04:00","event_time_end":"2016-05-26T17:00:00-04:00","event_time_end_last":"2016-05-26T17:00:00-04:00","gmt_time_start":"2016-05-26 19:00:00","gmt_time_end":"2016-05-26 21:00:00","gmt_time_end_last":"2016-05-26 21:00:00","rrule":null,"timezone":"America\/New_York"},"extras":[],"groups":[{"id":"221981","name":"Graduate Studies"}],"categories":[],"keywords":[{"id":"100811","name":"Phd Defense"}],"core_research_areas":[],"news_room_topics":[],"event_categories":[{"id":"1788","name":"Other\/Miscellaneous"}],"invited_audience":[{"id":"78771","name":"Public"}],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[],"email":[],"slides":[],"orientation":[],"userdata":""}}}