{"229821":{"#nid":"229821","#data":{"type":"event","title":"GVU Brown Bag Seminar: Yang Li","body":[{"value":"\u003Cp\u003E\u003Cstrong\u003ESpeaker:\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003EYang Li\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003ETitle:\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003EOptimistic Interactive Computing\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003EAbstract:\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003EUsers tend to optimistically assume a computer system will perform adequately in spite of the fact that their input is ambiguous, incomplete or even unexpected by the system. In this talk, I argue that instead of viewing this optimism as a problem for users to avoid, we should strive to design and build interactive systems to fulfill it. To illustrate this point, I will discuss a set of mobile computing projects that approach this goal.\u003C\/p\u003E\u003Cp\u003EFirst, I will discuss several systems that make complex tasks simple by utilizing novel input modalities such as touchscreen gestures and smartphone cameras. These solutions either are heavily inspired by user intuition or dynamically evolve based on user behaviors and expectations.\u0026nbsp;Second, I will discuss how we enable developers to easily program touch input that is inherently ambiguous and complex, by using examples and concepts they are familiar with.\u003C\/p\u003E\u003Cp\u003E\u003Cstrong\u003EBio:\u0026nbsp;\u003C\/strong\u003E\u003C\/p\u003E\u003Cp\u003EYang is a Senior Research Scientist at Google (\u003Ca href=\u0022http:\/\/yangl.org\u0022 title=\u0022http:\/\/yangl.org\u0022\u003Ehttp:\/\/yangl.org\u003C\/a\u003E), and an affiliate faculty member in Computer Science \u0026amp; Engineering at the University of Washington. He earned a Ph.D. degree in Computer Science from the Chinese Academy of Sciences, and then conducted postdoctoral research in EECS at the University of California at Berkeley. Yang is broadly interested in Human-Computer Interaction, especially mobile computing, interaction design tools and gesture-based interaction. His work (Gesture Search and Library) has been used by millions of users. 
Gesture Search can be found at \u003Ca href=\u0022https:\/\/play.google.com\/store\/apps\/details?id=com.google.android.apps.gesturesearch\u0026amp;hl=en\u0022 title=\u0022https:\/\/play.google.com\/store\/apps\/details?id=com.google.android.apps.gesturesearch\u0026amp;hl=en\u0022\u003Ehttps:\/\/play.google.com\/store\/apps\/details?id=com.google.android.apps.ge...\u003C\/a\u003E.\u003C\/p\u003E","summary":null,"format":"limited_html"}],"field_subtitle":"","field_summary":"","field_summary_sentence":[{"value":"Optimistic Interactive Computing"}],"uid":"27774","created_gmt":"2013-08-19 14:51:49","changed_gmt":"2016-10-08 02:04:21","author":"Alishia Farr","boilerplate_text":"","field_publication":"","field_article_url":"","field_event_time":{"event_time_start":"2013-11-14T10:30:00-05:00","event_time_end":"2013-11-14T12:00:00-05:00","event_time_end_last":"2013-11-14T12:00:00-05:00","gmt_time_start":"2013-11-14 15:30:00","gmt_time_end":"2013-11-14 17:00:00","gmt_time_end_last":"2013-11-14 17:00:00","rrule":null,"timezone":"America\/New_York"},"extras":["free_food"],"hg_media":{"248621":{"id":"248621","type":"image","title":"Yang Li","body":null,"created":"1449243772","gmt_created":"2015-12-04 15:42:52","changed":"1475894926","gmt_changed":"2016-10-08 02:48:46","alt":"Yang Li","file":{"fid":"198026","name":"portrait_big.jpg","image_path":"\/sites\/default\/files\/images\/portrait_big_0.jpg","image_full_path":"http:\/\/hg.gatech.edu\/\/sites\/default\/files\/images\/portrait_big_0.jpg","mime":"image\/jpeg","size":257031,"path_740":"http:\/\/hg.gatech.edu\/sites\/default\/files\/styles\/740xx_scale\/public\/images\/portrait_big_0.jpg?itok=pm4RnHIN"}}},"media_ids":["248621"],"groups":[{"id":"1299","name":"GVU Center"}],"categories":[],"keywords":[{"id":"4096","name":"brown bag"},{"id":"1946","name":"GVU"}],"core_research_areas":[],"news_room_topics":[],"event_categories":[{"id":"1795","name":"Seminar\/Lecture\/Colloquium"}],"invited_audience":[],"affiliations":[],"classification":[],"areas_of_expertise":[],"news_and_recent_appearances":[],"phone":[],"contact":[],"email":[],"slides":[],"orientation":[],"userdata":""}}}