PhD Proposal by Chen Zhang

Event Details
  • Date/Time:
    • Thursday August 17, 2017
      1:00 pm - 3:00 pm
  • Location: GVU Cafe, 2nd floor of TSRB
Summaries

Summary Sentence: Design and implementation of novel gestures for wearables

Full Summary: No summary paragraph submitted.

Title: Design and implementation of novel gestures for wearables

 

Date: Thursday, August 17, 2017

Time: 1:00 PM - 3:00 PM

Location: GVU Cafe, 2nd floor of TSRB

 

Committee:

-------------------------------------------------

Dr. Gregory D. Abowd (Co-Advisor, School of Interactive Computing, Georgia Tech)

Dr. Omer Inan (Co-Advisor, School of Electrical and Computer Engineering, Georgia Tech)

Dr. Thad Starner (School of Interactive Computing, Georgia Tech)

Dr. Thomas Ploetz (School of Interactive Computing, Georgia Tech)

Dr. Chris Harrison (Human Computer Interaction Institute, Carnegie Mellon University)

 

Abstract:

--------------------------------------------------

Wearable computing is an inevitable part of the next generation of computing. Compared with traditional computers (e.g., laptops, smartphones), wearable devices are much smaller, creating new challenges for the design of both hardware and software. Providing appropriate input capabilities for wearables is one such challenge. Input techniques that have proven efficient on traditional computing devices (e.g., the keyboard, the touchscreen) are no longer appropriate for wearables, for two main reasons. One is the inherently small size of wearables: it is impossible, for instance, to place a physical keyboard on a wearable device. Most commodity wearable devices, such as smartwatches and Google Glass, adopt a touch-based input solution, which suffers from a small operation area and a limited input vocabulary. The other is the more dynamic environments wearables are exposed to: wearable devices are expected to be functional and efficient even when the user is in motion (e.g., walking). Compared with input on a physical keyboard or touchscreen, gesture-based input gives the user much greater freedom of operation, which can potentially improve the interaction experience.

 

In this thesis, I explore designing and implementing various novel gestures to address the input challenges on wearables: from using the built-in sensors of an off-the-shelf device to building customized hardware; from 2D on-body interaction to 3D input with a larger freedom of operation; from recognizing pre-defined and discrete hand or finger gestures with machine learning to providing continuous input tracking through a deeper understanding of physics.

 

I start by exploring the natural and novel input gestures that can be supported using only the built-in sensors of a smartwatch. I describe WatchOut and TapSkin, which allow user input on the watch case, band, and the skin around the watch. Though using only built-in sensors is more practical, the richness of the gesture set and the recognition performance are constrained by the limited choice of sensors.
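To make the approach concrete, the sketch below shows a generic pipeline for recognizing discrete gestures from built-in IMU data: extract simple statistical features from a window of accelerometer samples, then train a classifier. This is an illustrative assumption in the spirit of WatchOut and TapSkin, not the systems' actual feature sets or models, and the data here is synthetic.

```python
# Hypothetical sketch of discrete gesture recognition from a smartwatch's
# built-in accelerometer. Feature choices, window size, classifier, and the
# gesture labels (0 = "tap", 1 = "swipe") are all illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(window):
    """Per-axis statistics over one accelerometer window of shape (N, 3)."""
    return np.concatenate([window.mean(axis=0),      # average acceleration
                           window.std(axis=0),       # signal energy proxy
                           np.abs(window).max(axis=0)])  # peak magnitude

rng = np.random.default_rng(0)
# Synthetic stand-ins for labeled gesture windows (50 samples x 3 axes each).
windows = [rng.normal(loc=label, size=(50, 3))
           for label in (0, 1) for _ in range(20)]
labels = [label for label in (0, 1) for _ in range(20)]

X = np.array([extract_features(w) for w in windows])
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)

# Classify a new, unseen window drawn from the "swipe" distribution.
pred = clf.predict(extract_features(rng.normal(loc=1, size=(50, 3)))[None, :])
```

In a real system, the windows would come from a continuous sensor stream segmented by an energy threshold, and the features would be tuned to the gestures of interest.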

 

To better address the input challenge of wearables, I designed and implemented another set of wearable input techniques using customized hardware (e.g., a thumb-mounted ring), which provides new input gestures for wearables that are not available on a commodity device, such as entering numeric digits and Graffiti-style characters, menu selection, and quick responses that protect the user's privacy. However, these complementary gestures can only partially improve the interaction experience. To fundamentally address the input challenge on wearables, new interaction paradigms are needed to replace touch-based on-device interaction.

 

Such an interaction paradigm is usually composed of various low-level, high-resolution input events. For instance, the press and release of the mouse buttons are the low-level input events for the WIMP interface. This leads to the last section of my dissertation work: providing low-level input events for wearables, including continuous tracking of the finger position in 3D space around the wearable, estimation of the thumb position on the skin, and the touch pressure of the finger on the skin. I also demonstrate how such low-level input events influence the design of the user experience for wearables.

 

Additional Information

In Campus Calendar
No
Groups

Graduate Studies

Invited Audience
Graduate students
Categories
Other/Miscellaneous
Keywords
PhD proposal
Status
  • Created By: Tatianna Richardson
  • Workflow Status: Published
  • Created On: Aug 16, 2017 - 8:30am
  • Last Updated: Aug 16, 2017 - 8:30am