PhD Defense by Gabriel Reyes

Title: Enabling One-Handed Input for Wearable Computing

Gabriel Reyes
Ph.D. Candidate
School of Interactive Computing
College of Computing
Georgia Institute of Technology
http://www.gareyes.com

Date: Monday, April 24th, 2017
Time: 2:30 PM – 5:30 PM ET
Location: TSRB 2nd Floor @ GVU Cafe

Committee:

Dr. W. Keith Edwards (Co-Advisor), School of Interactive Computing, Georgia Institute of Technology
Dr. Gregory D. Abowd (Co-Advisor), School of Interactive Computing, Georgia Institute of Technology
Dr. Thad E. Starner, School of Interactive Computing, Georgia Institute of Technology
Dr. John T. Stasko, School of Interactive Computing, Georgia Institute of Technology
Dr. Omer T. Inan, School of Electrical & Computer Engineering, Georgia Institute of Technology
Dr. Kent Lyons, Los Altos Research Center, Technicolor Research

Abstract:

A new evolution of computing is emerging around wearable technologies. Wearable computing has been a topic of research for years; however, we are now beginning to see adoption by consumers and non-researchers, thanks to advances in embedded and mobile software systems, low-power microprocessor design, wireless technologies, and low-cost sensors. A number of open research challenges remain in wearable computing, from providing continuous battery power and simplifying on-body networking to addressing privacy and social issues and designing the interaction experience. Traditional desktop and mobile input technologies, such as mice, keyboards, and in some cases touchscreens, are no longer suitable for wearable computing scenarios. In its most common embodiment, wearable computing today relies on a very restricted set of input and output modalities, making this an exciting research area with opportunities for innovation.

The goal of my work is to envision new user experiences and enhance the richness and quality of input modalities available to mobile and wearable computer systems. In this dissertation, I articulate an alternative approach to interaction with computing systems that is specifically focused on wearable, one-handed input techniques. I use the smartwatch as the platform of choice for sensing and computation, although these techniques could also be embedded into other wrist-worn devices such as bracelets or fitness bands. The interaction techniques I describe in this dissertation are designed purposefully to eliminate the need to directly interact with the on-wrist device. I take advantage of the dexterity of the arm, hand, and fingers around the device for gestural interactions. I also leverage the malleability of the human vocal resonance system to produce non-voice acoustic sounds as input when the device is brought close to the mouth.

In summary, I present research work in the design, implementation, and evaluation of three systems: (1) an interaction technique that allows a person to control their smartwatch through non-voice acoustics detected using the device microphone and machine learning; (2) a one-handed interaction technique that tracks the synchronous and rhythmic extension and reposition of the user's thumb (augmented with a passive magnetic ring) through correlation with on-screen blinking controls and without requiring calibration; and (3) a gloveless, inertial-based technique that combines the smartwatch with sensors mounted on the thumb to sense wrist and thumb movements and enable a broad set of finger-level gestures.
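The calibration-free selection idea behind technique (2) can be illustrated with a minimal sketch: each on-screen control blinks with a distinct temporal pattern, and the control whose pattern best correlates with the sensed thumb motion is selected. This is only an illustrative example of synchronous-correlation selection; the square-wave references, sampling rate, blink frequencies, and Pearson-correlation scoring below are assumptions for the sketch, not the dissertation's actual implementation.

```python
import numpy as np

def blink_pattern(freq_hz, duration_s, rate_hz):
    """Square-wave reference for an on-screen control blinking at freq_hz,
    sampled at rate_hz for duration_s seconds."""
    t = np.arange(0.0, duration_s, 1.0 / rate_hz)
    return np.sign(np.sin(2.0 * np.pi * freq_hz * t))

def select_control(sensor_signal, patterns):
    """Return the index of the blinking reference best correlated with the
    (magnetically sensed) thumb signal, using Pearson correlation."""
    x = sensor_signal - np.mean(sensor_signal)
    scores = []
    for p in patterns:
        p0 = p - np.mean(p)
        # Normalized cross-correlation at zero lag; no per-user calibration.
        r = np.dot(x, p0) / (np.linalg.norm(x) * np.linalg.norm(p0) + 1e-9)
        scores.append(abs(r))
    return int(np.argmax(scores))

# Example: two controls blinking at hypothetical rates of 1.0 Hz and 1.5 Hz.
patterns = [blink_pattern(f, 2.0, 50.0) for f in (1.0, 1.5)]
rng = np.random.default_rng(0)
thumb = patterns[1] + 0.1 * rng.standard_normal(patterns[1].shape)
chosen = select_control(thumb, patterns)
```

Because selection depends only on the relative correlation between the sensed signal and each reference, the approach needs no per-user calibration, which mirrors the property claimed for the technique above.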

