Ph.D. Thesis Proposal by Gabriel Reyes

Ph.D. Thesis Proposal Announcement

Title: Enabling Kinesthetic Mobile Input Using Wearable Devices

Gabriel Reyes
Ph.D. Student
School of Interactive Computing
College of Computing
Georgia Institute of Technology
http://www.gareyes.com

Date: Wednesday, April 8, 2015
Time: 3:00 PM – 5:00 PM EDT
Location: TSRB 223

Committee:

Dr. W. Keith Edwards (Co-Advisor), School of Interactive Computing, Georgia Institute of Technology
Dr. Gregory D. Abowd (Co-Advisor), School of Interactive Computing, Georgia Institute of Technology
Dr. Thad Starner, School of Interactive Computing, Georgia Institute of Technology
Dr. John Stasko, School of Interactive Computing, Georgia Institute of Technology
Dr. Omer Inan, School of Electrical & Computer Engineering, Georgia Institute of Technology
Dr. Shwetak Patel, Computer Science & Engineering, University of Washington

Abstract:

A new evolution of computing is emerging around wearable technologies. Wearable computing has been a topic of research for years; however, we are now beginning to see adoption by consumers and non-researchers, thanks to advances in embedded and mobile software systems, low-power microprocessor design, wireless technologies, and low-cost sensors. A number of open research challenges remain in wearable computing, from providing continuous battery power and simplifying on-body networking to addressing privacy and social issues and designing the human-computer interface. Traditional desktop and mobile input technologies, such as mice, keyboards, and touchscreens, are no longer suitable for wearable computing scenarios. In its most common embodiment, wearable computing today relies on a very restricted set of input and output modalities, making this an exciting research area with opportunities for innovation.

The goal of my work is to envision new user experiences and to enhance the richness and quality of input modalities available to mobile and wearable computer systems. In this proposal, I articulate an alternative approach to interaction with computing systems, one specifically focused on wearable, kinesthetic input devices. Kinesthesia is the human sense that gives a person awareness of their own body position (proprioception) and spatial orientation (the vestibular system). I take advantage of this phenomenon in the design of two wearable interactive systems that leverage the dexterity of the hands and fingers for gestural interaction without visual attention. I present ongoing and future work in the design, implementation, and evaluation of: (1) a gloveless, inertial-based wearable device that senses finger and wrist movements to enable kinesthetic input, and (2) a wearable device built around a small-form-factor, wrist-worn depth camera that detects gestural input on and above the back of the hand and forearm.

The kinesthetic interfaces detailed in this proposal will contribute insights on sensing techniques and interactions to the mobile input and wearable computing research communities. Through a process of sensor exploration, rapid prototyping, system evaluation, and empirical studies, I will demonstrate that it is feasible to build a set of wearable devices that take advantage of novel on-body sensing and the human anatomy to enable robust and reliable kinesthetic input. I plan to show that the wearable devices I have built can be used without direct visual attention, are quickly accessible, operate in near real time, and are available when needed. More importantly, these devices enable users to provide useful input, in the form of various discrete and continuous gestures, to a number of applications, including text input, system controls, games, and virtual reality environments.

