PhD Proposal by Aman Parnami

Title: Enabling Motion-Based Gestural Interaction Design

 

Aman Parnami

Ph.D. Student

School of Interactive Computing

College of Computing

Georgia Institute of Technology

http://amanparnami.com

 

Date: Thursday, May 26th, 2016

Time: 2:00 PM - 4:00 PM EDT

Location: TSRB 223

 

Committee:

-------------------

Dr. Gregory D. Abowd, School of Interactive Computing, Georgia Tech

Dr. Betsy DiSalvo, School of Interactive Computing, Georgia Tech

Dr. Thad Starner, School of Interactive Computing, Georgia Tech

Dr. W. Keith Edwards, School of Interactive Computing, Georgia Tech

Dr. Björn Hartmann, Electrical Engineering & Computer Science, UC Berkeley

Dr. Yang Li, Google Research

 

Abstract:

------------------

Gestural input is inherently quick and expressive, since a single motion can indicate the operation, the operand, and additional parameters. For instance, with a smartwatch, quickly turning the wrist (gesture) toward the eyes (parameters) expresses the wearer's intent to check the time (operation); in response, the screen of the smartwatch (operand) turns on to reveal the time. Motion-based gestures are particularly useful in mobile situations where visual attention is limited, or when brief interactions with wearable devices are needed. Smartphones and wearable devices can support motion-based gestures through now-ubiquitous inertial sensors (e.g., accelerometers, gyroscopes). Yet despite this ubiquitous support and these proposed advantages, motion-based gestures remain far less common than touchscreen input, particularly among interaction designers.
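(As a rough illustration of how inertial sensing enables such gestures, the following Python sketch shows one way a wrist-turn gesture might be detected from gyroscope data. It is a minimal sketch, not taken from the proposal; the axis convention, threshold, and duration bounds are all illustrative assumptions.)

# Hypothetical sketch: detect a quick wrist turn as a brief burst of
# high angular velocity around the forearm axis. All names, thresholds,
# and axis conventions here are illustrative assumptions.

from dataclasses import dataclass
from typing import Iterable

@dataclass
class GyroSample:
    t: float  # timestamp, seconds
    x: float  # angular velocity about the forearm axis, rad/s

def detect_wrist_turn(samples: Iterable[GyroSample],
                      threshold: float = 3.0,      # rad/s (assumed)
                      min_duration: float = 0.05,  # seconds
                      max_duration: float = 0.4) -> bool:
    """Return True if angular velocity exceeds the threshold for a
    short, contiguous burst -- the signature of a quick wrist turn."""
    burst_start = None
    for s in samples:
        if abs(s.x) >= threshold:
            if burst_start is None:
                burst_start = s.t  # burst begins
        elif burst_start is not None:
            duration = s.t - burst_start
            burst_start = None
            if min_duration <= duration <= max_duration:
                return True  # brief, fast rotation: likely a wrist turn
    return False

# Example: a ~0.25 s burst of fast rotation registers as a wrist turn.
samples = [GyroSample(t * 0.05, 4.0 if 0.2 <= t * 0.05 <= 0.4 else 0.1)
           for t in range(20)]
print(detect_wrist_turn(samples))  # True

Hand-tuning heuristics and thresholds like these is precisely the kind of pattern-recognition work the proposal aims to spare interaction designers.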

 

Interaction designers face three primary challenges in gestural interaction design. First, they often lack the pattern recognition expertise required to create gesture recognizers. Second, no existing tools support an end-to-end design process for gestural interaction. Third, traditional tools limit prototyping to controlled environments, creating a divide between the environments in which gestural interactions are developed and those in which they are used.

 

In this proposal, I describe a mobile, motion-based gestural interaction design tool that supports in situ, context-aware prototyping. The proposed tool will enable rapid creation of accurate, novel gestures and of novel gestural interfaces for wearables. Furthermore, I present a plan for iteratively designing, developing, and evaluating this tool. My hope is that this tool will empower interaction designers in their explorations of novel interfaces for emerging wearable and mobile platforms.
