
Ph.D. Thesis Proposal: Daehyung Park


Assistive robots have the potential to serve as caregivers, assisting with activities of daily living (ADLs) and instrumental activities of daily living (IADLs). Detecting when something has gone wrong could help assistive robots operate more safely and effectively around people. However, the complexity of interacting with people and objects in human environments can make errors difficult to detect. We introduce a multimodal execution monitoring system to detect and classify anomalous executions when robots operate near humans. The system's anomaly detector models multimodal sensory signals with a hidden Markov model (HMM) and uses a likelihood threshold that varies with the progress of task execution. The system classifies the type and cause of common anomalies using an artificial neural network. We evaluate the system with haptic, visual, auditory, and kinematic sensing during household tasks and human-robot interactive tasks (feeding assistance) performed by a PR2 robot with able-bodied participants and people with disabilities. In our evaluation, our methods outperformed other methods from the literature, yielding higher area under the curve (AUC) and shorter detection delays. Multimodal sensing also improved monitoring performance by enabling detection of a broader range of anomalies.
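As a rough illustration of the detection scheme described in the abstract, the sketch below fits an HMM to nominal multimodal executions and flags an execution whose cumulative log-likelihood drops below a threshold that varies with task progress. The use of hmmlearn, Gaussian emissions, and the specific threshold rule (mean minus a multiple of the standard deviation of nominal log-likelihoods at each step) are illustrative assumptions, not the method presented in the thesis.

    # Minimal sketch: HMM likelihood monitoring with a progress-varying threshold.
    # hmmlearn, Gaussian emissions, and the threshold rule are assumptions for
    # illustration only; they are not the thesis implementation.
    import numpy as np
    from hmmlearn.hmm import GaussianHMM

    def train_monitor(nominal_runs, n_states=10, std_coeff=2.0):
        """Fit an HMM to nominal executions and derive a progress-indexed threshold.

        nominal_runs: list of arrays, each (T_i, n_features), with haptic,
        auditory, kinematic, etc. features stacked per time step.
        """
        X = np.vstack(nominal_runs)
        lengths = [len(run) for run in nominal_runs]
        hmm = GaussianHMM(n_components=n_states, covariance_type="diag")
        hmm.fit(X, lengths)

        # Cumulative log-likelihood of each nominal run at every execution step.
        max_t = min(lengths)
        ll = np.array([[hmm.score(run[: t + 1]) for t in range(max_t)]
                       for run in nominal_runs])
        # Threshold varies with task progress: mean minus a multiple of the std.
        threshold = ll.mean(axis=0) - std_coeff * ll.std(axis=0)
        return hmm, threshold

    def is_anomalous(hmm, threshold, observed):
        """Flag an execution whose cumulative log-likelihood falls below the
        progress-dependent threshold at any step; return the detection step."""
        for t in range(min(len(observed), len(threshold))):
            if hmm.score(observed[: t + 1]) < threshold[t]:
                return True, t
        return False, None

In practice the progress axis would typically be normalized or resampled so that executions of different durations share a common threshold index; the fixed step index above is a simplification.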

 
