Ph.D. Thesis Proposal: Changhyun Choi

Title: Model-based Pose Estimation and Tracking using 2D or 3D Visual Information

 

Changhyun Choi

Robotics Ph.D. Program

School of Interactive Computing

College of Computing

Georgia Institute of Technology

 

Date: Tuesday, July 3rd, 2012

Time: 2:00pm - 4:00pm

Location: Marcus Nanotechnology Building, Room 1118

 

Committee:

  • Prof. Henrik I. Christensen (Advisor), School of Interactive Computing, Georgia Institute of Technology
  • Prof. James M. Rehg, School of Interactive Computing, Georgia Institute of Technology
  • Prof. Irfan Essa, School of Interactive Computing, Georgia Institute of Technology
  • Prof. Anthony Yezzi, School of Electrical and Computer Engineering, Georgia Institute of Technology
  • Prof. Dieter Fox, Department of Computer Science and Engineering, University of Washington

 

Summary:

As robotic systems move from well-controlled to unstructured environments, they must operate in highly dynamic and cluttered scenes. Finding an object, estimating its pose, and tracking that pose over time in such scenes are challenging problems. Although various approaches have tackled these problems, the range of objects they handle and the robustness of their solutions remain limited. In this proposal, we exploit both photometric and geometric features of objects to broaden the range of objects handled and to improve the robustness of 3D object pose estimation and tracking. We also combine pose estimation and tracking in a systematic way to realize a fully automatic object pose tracking system.

To this end, we present a monocular visual tracking approach that uses edge and keypoint features in a particle filtering framework. Whereas most related work assumes that an initial object pose is given, we employ keypoint features to initialize the filter. After initialization, the particle filter seeks the true pose trajectory of the object over time. Edge features are employed to evaluate the likelihood of each pose hypothesis with respect to the current image measurement; for this evaluation, a set of visible edges from a CAD model is determined automatically offline. As a dynamics model, we employ first-order autoregressive state dynamics, which propagates particles more effectively than a Gaussian random walk and thus improves tracking performance. To make the system fully automatic, it re-initializes tracking when the tracked object is lost. While keypoint-based initialization is suitable for textured objects, it is not generally applicable to textureless objects. To address this limitation, we present a pose estimation method for textureless objects based on efficient chamfer matching.
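The first-order autoregressive propagation step can be sketched as follows. This is a simplified illustration, not the proposal's implementation: poses are flattened to plain vectors rather than SE(3) elements, and the AR coefficient `a` and noise level are placeholder values.

```python
import numpy as np

def propagate_ar1(particles, prev_particles, a=0.5, noise_std=0.01, rng=None):
    """Propagate pose particles with first-order autoregressive dynamics.

    A fraction of each particle's previous motion (its difference from
    the previous state) is carried forward, so particles follow the
    object's momentum instead of merely diffusing, as they would under
    a Gaussian random walk model.
    """
    rng = np.random.default_rng() if rng is None else rng
    velocity = particles - prev_particles               # first-order motion term
    noise = rng.normal(0.0, noise_std, particles.shape)  # process noise
    return particles + a * velocity + noise
```

With `noise_std = 0`, each particle simply continues a fraction `a` of its last displacement, which is exactly the advantage over a random walk: hypotheses are concentrated where a moving object is likely to be next.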

 With the recent introduction of real-time 3D depth sensors, it is natural to exploit the additional, reliable depth data. We propose to extend our 2D monocular approach to incorporate 3D depth information. First, we present an object pose estimation algorithm based on a voting scheme. Several pair features are proposed for two object classes: a color point pair feature for daily objects and a set of boundary pair features for industrial objects. The set of possible pair features for each object is computed and stored in a hash table, which is used to determine correspondences between object and scene features. With this learned hash table, our approach estimates highly likely pose hypotheses through an efficient voting scheme. Second, we propose to extend our particle filter-based tracking to exploit the 3D depth data and to combine the tracking with the voting-based pose estimation. For this extension, we list the remaining issues to be addressed and propose possible solutions. Experimental protocols are then presented, followed by a timeline.
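 The offline pair-feature hashing can be illustrated with a minimal sketch. The Python below is a simplified assumption: it hashes the classic four-dimensional point pair feature (pair distance plus three angles), whereas the proposal's color and boundary pair features carry additional components; the quantization steps are placeholder values.

```python
import numpy as np
from collections import defaultdict

def pair_feature(p1, n1, p2, n2, dist_step=0.01, angle_step=np.deg2rad(10)):
    """Quantized point pair feature: (|d|, angle(n1,d), angle(n2,d), angle(n1,n2)).

    Quantizing the four components turns the feature into a discrete key,
    so geometrically similar model and scene pairs hash to the same bucket.
    """
    d = p2 - p1
    dist = np.linalg.norm(d)
    d_unit = d / dist
    angles = (np.arccos(np.clip(np.dot(n1, d_unit), -1.0, 1.0)),
              np.arccos(np.clip(np.dot(n2, d_unit), -1.0, 1.0)),
              np.arccos(np.clip(np.dot(n1, n2), -1.0, 1.0)))
    return (int(dist / dist_step),) + tuple(int(a / angle_step) for a in angles)

def build_hash_table(points, normals):
    """Offline stage: hash every ordered pair of model points by its feature."""
    table = defaultdict(list)
    for i in range(len(points)):
        for j in range(len(points)):
            if i != j:
                table[pair_feature(points[i], normals[i],
                                   points[j], normals[j])].append((i, j))
    return table
```

At run time, each sampled scene pair is hashed with the same function; the model pairs found in its bucket then cast votes for candidate poses, and the most-voted hypotheses are retained.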
