
PhD Defense by Christopher R. McBryde


Doctoral Defense

by

Christopher R. McBryde

Advisor: Prof. E. Glenn Lightsey

SPACECRAFT VISUAL NAVIGATION USING APPEARANCE MATCHING AND MULTI-SPECTRAL SENSOR FUSION

May 3, 2018, 3:30 pm, Montgomery Knight 317

One of the capabilities necessary for a successful satellite mission is knowledge of the spacecraft's location and orientation in space, especially relative to a target. Relative navigation is an enabling technology for spacecraft formation flying, rendezvous and docking, and hazard avoidance. Cameras are particularly useful for this task because they are less expensive, smaller, and have lower power requirements than many other types of sensors. Object identification and relative pose estimation are therefore key topics of research for the future of spacecraft. Using cameras for these tasks presents several challenges. First, obtaining relative position and orientation is a two-step process: an object must be identified before the image data can provide a meaningful relative pose. Second, the complete relative navigation process has historically required two different algorithms, one for object identification and another for pose estimation, working in tandem. Finally, images in the visible spectrum are susceptible to variations in illumination that affect the perceived shape of the object, if it can be imaged at all.


The approach taken in this research is to apply terrestrial techniques to improve spacecraft navigation. First, appearance matching is used as a common framework for both object identification and pose estimation and is made more robust using background randomization. To support this, a spacecraft imaging simulation environment is created both to generate the necessary training images and to verify the system's performance. Additionally, results from multiple sensors are fused to improve identification and pose estimation and to extend the operating range over more of the orbit. The result of this research is a demonstrated, robust method for object identification and pose estimation of a spacecraft target. A single framework accomplishes both tasks and may be further enhanced using multiple sensors. Appearance matching and sensor fusion will help enable the next generation of spacecraft visual navigation.
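To illustrate the kind of appearance-matching framework described above, the following is a minimal sketch in the spirit of classical eigenspace appearance matching: training images rendered at known poses (here stand-in synthetic patterns, with randomized backgrounds) are projected into a PCA eigenspace, and a query image is identified with the nearest training pose. All function names, parameters, and the synthetic image model are illustrative assumptions, not taken from the dissertation.

```python
# Hypothetical sketch of appearance matching with background
# randomization; not the dissertation's actual implementation.
import numpy as np

rng = np.random.default_rng(0)

def make_training_set(n_poses=36, h=16, w=16):
    """Generate stand-in 'images' of one target at n_poses known angles.
    Real training images would come from the imaging simulator."""
    angles = np.linspace(0.0, 2 * np.pi, n_poses, endpoint=False)
    images = []
    for a in angles:
        yy, xx = np.mgrid[0:h, 0:w]
        # Pose-dependent synthetic pattern standing in for a rendering.
        img = np.sin(xx * np.cos(a) / 3.0 + yy * np.sin(a) / 3.0)
        # Background randomization: replace the 'background' pixels with
        # noise so the eigenspace learns the object, not the backdrop.
        mask = img > 0
        img = np.where(mask, img, rng.uniform(-1, 1, size=img.shape))
        images.append(img.ravel())
    return np.array(images), angles

def build_eigenspace(X, k=8):
    """PCA via SVD on the mean-centered training images."""
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    basis = Vt[:k]                    # top-k eigenimages
    coords = (X - mean) @ basis.T     # training points on the manifold
    return mean, basis, coords

def estimate_pose(query, mean, basis, coords, angles):
    """Project a query image into the eigenspace and return the
    pose of the nearest training point."""
    q = (query.ravel() - mean) @ basis.T
    idx = np.argmin(np.linalg.norm(coords - q, axis=1))
    return angles[idx]

X, angles = make_training_set()
mean, basis, coords = build_eigenspace(X)
# A training image used as a query should recover its own pose angle.
est = estimate_pose(X[10], mean, basis, coords, angles)
print(np.isclose(est, angles[10]))
```

In a multi-sensor variant of this idea, each sensor (e.g. visible and thermal infrared) would carry its own eigenspace, and the per-sensor match scores or pose estimates would be combined, which is one way the fusion step above could extend the operating range around the orbit.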


Committee Members:

Dr. Glenn Lightsey, School of Aerospace Engineering (advisor)

Dr. Marcus Holzinger, School of Aerospace Engineering

Dr. Eric Johnson, School of Aerospace Engineering

Mr. Chad Frost, Ames Research Center

Dr. Andrew Johnson, Jet Propulsion Laboratory

