
Wednesday, Sept. 22, 2010, 11 a.m., TI Auditorium (ECSS 2.102)

EE Seminar Series

“Fusion of Vision and Inertial Measurement Units for Mobile Navigation”
Dr. Nicholas Gans, UT Dallas
Sponsored by the Dallas Chapter of the IEEE Signal Processing Society

Abstract
Localization is a critical problem for building mobile robotic systems capable of autonomous navigation. I will present a novel visual odometry method to improve the accuracy of localization when a camera is viewing a piecewise planar scene. Discrete and continuous homography matrices are used to recover position, heading and velocity from images of coplanar feature points. A Kalman filter is used to fuse pose and velocity estimates, along with measurements from an inertial measurement unit and robot wheel encoders. Simulation and experimental results are presented to demonstrate the performance of the proposed method.
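The fusion step described above can be illustrated with a minimal linear Kalman filter that combines a noisy pose measurement (as visual odometry might supply) with a noisy velocity measurement (as an IMU or wheel encoders might supply). This is a simplified 1-D sketch for intuition only; the state model, noise covariances, and parameter values are assumptions for demonstration and not the speaker's implementation.

```python
import numpy as np

dt = 0.1
F = np.array([[1.0, dt],
              [0.0, 1.0]])      # constant-velocity motion model (assumed)
H = np.eye(2)                   # measure both position and velocity
Q = 0.01 * np.eye(2)            # process noise covariance (assumed)
R = np.diag([0.25, 0.04])       # measurement noise covariance (assumed)

def kf_step(x, P, z):
    """One predict/update cycle of a linear Kalman filter."""
    # Predict: propagate state and covariance through the motion model
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: correct the prediction with the fused measurement z
    y = z - H @ x_pred                       # innovation
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

# Simulate a robot moving at a constant 1 m/s and filter noisy
# position/velocity measurements of its true state.
rng = np.random.default_rng(0)
x_true = np.array([0.0, 1.0])                # true [position, velocity]
x_est, P = np.array([0.0, 0.0]), np.eye(2)   # initial estimate
for _ in range(100):
    x_true = F @ x_true
    z = x_true + rng.normal(0.0, [0.5, 0.2])  # noisy pose + velocity
    x_est, P = kf_step(x_est, P, z)

print(x_est)  # filtered estimate converges toward the true state
```

In the talk's setting the measurement vector would instead stack homography-derived pose and velocity with IMU and wheel-encoder readings, but the predict/update structure is the same.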

Bio
An assistant professor of electrical engineering at UT Dallas, Nicholas Gans received his PhD in systems and entrepreneurial engineering from the University of Illinois at Urbana-Champaign in 2005. His research interests include nonlinear and adaptive control, with a focus on vision-based control and estimation, robotics and autonomous vehicles. He is also involved in developing visualization and virtual reality platforms for accurate simulation of vision-based controllers. Prior to joining UT Dallas, he worked as a postdoctoral researcher with the Mechanical and Aerospace Engineering Department at the University of Florida and as a postdoctoral associate with the National Research Council, where he conducted research on the control of autonomous aircraft for the Air Force Research Laboratory Munitions Directorate and developed the Visualization Laboratory for simulation of vision-based control systems. He has published over 40 peer-reviewed conference and journal papers, and he holds two patents in these areas.