Current systems differentiate between navigation sensors and payload sensors. Navigation sensors collect measurements specifically for positioning an AUV, and these observations are processed in real time by a variety of perception algorithms. In contrast, payload instruments collect data for later processing: powerful instruments such as multibeam sonars and high-quality still cameras gather high-resolution data about the environment, but this information is not used in real time.
Many projects seek to bridge this divide between payload and navigation sensors. Vision-based algorithms promise to leverage optical imagery to constrain the unbounded error growth of underwater navigation (Huster & Rock, 2003; Eustice, Pizarro, & Singh, 2004). Similarly, combining coarse navigation with bathymetry can improve both the vehicle positioning and the final data product (Roman & Singh, 2006). Many researchers have developed estimation techniques that make use of bathymetry. These terrain-based methods use either a fathometer or a bathymetric sonar to position the vehicle relative to a known (or partially unknown) map of the seafloor (Tuohy, Leonard, Bellingham, Patrikalakis, & Chryssostomidis, 1996; Williams, Dissanayake, & Durrant-Whyte, 1999). Each of these techniques offers a path toward crossing the artificial divide between payload sensors and navigation aids.
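To make the terrain-based idea concrete, the following sketch illustrates the simplest form of map matching: a measured depth profile is correlated against a stored bathymetric grid, and the offset with the smallest sum-of-squared-differences is taken as the vehicle's position. This is an illustrative toy (synthetic map, brute-force search, hypothetical function names), not an implementation of any of the cited methods.

```python
import numpy as np

def terrain_match(depth_map, profile):
    """Brute-force search for the map offset whose depths best match
    a measured along-track depth profile (sum-of-squared-differences)."""
    n = len(profile)
    rows, cols = depth_map.shape
    best_err, best_pos = float("inf"), None
    for r in range(rows):
        for c in range(cols - n + 1):
            # Compare the candidate map segment against the measurements
            err = float(np.sum((depth_map[r, c:c + n] - profile) ** 2))
            if err < best_err:
                best_err, best_pos = err, (r, c)
    return best_pos, best_err

# Synthetic seafloor grid and a noisy depth profile "measured" along one row
rng = np.random.default_rng(0)
seafloor = rng.uniform(40.0, 60.0, size=(10, 50))   # depths in meters
true_row, true_col = 5, 20
measured = seafloor[true_row, true_col:true_col + 8] + rng.normal(0.0, 0.05, 8)

pos, err = terrain_match(seafloor, measured)
print(pos)
```

In practice the search is far more sophisticated (probabilistic filters, vehicle motion models, partially unknown maps), but the core principle is the same: sufficiently distinctive terrain makes the depth profile a position fix.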