Trilateration in Robotic Sensing using Acoustic Sensors
Background
This research was conducted by Chuck Holmes and Joseph Eisner during the Fall 2010 semester at Washington University in St. Louis. The project was overseen by Dr. Arye Nehorai, Patricio La Rosa, and Ed Richter.
Acknowledgments
We would like to thank the following people who helped us on this project:
Dr. Nehorai for putting together this program and giving us the opportunity to work with physical systems and learn from all the failures that entails.
Patricio La Rosa for his direction, insight, and patience.
Ed Richter for being able and willing to troubleshoot and solve any problem.
Phani Chavali for supporting us and loaning us his laptop (which was thankfully not stolen).
Raphael Schwartz, Zachary Knudsen, and Andrew Wiens for their web reports which helped us structure this one.
Joshua York for his project which we reference.
Project Overview
Abstract: The use of the Global Positioning System (GPS) for accurately locating targets of interest has become ubiquitous over the last decade. Counterintuitively, however, localization becomes more difficult when the source is confined to a small region, for example, inside a building. Because radio signals travel at the speed of light, the time-of-flight differences across a room are far shorter than the timing resolution of ordinary processing hardware, so small changes in position become indistinguishable. The aim of this research is to study localization using trilateration methods with acoustic sources. Since sound propagates much more slowly than light, the same timing hardware yields much finer spatial resolution: a one-centimeter change in range corresponds to roughly 29 microseconds of acoustic travel time, versus about 33 picoseconds for light. We mount a microphone on a robot that moves in a 2-D plane and track its position by measuring the signals the microphone records from four speakers whose positions are known. We later extend this setup to perform trilateration using other existing infrastructure, such as WLAN.
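To make the trilateration step concrete, the following is a minimal sketch in Python (not the project's actual code) of how a 2-D position can be recovered from ranges to four speakers at known positions. The speaker coordinates, timing values, and function names are illustrative assumptions; in practice the ranges would come from measured acoustic time-of-flight multiplied by the speed of sound.

<pre>
import numpy as np

# Approximate speed of sound in air at room temperature (m/s); assumed constant.
SPEED_OF_SOUND = 343.0

def trilaterate_2d(anchors, distances):
    """Estimate a 2-D position from ranges to known anchor points.

    Subtracting the first anchor's range equation from the others
    linearizes the problem, giving a linear system solvable by
    least squares.
    """
    anchors = np.asarray(anchors, dtype=float)   # shape (n, 2), n >= 3
    d = np.asarray(distances, dtype=float)       # shape (n,)
    x0, d0 = anchors[0], d[0]
    # Row i: 2*(x_i - x_0)^T p = (||x_i||^2 - ||x_0||^2) - (d_i^2 - d_0^2)
    A = 2.0 * (anchors[1:] - x0)
    b = (np.sum(anchors[1:] ** 2, axis=1) - np.sum(x0 ** 2)
         - (d[1:] ** 2 - d0 ** 2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Example: four speakers at the corners of a hypothetical 4 m x 4 m room,
# with ranges derived from (made-up) time-of-flight measurements.
speakers = [(0, 0), (4, 0), (0, 4), (4, 4)]
times = np.array([0.00825, 0.00825, 0.00825, 0.00825])  # seconds
ranges = SPEED_OF_SOUND * times
print(trilaterate_2d(speakers, ranges))  # ~ [2., 2.], the room's center
</pre>

The linearization replaces the nonlinear intersection of range circles with an overdetermined linear system, which least squares solves robustly when the measured times are noisy.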