Trilateration in Robotic Sensing using Acoustic Sensors Positioning System

<sidebar>Trilateration_in_Robotic_Sensing_using_Acoustic_Sensors_Nav</sidebar>

Positioning System

While the geometry of the anchor nodes can take many forms, for simplicity all of our experiments were conducted with the anchors placed at the corners of a rectangle.

At each anchor node we place a speaker that emits a chirp with a 1 kHz bandwidth. Each chirp occupies a distinct frequency range so that the chirps can be separated at the receiver. The device we are trying to locate is fitted with a microphone.
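
The page does not give the exact chirp parameters, so the sketch below only illustrates the idea: four linear chirps, each confined to its own 1 kHz band, generated in Python. The sample rate, chirp duration, and band edges are assumptions, not values from the project.

<syntaxhighlight lang="python">
import numpy as np

FS = 44100          # sample rate in Hz (assumed)
CHIRP_LEN = 0.05    # chirp duration in seconds (assumed)

# Each anchor's chirp sweeps a distinct 1 kHz band so the four chirps
# can be separated by bandpass filtering at the receiver (example bands).
CHIRP_BANDS = [(3000, 4000), (5000, 6000), (7000, 8000), (9000, 10000)]  # Hz

def make_chirp(f_start, f_stop, fs=FS, duration=CHIRP_LEN):
    """Generate a linear chirp sweeping from f_start to f_stop."""
    t = np.arange(0, duration, 1.0 / fs)
    # Instantaneous phase of a linear frequency sweep
    phase = 2 * np.pi * (f_start * t + 0.5 * (f_stop - f_start) / duration * t ** 2)
    return np.sin(phase)

chirps = [make_chirp(lo, hi) for lo, hi in CHIRP_BANDS]
</syntaxhighlight>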

The signal from the microphone contains all of the chirps plus background noise, superimposed. We bandpass filter this signal four times, once for each chirp's frequency range, and compare each filtered signal (which still contains some noise) to the corresponding known chirp. To determine the time elapsed between the speaker emitting the chirp and the microphone picking it up, we use a signal-processing technique called cross-correlation.
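
A minimal Python sketch of this filter-and-correlate step is shown below, using a Butterworth bandpass filter and NumPy's cross-correlation. It assumes the recording starts at the instant the chirp is emitted (i.e. the anchors and the receiver share a common time reference); the sample rate is again an assumption.

<syntaxhighlight lang="python">
import numpy as np
from scipy.signal import butter, filtfilt

FS = 44100  # sample rate in Hz (assumed, as in the previous sketch)

def bandpass(signal, low_hz, high_hz, fs=FS, order=4):
    """Isolate one anchor's chirp band from the microphone recording."""
    nyq = fs / 2.0
    b, a = butter(order, [low_hz / nyq, high_hz / nyq], btype='band')
    return filtfilt(b, a, signal)  # zero-phase filtering adds no extra delay

def time_of_flight(mic_signal, chirp, band, fs=FS):
    """Estimate the delay, in seconds, between chirp emission and arrival."""
    filtered = bandpass(mic_signal, band[0], band[1], fs)
    # Cross-correlate the filtered recording with the known chirp template;
    # the lag of the correlation peak is the estimated travel time.
    corr = np.correlate(filtered, chirp, mode='full')
    lag = np.argmax(np.abs(corr)) - (len(chirp) - 1)
    return lag / fs
</syntaxhighlight>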

Multiplying that time of flight by the speed of sound gives the approximate distance between the speaker and the microphone. Once we know the distance from the microphone to each of the four speakers, we trilaterate the microphone's position.
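
The sketch below shows one common way to carry out that last step: convert each time of flight to a range, then solve a linearized least-squares system for the microphone's 2-D position. The anchor coordinates and timing values are placeholders for illustration, not measurements from the project.

<syntaxhighlight lang="python">
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 degrees C

# Example anchor (speaker) positions at the corners of a rectangle, in metres;
# the actual room dimensions are not given on this page.
ANCHORS = np.array([[0.0, 0.0],
                    [4.0, 0.0],
                    [4.0, 3.0],
                    [0.0, 3.0]])

def trilaterate(distances, anchors=ANCHORS):
    """Least-squares trilateration from ranges to N >= 3 anchors (2-D).

    Subtracting the first range equation from the others eliminates the
    quadratic terms, leaving a linear system A x = b in the unknown (x, y).
    """
    d = np.asarray(distances, dtype=float)
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Example: convert measured times of flight to ranges, then trilaterate.
times = [0.0046, 0.0087, 0.0100, 0.0068]          # seconds (illustrative values)
ranges = [t * SPEED_OF_SOUND for t in times]
print(trilaterate(ranges))                         # approx. [1.2, 1.0]
</syntaxhighlight>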

[[File:Positioning System1.JPG]]