Trilateration in Robotic Sensing using Acoustic Sensors Intro

<sidebar>Trilateration_in_Robotic_Sensing_using_Acoustic_Sensors_Nav</sidebar>

Introduction

Our objective is to build an indoor positioning system that determines the coordinates of a device. To achieve this, a coordinate system must be defined within a building or room, and some method of locating the device with respect to those coordinates must be developed.

Our algorithm constructs a coordinate geometry containing four "anchor nodes" whose positions are known. We use the speed of sound and the propagation delay of each signal (estimated by cross-correlation) to compute the device's distance from each of those nodes. Those distances and the anchor coordinates are then combined by trilateration to determine the device's approximate position.
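
As an illustration, the following is a minimal NumPy sketch of this pipeline, not our actual implementation: it assumes each anchor emits a known reference signal, takes the speed of sound as roughly 343 m/s, and uses made-up anchor coordinates and delays. The helper names (estimate_delay, trilaterate) are our own for this example.

<syntaxhighlight lang="python">
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

def estimate_delay(received, reference, sample_rate):
    """Estimate the arrival delay (seconds) of a known reference signal
    within a recorded signal, using the peak of the cross-correlation."""
    corr = np.correlate(received, reference, mode="full")
    lag = np.argmax(corr) - (len(reference) - 1)  # peak offset in samples
    return lag / sample_rate

def trilaterate(anchors, distances):
    """Estimate position from anchor coordinates and measured distances.

    Subtracting the first sphere equation |p - a_i|^2 = d_i^2 from the
    others cancels the quadratic terms, leaving a linear system solved
    by least squares. Needs at least four non-coplanar anchors for a
    unique 3-D solution."""
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1)
         - np.sum(anchors[0] ** 2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Hypothetical setup: four anchors in a room (meters) and the delays
# measured from each of them; the true position is about (2.0, 1.5, 1.0).
anchors = [(0.0, 0.0, 2.5), (5.0, 0.0, 2.5), (0.0, 4.0, 2.5), (2.5, 2.0, 0.5)]
delays = [0.0085, 0.0107, 0.0103, 0.0025]            # seconds
distances = [SPEED_OF_SOUND * t for t in delays]     # meters
print(trilaterate(anchors, distances))
</syntaxhighlight>

Note that in this formulation the anchors should not all lie in one plane; otherwise the linear system cannot resolve the coordinate perpendicular to that plane.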

Our immediate application for this technology is robot navigation. The robot (or its user) can know the coordinates it occupies without "remembering" how it has moved, a process that is inevitably error-prone; this positioning system does not rely on previous location information. Confidently knowing where the robot is allows us to command it where to move next, or to write an algorithm that decides that for us.

Why Not GPS?

The technology we are developing resembles GPS in its use of trilateration, and one could carve out a local coordinate geometry from a global one. So are we re-inventing the wheel? The answer is no: as discussed in our abstract, GPS has a region of ambiguity that is large relative to the spaces we want to work in, so its positional resolution is too low for our purposes.