Trilateration in Robotic Sensing using Acoustic Sensors

From ESE497 Wiki

<sidebar>Trilateration_in_Robotic_Sensing_using_Acoustic_Sensors_Nav</sidebar>

Background

This research was conducted by Sana Naghipour and Saba Naghipour in the Fall 2011 semester at Washington University in Saint Louis. It was part of the Undergraduate Research Program and taken for credit as the course ESE 497 under the Electrical and Systems Engineering Department. The project was overseen by Dr. Arye Nehorai, Ed Richter, and Phani Chavali.

Acknowledgments

We would like to express our greatest appreciation to Prof. Nehorai and Ed Richter for providing us with this research opportunity and for their feedback throughout the completion of the research.

We are highly indebted to our mentor, Phani Chavali, for his guidance and constant supervision, as well as for providing necessary information regarding the project.

Project Overview

Abstract: The use of global positioning systems (GPS) for accurately locating targets of interest has become ubiquitous over the last decade. However, contrary to intuition, localization becomes more difficult when the source is located within a small region, for example, inside a building. This is because the receiver's timing resolution is coarse relative to the time light takes to travel the short distances involved, so small changes in position produce differences in signal travel time that are too small to measure. The aim of this research is to study localization using trilateration methods that employ acoustic sources. Since sound propagates much more slowly than light, we expect to obtain better resolution with this setup. We mount a microphone on a robot that can move in a 2-D plane and track its position by measuring the signals the microphone records from four speakers whose positions are known. We later extend this setup to perform trilateration using other existing infrastructure such as WLAN.
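The core computation the abstract describes — recovering a 2-D position from distances to four speakers at known locations — can be sketched as a linearized least-squares trilateration. The speaker coordinates and positions below are illustrative placeholders, not the actual experimental layout; the distances stand in for range estimates obtained from acoustic time-of-flight (distance = speed of sound × travel time).

```python
import numpy as np

def trilaterate(anchors, distances):
    """Estimate a 2-D position from distances to known anchor points.

    Subtracting the first range equation from the others cancels the
    quadratic terms, leaving a linear system solved by least squares.
    """
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    # For each anchor i > 0:  2*(a_i - a_0) . p = d_0^2 - d_i^2 + |a_i|^2 - |a_0|^2
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1)
         - np.sum(anchors[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Hypothetical speaker positions (metres) at the corners of a room
speakers = [(0.0, 0.0), (4.0, 0.0), (4.0, 3.0), (0.0, 3.0)]
true_pos = np.array([1.5, 2.0])
# Ranges as they would come from acoustic time-of-flight measurements
dists = [np.linalg.norm(true_pos - np.array(s)) for s in speakers]
print(trilaterate(speakers, dists))  # ≈ [1.5, 2.0]
```

With four speakers the linear system is overdetermined (three equations, two unknowns), which is what lets least squares average out measurement noise in the range estimates.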


Introduction

System Setup

Tracking

Results

Conclusions and Future Work