Difference between revisions of "Eye Tracking for ALS Patients"

From ESE497 Wiki
The Eye Tracker project is a research effort to enable people suffering from Amyotrophic Lateral Sclerosis (ALS) to write using their eyes by tracking the movement of the pupil. The project will be implemented in two main phases:

First Phase: development of the software for pupil tracking

Second Phase: building the hardware necessary to capture images of the eye and transfer them to a processing unit

First Phase: In this phase of the project, we focus on software development. We will use an infrared camera to capture images of a human eye in LabVIEW. We employ template matching to locate the center of the pupil, using a small patch of dark pixels as the template. We propose an adaptive search method in which the search space in each frame is chosen based on the estimates of the pupil location from the previous frames. This adaptive search greatly reduces the computational complexity of the algorithm, which is essential for real-time tracking. We will also develop hybrid methods that combine template matching with feature selection to produce robust, computationally inexpensive algorithms that are insensitive to noise in the images and to the orientation of the camera used to obtain them.
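
As an illustration of the adaptive search, here is a minimal sketch in Python with OpenCV. The project itself was implemented in LabVIEW, so the function name, the margin parameter, and the normalized-correlation score below are our own illustrative assumptions, not the project's code.

<pre>
import cv2

def track_pupil(frame_gray, template, prev_center=None, margin=40):
    """Find the pupil center by matching a small dark-pixel template.

    prev_center -- (x, y) pixel estimate from the previous frame, or None
    margin      -- extra half-width of the search window, in pixels
    """
    th, tw = template.shape
    if prev_center is None:
        x0, y0 = 0, 0
        window = frame_gray                 # cold start: search the whole frame
    else:
        px, py = prev_center                # adaptive search: crop a small
        x0 = max(0, px - tw // 2 - margin)  # window around the previous
        y0 = max(0, py - th // 2 - margin)  # estimate, clipped to the frame
        x1 = min(frame_gray.shape[1], px + tw // 2 + margin)
        y1 = min(frame_gray.shape[0], py + th // 2 + margin)
        window = frame_gray[y0:y1, x0:x1]

    # Normalized correlation scores; the peak marks the best match.
    scores = cv2.matchTemplate(window, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, (mx, my) = cv2.minMaxLoc(scores)
    return x0 + mx + tw // 2, y0 + my + th // 2
</pre>

Searching an 80 x 80 window instead of a full 640 x 480 frame cuts the number of correlation evaluations by roughly a factor of fifty, which is the computational saving described above.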
 
  
Second Phase: In the second phase of the project, we will build the hardware system that records images using a camera mounted on a pair of sunglasses. We will port the software developed during the first phase to a suitable microcontroller, which performs the processing and generates the control signals that can be used to move prosthetic limbs. We will also develop hardware to transmit the recorded images to the microcontroller wirelessly.

Revision as of 23:26, 18 December 2011


Background

This research was conducted by Chuck Holmes and Joseph Eisner in the Fall 2010 semester at Washington University in Saint Louis. It was part of the Undergraduate Research Program and was taken for credit as ESE 497 through the Electrical and Systems Engineering Department. The project was overseen by Dr. Arye Nehorai, Patricio La Rosa, and Ed Richter.

Acknowledgments

We would like to thank the following people who helped us on this project:

Dr. Nehorai for putting together this program and giving us the opportunity to work with physical systems and learn from all the failures that entails.

Patricio La Rosa for his direction, insight, and patience.

Ed Richter for being able and willing to troubleshoot and solve any problem.

Phani Chavali for supporting us and loaning us his laptop (which was thankfully not stolen).

Raphael Schwartz, Zachary Knudsen, and Andrew Wiens for their web reports, which helped us structure this one.

Joshua York for his project, which we reference.

Project Overview

Abstract: The use of the Global Positioning System (GPS) for accurately locating targets of interest has become ubiquitous over the last decade. Counterintuitively, however, localization becomes more difficult when the source is confined to a small region, for example, inside a building. This is because the timing resolution of ordinary processing hardware is coarse relative to the time a radio signal, traveling at the speed of light, takes to cross such short distances, so small timing errors translate into large range errors. The aim of this research is to study localization using trilateration methods with acoustic sources. Since sound propagates far more slowly than light, the same timing resolution yields a much finer range resolution, so we expect better accuracy with this setup. We mount a microphone on a robot that can move in a 2-D plane, and we track its position by measuring the signals the microphone records from four speakers whose positions are known. We later extend this setup to perform trilateration using other existing infrastructure, such as WLAN.
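
The range-estimation step (described under Range Estimation below) is based on cross-correlation. As a minimal sketch in Python/NumPy, under our own assumptions of a shared clock and a known emitted waveform: the lag of the correlation peak gives the time of flight, and multiplying by the speed of sound gives the range. The function name and the sampling-rate parameter fs are illustrative choices, not the project's LabVIEW code.

<pre>
import numpy as np

def estimate_range(mic_signal, emitted_signal, fs, speed_of_sound=343.0):
    """Estimate speaker-to-microphone range from time of flight.

    Assumes the recording starts at the instant of emission, so the
    lag of the cross-correlation peak is the propagation delay.
    """
    corr = np.correlate(mic_signal, emitted_signal, mode="full")
    # Zero lag sits at index len(emitted_signal) - 1 in a 'full' correlation.
    lag_samples = np.argmax(np.abs(corr)) - (len(emitted_signal) - 1)
    time_of_flight = lag_samples / fs          # seconds
    return time_of_flight * speed_of_sound     # meters
</pre>

At fs = 48 kHz, one sample of delay corresponds to about 7 mm of acoustic range, versus roughly 6 km for a radio signal traveling at the speed of light; this is the resolution advantage the abstract refers to.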

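Given ranges to speakers at known positions, the position itself is recovered by trilateration. The sketch below, again under our own naming, uses a standard linearization: subtracting the first range equation ||x - p_0||^2 = r_0^2 from the others cancels the quadratic terms, leaving a linear system solved by least squares. With four speakers the system is overdetermined, which helps average out ranging errors.

<pre>
import numpy as np

def trilaterate(speaker_positions, ranges):
    """Solve 2 (p_i - p_0) . x = r_0^2 - r_i^2 + |p_i|^2 - |p_0|^2
    in the least-squares sense for the 2-D position x."""
    p = np.asarray(speaker_positions, dtype=float)   # shape (n, 2), n >= 3
    r = np.asarray(ranges, dtype=float)              # shape (n,)
    A = 2.0 * (p[1:] - p[0])
    b = (r[0]**2 - r[1:]**2) + np.sum(p[1:]**2, axis=1) - np.sum(p[0]**2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Example: four speakers at the corners of a 4 m x 4 m room.
speakers = np.array([(0, 0), (4, 0), (0, 4), (4, 4)], dtype=float)
true_pos = np.array([1.0, 2.5])
ranges = [np.linalg.norm(true_pos - s) for s in speakers]
print(trilaterate(speakers, ranges))   # approximately [1.0, 2.5]
</pre>
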

Introduction

Positioning System

Range Estimation

Trilateration

Sources of Error

Experimental Setup

Conclusions and Future Work