<sidebar>Robotic Sensing: Adaptive Robotic Control for Improved Acoustic Source Localization in 2D Nav</sidebar>
The fundamental idea behind acoustic source localization is that the angle of arrival of an incoming sound wave can be determined with a pair of microphones placed a known distance apart. A detailed explanation of this concept can be found in [http://www.ese.wustl.edu/~nehorai/RaphaelZachary/students.cec.wustl.edu/_rms3/angle%20of%20arrival.htm our Fall 2009 web report]. Our project employs two pairs of microphones to estimate two independent angles of arrival of the incoming sound wave. These two estimates can be combined into a position estimate of the sound source by solving for the intersection of the two lines of arrival. Our Fall 2009 report contains a further [http://www.ese.wustl.edu/~nehorai/RaphaelZachary/students.cec.wustl.edu/_rms3/calculating%20position.htm explanation and derivation] of this.
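
To make the geometry concrete, here is a minimal numerical sketch of the model described above, written in Matlab to match the project's '.m' files. It is our own illustration, not code from the report, and the spacing, sampling frequency, correlation lag, and pair placements are all assumed values. A pair of microphones spaced a distance d apart measures a delay tau between its channels, giving an angle of arrival theta = asin(c*tau/d); intersecting the bearing lines from the two pairs then fixes the source position.

<pre>
% Minimal sketch of angle of arrival and intersection (assumed values).
c  = 343;      % speed of sound in air (m/s)
d  = 0.30;     % microphone spacing (m) -- assumed
fs = 44100;    % sampling frequency (Hz) -- assumed

% Suppose cross-correlating the two channels peaked at a lag of 12 samples.
lagSamples = 12;
tau   = lagSamples / fs;        % time difference of arrival (s)
theta = asin(c * tau / d);      % angle of arrival (rad)
fprintf('Angle of arrival: %.1f degrees\n', theta * 180/pi);

% With pairs at known positions p1, p2 and bearings theta1, theta2, the
% source lies where the two bearing lines intersect:
%   p1 + t1*u1 = p2 + t2*u2,  with u_i = [cos(theta_i); sin(theta_i)].
p1 = [0; 0];  theta1 = 40  * pi/180;   % pair 1 (assumed placement/bearing)
p2 = [1; 0];  theta2 = 120 * pi/180;   % pair 2 (assumed placement/bearing)
u1 = [cos(theta1); sin(theta1)];
u2 = [cos(theta2); sin(theta2)];
t   = [u1, -u2] \ (p2 - p1);           % solve the 2x2 linear system
src = p1 + t(1)*u1;
fprintf('Estimated source position: (%.2f, %.2f) m\n', src(1), src(2));
</pre>

With the values above this prints an angle of about 18 degrees and a position estimate in front of both pairs; in the running system the lag would come from cross-correlating the two recorded channels.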
 
Given a particular microphone array geometry and sampling frequency, there is a finite set of possible locations from which a source estimate can be chosen. The resolution of the system's localization depends on the density of this set of candidate points around the sound source. Because the set is not uniformly distributed, the resolution around the source depends on the orientation of the microphone pairs. With this in mind, the goal of our project has been to mount the microphone pairs on robots so that the physical parameters of the system can be changed in real time. In this way, our localization system is designed to adaptively alter microphone positioning in order to optimize resolution around a sound source.
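
The finiteness is straightforward to see: each pair measures its delay as a whole number of samples, so only a finite set of bearings is resolvable, and the candidate locations are the pairwise intersections of those bearings. The sketch below, again our own illustration with assumed parameters and the same conventions as the sketch above, enumerates and plots that set; the uneven spacing of the plotted points is the non-uniform resolution discussed here.

<pre>
% Enumerate the finite, non-uniform set of candidate source locations
% (our own illustration; parameters and pair placements are assumed).
c = 343; d = 0.30; fs = 44100;
maxLag   = floor(d * fs / c);          % largest physically possible lag (samples)
lags     = -maxLag:maxLag;             % every measurable integer-sample delay
bearings = asin(c * lags / (fs * d)) + pi/2;  % resolvable bearings, upper half plane

p1 = [0; 0];  p2 = [1; 0];             % assumed positions of the two pairs
pts = [];
for a1 = bearings
    for a2 = bearings
        u1 = [cos(a1); sin(a1)];  u2 = [cos(a2); sin(a2)];
        A  = [u1, -u2];
        if abs(det(A)) < 1e-9, continue; end   % parallel bearings: no fix
        t = A \ (p2 - p1);
        if all(t > 0)                          % keep forward intersections only
            pts(:, end+1) = p1 + t(1)*u1;      % grow the candidate list
        end
    end
end
plot(pts(1,:), pts(2,:), '.'); axis equal
title('Candidate source locations for one array geometry');
</pre>

Moving or rotating the pairs changes the baselines and hence redraws this point set, which is exactly the degree of freedom the robots exploit.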
 
To design this system, LabVIEW was used as the primary programming language. LabVIEW offers distinct advantages in integrating the various tasks that a dynamic system must run simultaneously in order to function effectively. Running an acoustic source localization system in real time, with useful robotic movement of the sensors, requires the integration and synchronization of signal waveform generation, sound output, data acquisition, signal processing, plotting of the source estimate, and robot movement commands based on the localization, all occurring in parallel with one another. These parallel processes motivated us to use a dataflow programming environment, which inherently handles individual processing tasks in parallel. LabVIEW's ability to interface with a wide array of hardware and other software applications was also very useful. Parts of the code, including the signal processing, cross correlation, plotting and visualization tools, and the localization algorithm itself, are written in Matlab <nowiki>'.m'</nowiki> files that LabVIEW calls repeatedly. More details about our LabVIEW code as well as dataflow programming can be found under [http://www.ese.wustl.edu/~nehorai/RaphaelZachary/students.cec.wustl.edu/_rms3/Recent%20Progress.htm <nowiki>"Recent Progress"</nowiki> of our Fall 2009 report].
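
The report does not reproduce its '.m' files on this page, so the following is only a hypothetical sketch of what one delay-estimation function of this kind might look like; the function name, its interface, and the use of Matlab's xcorr are our assumptions, not the project's actual code.

<pre>
% Hypothetical sketch of a delay-estimation '.m' function of this kind;
% the name and interface are assumed, not taken from the project code.
function [tau, theta] = estimateAngle(x1, x2, fs, d)
%ESTIMATEANGLE  Angle of arrival from one microphone pair.
%   x1, x2 : recorded signals from the two microphones
%   fs     : sampling frequency (Hz)
%   d      : microphone spacing (m)
c = 343;                            % speed of sound (m/s)
[r, lagIdx] = xcorr(x1, x2);        % cross-correlate the pair
[~, k] = max(abs(r));               % index of the correlation peak
tau = lagIdx(k) / fs;               % time difference of arrival (s)
s = max(min(c * tau / d, 1), -1);   % clamp to a valid sine value
theta = asin(s);                    % angle of arrival (rad)
end
</pre>

LabVIEW's Matlab script interface could call such a function on each freshly acquired buffer, once per microphone pair.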
 
Additionally, details of the user interface designed for the data acquisition, signal generation, synchronization, processing, and plotting aspects of our system can be found under [http://www.ese.wustl.edu/~nehorai/RaphaelZachary/students.cec.wustl.edu/_rms3/Data%20acquisition.htm LabVIEW Environment].
 