Robotic Microphone Array Notes SP2010

Latest revision as of 19:07, 27 April 2010

  • Team Members: Zach, Rafael
  • PhD Supervisor: Phani
  • Faculty Supervisor: Arye Nehorai
  • Goal:
    • Complete and test the simulations when a new dimension of 'rotation' is introduced in the current system.
    • Check the hardware requirements to build and test the new system.
    • Learn how the current control algorithm works and how it can be modified when the robots can rotate.
    • Implement the new control algorithm in Matlab and integrate it with the hardware.
    • Plan how the robot position can be tracked using IR sensors and how the system should be modified to make it 3D.

  • Status
  • Week 1
    • The simulations of the additional 'rotation' dimension are functional.
  • Week 2
    • The current robotic platform can be modified into a crude robot capable of rotating, but making it usable in a demo would require a number of experiments and calculations; it would also be highly limited by the microphone cabling and likely prone to error. We therefore decided to use the current platform as a proof of concept and to apply what we learn to the new platform.
  • Week 3
    • Modified the existing robot setup to give it the extra degree of freedom so that it can rotate freely. The rotation was tested and is reasonably consistent for our current purposes. The next steps are to map out the geometry of how the robot's rotation affects the microphone positions and to incorporate this information into the system code, so that the system can accurately track microphone position while rotating.
  • Week 4
    • Set up code to map the controller algorithm's commands for desired robot position, given in X and Y coordinates, into a sequence of forward, backward, and rotational robot movements.
    • Determined how rotating the robot to a particular angle shifts the microphone midpoint in x and y coordinates.
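The Week 3–4 steps above (turning a desired X/Y position into rotate-and-drive commands, and tracking how rotation moves the microphones) can be sketched as follows. This is an illustrative Python sketch, not the team's Matlab code; the pose convention, function names, and the assumption that the two microphones sit symmetrically about the robot midpoint are all assumptions.

```python
import math

def plan_move(x_r, y_r, theta, x_t, y_t):
    """Convert a controller target (x_t, y_t) into a rotate-then-drive
    command pair for a robot at (x_r, y_r) with heading theta (radians).
    Hypothetical helper illustrating the mapping described in Week 4."""
    dx, dy = x_t - x_r, y_t - y_r
    turn = math.atan2(dy, dx) - theta                   # rotate first
    turn = math.atan2(math.sin(turn), math.cos(turn))   # wrap to [-pi, pi]
    drive = math.hypot(dx, dy)                          # then drive forward
    return turn, drive

def mic_positions(x_r, y_r, theta, half_span):
    """Assumed geometry: microphones sit half_span on either side of the
    robot midpoint, so rotating the robot by theta moves each mic on a
    circle about that midpoint (the Week 3 geometry)."""
    ox = half_span * math.cos(theta + math.pi / 2)
    oy = half_span * math.sin(theta + math.pi / 2)
    return (x_r + ox, y_r + oy), (x_r - ox, y_r - oy)
```

A negative `turn` here means a clockwise rotation; the actual sign convention would depend on the robot platform.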
  • Week 5
    • Formalized a technique to better represent the effective resolution of the source localization: the average distance between the localized point and all of the neighboring potential localization points.
    • Discussed how adaptive-movement decisions would benefit from a deterministic algorithm in which each possible localization point is shifted to the current estimated position in simulation; the resulting resolution for each shift is then calculated, and the robot placement needed to produce that shift is determined. In this way, more informed adaptive-movement decisions could be made.
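The Week 5 resolution measure (average distance from the localized point to its neighboring candidate points) is simple enough to sketch directly. A minimal Python sketch, assuming the candidate points are given as a flat list of (x, y) tuples; the names and the choice of neighbor set are illustrative, not the team's Matlab implementation.

```python
import math

def effective_resolution(est, candidates):
    """Average distance from the localized point `est` to every other
    candidate localization point. Smaller values mean the candidate
    grid is tighter around the estimate, i.e. finer resolution."""
    dists = [math.dist(est, c) for c in candidates if c != est]
    return sum(dists) / len(dists)
```

In the real system the candidate set would come from the localization grid around the current estimate, and "neighboring" might be restricted to the nearest points rather than all of them.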
  • Week 6
    • Wrote Matlab code to calculate a better and more generalized effective resolution of the source localization.
    • Drafted a block-diagram model of our current approach to an adaptive algorithm for 2D movement. In this approach, simulations are used to determine the potential resolution improvement from moving the robots in each of the system's five degrees of freedom. For each degree of freedom, a number of simulations are run with different amounts of movement, and the move whose simulation yields the greatest resolution improvement is chosen as the next robotic movement.
    • Fixed the "Memory full" error in the simulations.
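The Week 6 selection loop (simulate candidate moves in each degree of freedom, keep the best) reduces to a greedy search. A Python sketch under stated assumptions: `simulate` is a stand-in for the team's Matlab simulation, and the degree-of-freedom labels and step sizes are placeholders.

```python
def choose_move(state, simulate, dofs, step_sizes):
    """Greedy adaptive-movement step from the Week 6 block diagram:
    for each degree of freedom and each trial step size, run the
    simulation and keep the move whose predicted effective resolution
    is best (smallest). `simulate(state, dof, step)` is assumed to
    return the predicted resolution after that move."""
    best_res, best_move = float("inf"), None
    for dof in dofs:
        for step in step_sizes:
            res = simulate(state, dof, step)
            if res < best_res:
                best_res, best_move = res, (dof, step)
    return best_move
```

Because each candidate move needs a full simulation, the number of degrees of freedom times the number of step sizes bounds the per-iteration cost, which is why the earlier "Memory full" fix mattered.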
  • Week 7
    • Controller framework finished.
  • Week 8
    • Controller is 25% operational; the major errors in case 1 have been resolved.
  • Week 9
    • Symposium!

Test

Trying to embed [File: system1.jpg] in text.

Test

[File: wave_field.gif]