WALL-E Weekly Log

September 16

We met with our TA Andrew and completed the project proposal! We figured out our original Bluetooth option wasn't great, so we are now looking at a Bluetooth shield instead. We also found out we would have to power our robot, so we included batteries in our budget.

September 23

We revised our first project proposal to have a direct application and objective. Instead of just roaming around avoiding obstacles listlessly, our project's goal is to create a robot that can lead a blind person while avoiding obstacles. Synchronizing a group of sensors to work together, writing code to lead a person, and creating a helpful app to communicate with the follower are the new challenges we will face. We updated our Gantt chart to reflect this and assigned tasks to each project member.

September 30

This week we further worked on the project page, fixing both budget and reference errors. We met with our TA and went over some of the code, which uses pulse width modulation to send different voltages to speed up or stop the motors on the RC car. We further researched how to use the HRC sensors, import their library, and call the sensor methods in Arduino. We also studied how to connect and use the Bluetooth shield, which comes with a pre-built app that could turn out to be a useful starting point for our own app.
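
As a rough illustration of the PWM idea, here is a minimal Arduino sketch that varies a motor's speed by writing different duty cycles to a PWM pin. The pin numbers and the single direction pin are assumptions for illustration, not our actual motor shield wiring.

 // Minimal PWM motor test (assumed pins, not our final wiring)
 const int MOTOR_PWM_PIN = 5;   // assumed PWM-capable pin driving the motor
 const int MOTOR_DIR_PIN = 4;   // assumed digital pin selecting direction
 
 void setup() {
   pinMode(MOTOR_PWM_PIN, OUTPUT);
   pinMode(MOTOR_DIR_PIN, OUTPUT);
   digitalWrite(MOTOR_DIR_PIN, HIGH);   // drive forward
 }
 
 void loop() {
   // Ramp the duty cycle up: 0 stops the motor, 255 is full speed
   for (int duty = 0; duty <= 255; duty += 5) {
     analogWrite(MOTOR_PWM_PIN, duty);
     delay(50);
   }
   analogWrite(MOTOR_PWM_PIN, 0);       // stop before repeating
   delay(1000);
 }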

October 7

Our parts came in! We only ordered one distance sensor, though, as we want to spend some more time researching the I2C protocol given Humberto's advice. I2C appears to be the standard way of communicating among many devices, where our Arduino would be the master and all the sensors would act as slaves with unique serial addresses. The con of I2C is that the sensors are more expensive; I'm having a difficult time finding reasonably priced I2C sensors given how many we need for our project. More research will have to go into this. Another option might be to add more physical connectors to the Arduino to work with the cheaper HRC sensors, but this means we lose out on all the benefits of I2C. I researched some information on the k-means clustering algorithm, which we will be using to control the direction our robot should go. It appears we want to collect the distance readings from each sensor into a set of vectors and partition them into k clusters so as to minimize the within-cluster sum of squares. Novi researched how the Bluetooth would communicate with the app and also assembled the chassis and motors; however, she noticed the top portion was slightly broken and used a hot glue gun to fix it. Finally, we attached our motor shield to our Arduino and ran some basic tests to get the motors to spin. We also created a simple code sketch to measure distances with our sensor. With both of these functions working, we created a final set of code to turn the robot left when it reaches a wall.
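
For reference, a minimal sketch in the spirit of those tests might look like the following. The trigger/echo pins, the 25 cm threshold, and the turnLeft()/driveForward() helpers are hypothetical placeholders rather than our actual code.

 // Measure distance with an ultrasonic sensor and turn left at a wall (sketch only).
 const int TRIG_PIN = 9;    // assumed trigger pin
 const int ECHO_PIN = 10;   // assumed echo pin
 
 void turnLeft()     { /* placeholder: spin motors in opposite directions */ }
 void driveForward() { /* placeholder: both motors forward */ }
 
 long readDistanceCm() {
   // Send a 10 microsecond trigger pulse
   digitalWrite(TRIG_PIN, LOW);
   delayMicroseconds(2);
   digitalWrite(TRIG_PIN, HIGH);
   delayMicroseconds(10);
   digitalWrite(TRIG_PIN, LOW);
   // Echo pulse width in microseconds divided by ~58 gives centimeters
   long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // 30 ms timeout
   return duration / 58;
 }
 
 void setup() {
   pinMode(TRIG_PIN, OUTPUT);
   pinMode(ECHO_PIN, INPUT);
 }
 
 void loop() {
   long distance = readDistanceCm();
   if (distance > 0 && distance < 25) {
     turnLeft();       // obstacle ahead: turn away
   } else {
     driveForward();   // path is clear
   }
   delay(100);
 }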

October 14

This week we reconsidered our objective and decided to change it to helping blind people avoid running into obstacles. We also decided to use a speaker on the robot to alert people instead of a phone app. We listed all the detailed tasks we need to do, including all possible methods and programs. Then we edited our Gantt chart and redivided the individual work. We also worked on I2C, which connects two Arduinos together in order to give us more connections for our sensors. We also got the gyroscope from our TA and discussed how to mount the sensors. We decided to 3D print two components for the front and back sensors so that the sensors can be mounted in a stable way.
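
A minimal sketch of the two-Arduino I2C link could look like the following, with one board acting as the master and the other as a slave that reports a sensor reading over the Wire library. The slave address (8) and the single-byte payload are assumptions for illustration.

 // Master side: request one byte from the slave Arduino at address 8 (assumed).
 #include <Wire.h>
 
 void setup() {
   Wire.begin();                    // join the I2C bus as master
   Serial.begin(9600);
 }
 
 void loop() {
   Wire.requestFrom(8, 1);          // ask slave at address 8 for 1 byte
   if (Wire.available()) {
     byte reading = Wire.read();    // e.g. a distance value in cm
     Serial.println(reading);
   }
   delay(200);
 }
 
 // Slave side (runs on the second Arduino), shown commented out:
 /*
 #include <Wire.h>
 volatile byte latestReading = 0;
 
 void setup() {
   Wire.begin(8);                   // join the bus as slave with address 8
   Wire.onRequest(sendReading);     // called when the master asks for data
 }
 
 void loop() {
   latestReading = 42;              // placeholder: read the real sensor here
   delay(100);
 }
 
 void sendReading() {
   Wire.write(latestReading);       // send one byte back to the master
 }
 */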

October 21

October 28