WALL-E Weekly Log

September 16

We met with our TA Andrew and completed the project proposal! We figured out our original Bluetooth option wasn't great, so we are now looking at a Bluetooth shield instead. We also found out we would have to power our robot, so we included batteries in our budget.

September 23

We revised our first project proposal to give it a direct application and objective. Instead of just roaming around aimlessly while avoiding obstacles, our robot's goal is now to lead a blind person while avoiding obstacles along the way. The updated challenges we face are synchronizing a group of sensors to work together, writing code to lead a person, and creating an app to communicate with the follower. We updated our Gantt chart to reflect this and assigned tasks to each project member.

September 30

This week we worked further on the project page, fixing both budget and reference errors. We met with our TA and went over some of the code, which used pulse width modulation (PWM) to send different effective voltages to speed up or stop the motors on the RC car. We also researched how to use the HC-SR04 distance sensors: how to import their library and which methods to call in Arduino. Finally, we studied how to connect and use the Bluetooth shield, which comes with a pre-built app that could turn out to be a useful starting point for our own.
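
A minimal sketch of the PWM idea we went over, assuming the motor driver's speed/enable input sits on PWM-capable pin 5 (the pin number is hypothetical):

```cpp
const int MOTOR_EN = 5;  // enable/speed pin on the motor shield (assumed)

void setup() {
  pinMode(MOTOR_EN, OUTPUT);
}

void loop() {
  analogWrite(MOTOR_EN, 200);  // ~78% duty cycle: fast
  delay(2000);
  analogWrite(MOTOR_EN, 90);   // ~35% duty cycle: slow
  delay(2000);
  analogWrite(MOTOR_EN, 0);    // 0% duty cycle: stop
  delay(2000);
}
```

The motor sees the average of the on/off pulses, so varying the duty cycle acts like varying the supply voltage.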

October 7

Our parts came in! We only ordered one distance sensor, though, because we want to spend more time researching the I2C protocol, per Humberto's advice. I2C appears to be the standard way of communicating among many devices: our Arduino would be the master and all the sensors would act as slaves with unique addresses on the bus. The downside of I2C is that the sensors are more expensive, and I'm having a difficult time finding reasonably priced I2C sensors in the quantity our project needs, so more research will have to go into this. Another option is to add more physical connectors to the Arduino to work with the cheaper HC-SR04 sensors, but that means we lose the benefits of I2C.

I researched the k-means clustering algorithm, which we will use to decide the direction our robot should go. It appears we would treat the distance readings from the sensors as a set of observations, partition them into k clusters S = {S1, ..., Sk}, and pick the partition that minimizes the within-cluster sum of squares.

Novi researched how the Bluetooth shield would communicate with the app and assembled the chassis and motors; she noticed the top portion was slightly broken and fixed it with a hot glue gun. Finally, we attached our motor shield to our Arduino and ran some basic tests to get the motors to spin, and we wrote a simple sketch to measure distances with our sensor. With both of these working, we combined them into code that turns the robot left when it reaches a wall (sketched below).
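
A minimal sketch of that wall test, assuming the HC-SR04 is wired to pins 9 (trigger) and 10 (echo) and hiding the motor-shield calls behind a placeholder turnLeft(); the pins, the 30 cm threshold, and the helper name are all hypothetical:

```cpp
const int TRIG = 9, ECHO = 10;   // HC-SR04 pins (our wiring choice)

void turnLeft() {
  // placeholder: the real version drives the motor shield
}

long readDistanceCm() {
  digitalWrite(TRIG, LOW);  delayMicroseconds(2);
  digitalWrite(TRIG, HIGH); delayMicroseconds(10);  // 10 us trigger pulse
  digitalWrite(TRIG, LOW);
  long us = pulseIn(ECHO, HIGH, 30000);  // echo pulse width, 30 ms timeout
  return us / 58;                        // ~58 us of echo per cm of range
}

void setup() {
  pinMode(TRIG, OUTPUT);
  pinMode(ECHO, INPUT);
}

void loop() {
  if (readDistanceCm() < 30) {  // something within ~30 cm ahead
    turnLeft();
  }
  delay(60);                    // let the last echo fade before re-pinging
}
```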

October 14

This week we reconsidered our objective and changed it to helping blind people avoid running into obstacles. We also decided to use a speaker on the robot to alert people instead of a phone app. We listed all the detailed tasks we need to do, including all the possible methods and programs, then edited our Gantt chart and redivided the individual work. We also worked on I2C, which joins two Arduinos together to give us enough pins for our sensors. We got the gyroscope from our TA and discussed how to mount the sensors; we decided to 3D print two components for the front and back sensors so they sit in a stable way. We started to write some of the motor functions for the car, wrote the code to send integers over I2C (the standard protocol sends only one byte at a time, and ints are two bytes; see the sketch below), and finally read some raw values from the MPU6050.
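
A minimal sketch of the two-byte transfer from the slave side, assuming the slave Arduino joins the bus at address 8 (the address is our choice) and that distanceCm is filled in elsewhere in the sketch:

```cpp
#include <Wire.h>

volatile int distanceCm = 0;  // updated elsewhere (e.g. by the sensor code)

// Runs when the master calls Wire.requestFrom(8, 2): I2C moves one byte
// per transfer, so the two bytes of the int go out high byte first.
void sendInt() {
  Wire.write(highByte(distanceCm));
  Wire.write(lowByte(distanceCm));
}

void setup() {
  Wire.begin(8);            // join the I2C bus as slave address 8
  Wire.onRequest(sendInt);  // register the request handler
}

void loop() {}
```

On the master side the two bytes come back from consecutive Wire.read() calls and are reassembled with int value = (high << 8) | low.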

October 21

Midterms were this week, so we were not able to get as much work done on the project. Daniel was able to attach a distance sensor to the slave Arduino and relay its readings over I2C to the master Arduino. Following the same protocol we should be able to attach the remaining sensors, manage the wires, and have a clean setup with most of the hardware challenges completed. Novi created a 3D model for stabilizing the sensors, and she also wrote the basic forward, backward, turn-left, and turn-right switch cases for the motors (sketched below). Once we figure out how the sensor data changes in those situations, the cases can be applied to our robot.
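
A minimal sketch of those switch cases, written against the classic Adafruit Motor Shield v1 library (AFMotor) as a stand-in for whichever shield we actually use; the ports, speed, and command names are all hypothetical:

```cpp
#include <AFMotor.h>

AF_DCMotor leftMotor(1);   // DC motor on shield port M1
AF_DCMotor rightMotor(2);  // DC motor on shield port M2

// Command codes; FORWARD/BACKWARD are taken by AFMotor macros,
// so the enum uses its own names.
enum Move { STOP_CMD, FORWARD_CMD, BACKWARD_CMD, LEFT_CMD, RIGHT_CMD };

void drive(int cmd) {
  switch (cmd) {
    case FORWARD_CMD:  leftMotor.run(FORWARD);  rightMotor.run(FORWARD);  break;
    case BACKWARD_CMD: leftMotor.run(BACKWARD); rightMotor.run(BACKWARD); break;
    case LEFT_CMD:     leftMotor.run(BACKWARD); rightMotor.run(FORWARD);  break; // spin left
    case RIGHT_CMD:    leftMotor.run(FORWARD);  rightMotor.run(BACKWARD); break; // spin right
    default:           leftMotor.run(RELEASE);  rightMotor.run(RELEASE);  break; // coast
  }
}

void setup() {
  leftMotor.setSpeed(180);   // PWM duty, 0-255
  rightMotor.setSpeed(180);
}

void loop() {
  drive(FORWARD_CMD);
}
```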

October 28

This week we printed the mount for the sensors with our TA's help. However, one part of the mount overlaps the screw locations on our robot car, so we need to adjust the model a little and print another one. Daniel got the second sensor working; right now the two sensors can collect data at the same time (the lab is lacking female-to-male jumper wires to connect all the sensors). Hopefully we will get all six sensors working next week and then begin to manage the data.

Daniel has also finally managed to get the IMU and the slave Arduino to both talk to the master Arduino at reasonable rates. The Digital Motion Processor (DMP) on the IMU is set up to calculate yaw, pitch, and roll values as fast as possible and relay them to the master Arduino immediately. However, that doesn't leave the slave Arduino time to talk to the master, and coding both together resulted in lots of FIFO errors. The solution is to send any data and do any calculations while the DMP is waiting for its next interrupt (a very short window), which means there can be no delay() calls in the code; instead we will manage timing with delta-timing methods.

Daniel also started writing the code to keep the car moving in a straight line. The principle: if the yaw angle from the IMU goes past a certain threshold, e.g. plus or minus 15 degrees (more testing is needed once the sensors, buses, and Arduinos are fixed securely on the car), the turn-left or turn-right condition triggers, goes through Novi's switch statement for the motors, and adjusts the heading back toward 0 degrees (see the sketch below).
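
A minimal sketch of that keep-straight rule; getYaw() and the three motor helpers are placeholders for the IMU code and Novi's switch statement, and the 15-degree band is a guess that still needs tuning on the assembled car:

```cpp
float getYaw() { return 0.0; }  // placeholder: real version reads the IMU
void turnLeft()  {}             // placeholders: real versions go through
void turnRight() {}             // the motor switch statement
void forward()   {}

const float YAW_BAND = 15.0;    // degrees of drift tolerated before correcting

void setup() {}

void loop() {
  float yaw = getYaw();                   // degrees; 0 = heading at power-on
  if (yaw > YAW_BAND)       turnLeft();   // nose drifted right, steer back
  else if (yaw < -YAW_BAND) turnRight();  // nose drifted left
  else                      forward();    // close enough, keep going straight
  // no delay() calls: the DMP leaves only short windows for I2C traffic,
  // so any pacing is done with delta timing instead
}
```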

November 4

All the sensors have been attached and the IMU is functioning. We are close to having everything fixed and stable on the car; however, we have had a couple of errors 3D printing the casing that holds the sensors. Further progress was made on the first working basic algorithm (basic as in: is there an object there? If so, move appropriately), utilizing all the motor functions, turning at the correct angle, dodging for a fixed amount of time, and turning back to the front to continue forwards.


November 11

Further progress on a long algorithm (made of if/else statements) to control the car, essentially one big switch statement depending on what the sensors and IMU are reading. We are having some trouble with inaccurate data from the IMU: after a certain amount of time the yaw angles become erratic. The speakers came in, so we hooked one up and downloaded the necessary pitch library to play a C#4 tone (see the sketch below).
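
A minimal sketch of that speaker test, assuming the speaker sits on pin 8 and the pitch library is the standard pitches.h note table from Arduino's toneMelody tutorial (where C#4 is NOTE_CS4, 277 Hz); the pin is hypothetical:

```cpp
#include "pitches.h"  // note-frequency table from the toneMelody tutorial

const int SPEAKER = 8;  // speaker pin (our wiring choice)

void setup() {
  tone(SPEAKER, NOTE_CS4, 500);  // play C#4 (277 Hz) for 500 ms
}

void loop() {}
```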


November 18

More progress on the messy if/else statements controlling the car, trying to incorporate what the car should do with "odd" objects, e.g. a long wall.

I ended up writing the code to turn the raw gyroscope readings into angles, based off another person's work with the same IMU (though they were concerned with self-balancing a quadcopter). The reason: while the Digital Motion Processor would have been more efficient to use (it does the calculations on the chip instead of wasting the Arduino's time), InvenSense (the company that makes the chip) doesn't have documentation on how to use the DMP. The DMP does read values and convert them into angles, but there doesn't appear to be any filtering going on, so the data jumps around by about ±5 degrees, and the yaw drift (yaw being the only angle that matters for the car to work properly) is significant: after half a minute the data reads the 0 angle as ±30 degrees. I'm positive the DMP has some kind of Kalman filter on it, but it's proving difficult to find out how to access that function. So instead the Arduino is now handling this process, at a sensitivity of ±500 degrees/s and sampling at 250 Hz, but now we have to filter the data ourselves to get accurate results.

The problem with the IMU is that gyroscopes are accurate only for a very short amount of time, because they drift. From the quadcopter example, I learned you can build a complementary filter combining the accelerometer on the IMU (acceleration values are consistent over time but prone to noise, e.g. shaking) with the gyroscope to get accurate pitch and roll angles over long periods. However, those angles don't matter much for the car; yaw is the important one. To deal with yaw drift, a common solution is to use a magnetometer along with the gyroscope to subtract the continual drift, but this IMU only has 6 degrees of freedom (no magnetometer). My simple solution for now is to find the least-squares best-fit line approximating the drift, based on data collected while the IMU sat still, then subtract that from the yaw to slow the drift down (see the sketch below). Right now the car can run for roughly 5 minutes before the effects of yaw drift become noticeable. That's better, but not great for long periods; the long-term solution is a better filter or adding a magnetometer.
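
A minimal sketch of that yaw math, assuming MPU6050 raw gyro Z counts at the ±500 degrees/s full-scale setting (65.5 counts per degree/s per the datasheet) sampled at 250 Hz; the DRIFT_RATE value is only illustrative, since the real number comes from the least-squares fit on our own still-IMU data:

```cpp
const float LSB_PER_DPS = 65.5;  // MPU6050 gyro scale at ±500 °/s full scale
const float DT = 1.0 / 250.0;    // 250 Hz sample period, in seconds
const float DRIFT_RATE = -0.02;  // °/s, slope of the least-squares fit (example)

float yaw = 0.0;                 // integrated heading, degrees

// Called once per 4 ms sample with the raw gyro Z register value.
void updateYaw(int16_t rawGyroZ) {
  float dps = rawGyroZ / LSB_PER_DPS;  // raw counts -> degrees per second
  yaw += (dps - DRIFT_RATE) * DT;      // integrate, minus the fitted drift
}

// For pitch and roll, the quadcopter-style complementary filter blends the
// two sensors, roughly:
//   pitch = 0.98 * (pitch + gyroRate * DT) + 0.02 * accelPitch;

void setup() {}
void loop() { updateYaw(0); }  // in the real sketch this runs at 250 Hz
```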

November 25

Thanksgiving break

December 2

Fixed the constant Arduino crashing: there was a bug where the 250 Hz sampling rate wasn't being applied, so the Arduino was calculating too much and consequently crashed. When an Arduino crashes it is completely 'silent': no errors are thrown, so it's difficult to find the source of the error. We implemented a simple flashing LED to confirm the Arduino hasn't crashed, so at least when it does crash there will be some physical feedback (see the sketch below). The main code to drive the car was finished by Novi, and we ran some tests. The car excels in some areas (the code works efficiently, and it can tackle odder objects than just left or right, such as a small object or a long wall) and is lacking in others (not the most fluid movements, some blind areas on the back sensors), but it looks like it will be able to demo effectively! Right now the speaker sounds a short alarm every 3 seconds; the final adjustments are to make the speaker sound different pitches when turning left or right, and to work on the poster.
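
A minimal sketch of the fix and the heartbeat together, assuming a 4 ms (250 Hz) delta-timing gate around the IMU math and the board's built-in LED; updateYaw() and readGyroZ() are placeholders for the IMU code:

```cpp
int16_t readGyroZ() { return 0; }  // placeholder: real version reads the IMU
void updateYaw(int16_t raw) {}     // placeholder: the 250 Hz yaw math

unsigned long lastSample = 0;  // micros() at the last IMU update
unsigned long lastBlink = 0;   // millis() at the last LED toggle
bool ledOn = false;

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
  if (micros() - lastSample >= 4000) {  // 4000 us = 250 Hz: the missing gate
    lastSample += 4000;
    updateYaw(readGyroZ());             // the heavy math runs only here now
  }
  if (millis() - lastBlink >= 500) {    // heartbeat: toggles twice a second;
    lastBlink = millis();               // if it freezes, the Arduino crashed
    ledOn = !ledOn;
    digitalWrite(LED_BUILTIN, ledOn ? HIGH : LOW);
  }
}
```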


December 6

We showcased our project. The batteries weren't working 20 minutes beforehand, but we were able to fix that in time.

December 13

Finalized project page