Longboard sensor module log
This week we began physical planning and code/data planning. We also created our BOM and Gantt chart.
Physical planning: first, deciding where the parts will go in the box, what size the box should be, and where the box should be mounted on the board.
Code/Data planning: we started planning how much data we would capture, where to store it, and how to interpret it. We also began looking into communication protocols.
We continued our project planning. The majority of this was finalizing the parts we were going to use and making sure they were all compatible with each other. Another constraint was size: we tried to find the smallest, most accurate (while still cheap) version of each part, so we could ensure good data while keeping a small form factor on the board. The final (and possibly most important) constraint for the sensors was that they had to be compatible with the Raspi. While writing our own library for a sensor is possible, it would take time away from making the final result of the project better overall.
We've started developing the CAD model of all of the components so we can design a box that holds the sensors and the Raspi. This started with modeling each individual component that will go in the box, so that the box can accurately be built around them. The model also has to account for the restrictions of 3D printing.
We've ordered all the components and have started working on learning the libraries necessary to communicate between the Raspi and the external sensors.
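As a rough sketch of the kind of sensor communication we're learning: many I2C sensors expose a reading as two register bytes (high and low) that have to be combined into a signed 16-bit value. On the Raspi the bytes would come from a library such as smbus2; the register layout and values below are illustrative assumptions, not the API of any specific sensor.

```python
# Sketch: combining two I2C register bytes into a signed 16-bit reading.
# On the Raspi the bytes would come from a library such as smbus2, e.g.
#   hi, lo = bus.read_i2c_block_data(addr, REG_HI, 2)
# The register layout here is a hypothetical example, not a real sensor's.

def to_signed_16(hi: int, lo: int) -> int:
    """Combine high/low register bytes into a signed 16-bit integer."""
    raw = (hi << 8) | lo  # big-endian: high byte first
    return raw - 0x10000 if raw & 0x8000 else raw  # two's complement

# 0xFF38 has the sign bit set, so it decodes as a negative reading.
print(to_signed_16(0xFF, 0x38))  # -200
print(to_signed_16(0x12, 0x34))  # 4660
```

Getting conversions like this right is most of the work a sensor library does for you, which is why we'd rather use existing libraries than write our own.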
Additionally, we've prepared our presentation for the class: https://uploadfiles.io/ibkna
Files are now backed up on Danny Andreev's git repo under a larger electric longboard project. All electrical components were CADed.
<a href="https://github.com/lolomolo/LongboardMarkII/tree/master/Ev%20Sensor%20Module">Git Repo</a>
The components are fully and accurately modeled in our CAD file for the box.
The code planning is almost complete and ready to begin next week. We've decided we will take 3 samples/second from every sensor and store them in a local SQL DB. We are aware that some sensors will return the equivalent of "null" for some of those samples, so storing the raw data will let us "smooth" (average) it down to ~1 data point/second for the user to view. Finally, we will use a Python library to chart the data locally on the Raspi and view it over HDMI. This will let us quickly iterate through different sample rates and smoothing algorithms and ultimately decide what data the user will have access to.
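The pipeline above (3 raw samples/second into a local SQL DB, then averaging each second's samples into one point while skipping nulls) could be sketched like this. The table schema and sensor name are assumptions for illustration; the real version would run on the Raspi and chart the smoothed result with a Python plotting library over HDMI.

```python
import sqlite3

# Sketch of the planned pipeline: store raw samples (3/second) in SQLite,
# then average each second's samples into one point, ignoring NULL readings.
# The schema and the "accel_x" sensor name are illustrative assumptions.

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE samples (
    sensor TEXT, t_sec INTEGER, value REAL)""")  # value may be NULL

# Simulated second of data: one of the three samples came back null.
rows = [("accel_x", 0, 1.0), ("accel_x", 0, None), ("accel_x", 0, 1.5)]
conn.executemany("INSERT INTO samples VALUES (?, ?, ?)", rows)

# SQL's AVG() already skips NULLs, which gives us the "smoothed"
# ~1 point/second the user will eventually see.
smoothed = conn.execute(
    "SELECT sensor, t_sec, AVG(value) FROM samples "
    "GROUP BY sensor, t_sec").fetchall()
print(smoothed)  # [('accel_x', 0, 1.25)]
```

Keeping the raw 3/second samples in the DB (instead of averaging on the fly) is what lets us re-run different smoothing algorithms and sample rates over the same recorded ride.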