Difference between revisions of "BusyBear"

[[https://gowustl-my.sharepoint.com/:p:/g/personal/allisontodd_wustl_edu/ESFY4a6JoHBJgPZhPsdUbF4BAx6-jF78K3qQ9vP8iXzWMA?e=GokLoW Project Presentation]] <br/>

[[https://github.com/ESE205/emerson-goon-todd GitHub Repository]]<br/>

[[https://classes.engineering.wustl.edu/ese205/core/index.php?title=Network_Adapter_in_Monitoring_Mode_-_Rasp_Pi_3B Network Adapter Monitoring Mode Tutorial]]<br>

[[https://gowustl-my.sharepoint.com/:p:/g/personal/allisontodd_wustl_edu/ET0SagydcXRPmjek74mGNmsB35HGL4CsJE-wYw5s4xRF_g?e=yfV4g6 Poster]]
  
 
==Objectives==

* Learn and be able to code in Python as related to the Pi
* Use sniffing/MAC tracking method in the analysis of busyness
* Investigate the use of the camera in the analysis of busyness
* Be able to monitor busyness fairly accurately by filtering detected devices
* Compare busyness at different times of day and between buildings
* Design a GUI for an aesthetically pleasing and useful website
* Host a website displaying useful and relevant data through Amazon Web Services (AWS)
 
==Challenges==

* Limited experience with working with WiFi receivers or anything to do with MAC Addresses
* Limited knowledge of Python and Raspberry Pi
* Connecting our data with a database, AWS, and a website
* Privacy Concerns
 
| 1 x Alfa AWUSO36NH High Gain USB Wireless G/N Long-Range WiFi Network Adapter || Network Adapter || $31.99 || https://www.amazon.com/Alfa-AWUSO36NH-Wireless-Long-Rang-Network/dp/B0035APGP6/ref=sr_1_1_sspa?keywords=alfa+network+adapter&qid=1553045771&s=gateway&sr=8-1-spons&psc=1
|-
| mybusybear.com || Domain Name || $12.00 || [[DomainPrice.jpg]]
|-
| Total Cost ||  || $71.20 ||
  
 
=Design and Solutions=

[[File:basicIdea.png|250px|Basic Idea]]
<br>
http://www.mybusybear.com
==Build the Device==

We began by constructing a device to collect MAC addresses. Initially, we hoped that, with the RaspberryPi's WiFi capabilities, we could simply use the base hardware for detection. We quickly determined that the RaspberryPi was not capable of entering monitoring mode<ref>Pi Network - [https://makezine.com/projects/build-raspberry-pi-network-scanner/]</ref>; we would need external hardware for this purpose. We went through a variety of external network adapters and ultimately found one with both monitoring-mode capability and compatibility with the RaspberryPi<ref>Network Adapter - [https://www.amazon.com/Alfa-AWUSO36NH-Wireless-Long-Rang-Network/dp/B0035APGP6/ref=sr_1_1_sspa?keywords=alfa+network+adapter&qid=1553045771&s=gateway&sr=8-1-spons&psc=1]</ref>. The network adapter's functionality is explored further in the Collect Information section and in the Network Adapter in Monitoring Mode tutorial<ref> Network Adapter in Monitoring Mode Tutorial - [https://classes.engineering.wustl.edu/ese205/core/index.php?title=Network_Adapter_in_Monitoring_Mode_-_Rasp_Pi_3B]</ref>.<br>

We decided that a RaspberryPi camera should be added to the device to strengthen the validity of the data gathered from the network adapter. The Pi Camera is fairly simple to connect, and its functionality is implemented through Pi commands<ref>Pi Camera - [https://www.youtube.com/watch?v=T8T6S5eFpqE] </ref>. By analyzing a combination of the number of addresses collected and the visual busyness found in the picture, more accurate trends over time can be determined.
<br>
This is our final project:
<br>
[[File:Device.jpeg|500px]]
==Collect Information==

In the aforementioned [[https://classes.engineering.wustl.edu/ese205/core/index.php?title=Network_Adapter_in_Monitoring_Mode_-_Rasp_Pi_3B Network Adapter Monitoring Mode Tutorial]], we established how to set up the network adapter in monitoring mode and install Kismet, the software we used to take advantage of monitoring mode. Once properly configured, simply running kismet prints text to the console like so:

<source lang = "bash">
INFO: 802.11 Wi-Fi device 00:A7:42:FC:6E:03 advertising SSID
      'wustl-guest-2.0'
INFO: Detected new 802.11 Wi-Fi access point 00:A7:42:FC:6E:00
INFO: 802.11 Wi-Fi device 00:A7:42:FC:6E:00 advertising SSID 'eduroam'
INFO: Detected new 802.11 Wi-Fi device 74:B5:87:C6:90:1E
INFO: Detected new 802.11 Wi-Fi device 00:08:E3:FF:FD:EC
</source>
We could collect data, but we needed a way to consolidate the MAC addresses, find out what vendor each belonged to, and upload that information to the database, all periodically over a window of time. Over a couple of months, we finalized our data collection design around crontab<ref> How to use Crontab - [https://opensource.com/article/17/11/how-use-cron-linux]</ref>, a program used to schedule the execution of programs at certain times. Crontab was also used to set up emailing the Pi's IP address on boot, as detailed in a class tutorial.<ref> SSHing into your Pi Tutorial - [https://classes.engineering.wustl.edu/ese205/core/index.php?title=SSHing_into_your_Raspberry_Pi] </ref> Crontab was used to schedule two tasks: running kismet and dumping the output into a text file, and running a script to parse the text file and upload the necessary information. <br/>

The crontab usage can be seen below. The first entry runs kismet through the timeout command <ref> Using timeout with crontab - [https://serverfault.com/questions/539665/setting-timeout-for-cron-jobs] </ref> so that it only runs for four minutes. All of the output is written to kismetlog.txt.
<source lang = "bash">
# m h  dom mon dow  command
*/5 *  *  *  *    /usr/bin/timeout 240 /usr/local/bin/kismet > kismetlog.txt
*/5 *  *  *  *    ./busybear2
</source>
The second task runs an executable file named "busybear2" <ref> How to make a file executable - [https://medium.com/@peey/how-to-make-a-file-executable-in-linux-99f2070306b5] </ref> whose contents are shown below.

<source lang = "bash">
sleep 242s
python3 uploader.py
</source>

This bash script's only purpose is to wait 4 minutes and 2 seconds (essentially waiting for the kismet task to terminate) before executing the uploader script. The contents of "uploader.py" can be seen below.
<source lang = "python">
import re
import requests

# NOTE: snippet only -- testFile, MAC_URL, qString, and cursor are set up
# earlier in uploader.py (full code on our GitHub)

# Regular expression: only capture MAC addresses that appear after the word "device"
MAC_regex = re.compile(r"(?<=\bdevice\b\s)\w\w[:]\w\w[:]\w\w[:]\w\w[:]\w\w[:]\w\w")

# Loop through the lines of the file to find MAC addresses
for line in testFile:
  MAC_addresses = MAC_regex.findall(line)  # all MAC addresses found on this line
  for address in MAC_addresses:  # loop through the individual MAC addresses
    req = requests.get(MAC_URL % address)  # vendor lookup API call
    obj = req.json()
    for key, value in obj.items():
      if 'company' in value:
        values = (address, value['company'])
      else:
        values = (address, 'Null')
      cursor.execute(qString, values)  # parameterized insert into the database
</source>
The uploader script uses regular expressions <ref> CSE 330 Wiki: Regular Expressions - [https://classes.engineering.wustl.edu/cse330/index.php/Regular_Expressions] </ref> <ref> Regular expressions look-ahead/behind - [https://www.regular-expressions.info/lookaround.html] </ref> <ref> Online Regular Expression tester - [https://regex101.com/] </ref> to isolate the desired MAC addresses from the text file produced by the first crontab task. It then uses a MAC Address Vendor Lookup API <ref> MAC Address Vendor Lookup API - [https://macvendors.co/api/python] </ref> to attach a vendor name to each MAC address. Finally, each MAC address and its associated vendor is uploaded to our mySQL database <ref> Uploading data to a mySQL database - [http://www.mysqltutorial.org/python-mysql-insert/] </ref>, whereupon it is automatically assigned a timestamp and unique ID. More on how our database was created and structured in the next section. NOTE: Not all code is shown; the full code can be accessed on our GitHub.
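To make the pattern concrete, here is a small self-contained run of the same regular expression against the sample Kismet output shown earlier (the lookbehind means only lines containing the word "device" yield a match, so access-point lines are skipped):

```python
import re

# The same pattern used in uploader.py: match a MAC address only when it
# immediately follows the word "device" in Kismet's console output.
MAC_regex = re.compile(r"(?<=\bdevice\b\s)\w\w[:]\w\w[:]\w\w[:]\w\w[:]\w\w[:]\w\w")

sample_log = """\
INFO: 802.11 Wi-Fi device 00:A7:42:FC:6E:03 advertising SSID
INFO: Detected new 802.11 Wi-Fi access point 00:A7:42:FC:6E:00
INFO: Detected new 802.11 Wi-Fi device 74:B5:87:C6:90:1E
"""

found = []
for line in sample_log.splitlines():
    found.extend(MAC_regex.findall(line))

# The "access point" line has no "device" keyword, so only two addresses match
print(found)  # ['00:A7:42:FC:6E:03', '74:B5:87:C6:90:1E']
```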
==Managing a Database==
We created an RDS database through AWS named BusyBear, which we access through MySQL Workbench. Within the database, we created multiple tables to store current MAC addresses and historical information. It was important to us that the table storing MAC addresses auto-populate timestamps with the current time. For externally connecting the database to both the Pi and the website, we found that a combination of Python and PHP was most effective. To keep the database secure, we had to escape all incoming data to prevent SQL injection attacks. As of now, our database isn't in third normal form, but this remains a goal so that we can store and access data most efficiently. One hurdle was attempting to store pictures in the database (as LONGBLOB), but we abandoned this because a) the complexity caused instability, and b) the pictures' large size made queries take an exceptionally long time. It was best to upload images directly to the website and bypass the database.<br>
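The SQL-injection point can be illustrated with a self-contained sketch. This uses Python's built-in sqlite3 in place of our mySQL setup purely so it runs anywhere; parameterized queries work the same way with the mySQL connector's %s placeholders:

```python
import sqlite3

# Illustration only: sqlite3 stands in for our mySQL database here.
# Parameterized queries let the driver escape incoming data, so a malicious
# "vendor" string is stored verbatim instead of being executed as SQL.
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE wifiMAC_BD (macAdd TEXT, vendor TEXT)')

hostile_vendor = "x'); DROP TABLE wifiMAC_BD; --"
conn.execute('INSERT INTO wifiMAC_BD (macAdd, vendor) VALUES (?, ?)',
             ('00:08:E3:FF:FD:EC', hostile_vendor))

row = conn.execute('SELECT vendor FROM wifiMAC_BD').fetchone()
print(row[0] == hostile_vendor)  # True: stored as plain text, table intact
```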
=== Databases and Structures Used ===

Within our uploader.py program (see GitHub for full code), we upload ALL MAC addresses to a single table. From there, we sort this information into more tables. The basic idea can be seen in the strings used to access/alter the mySQL tables below:
<source lang = "python">
qString = 'INSERT INTO wifiMAC_BD (macAdd, vendor) VALUES (%s, %s)'
</source>

The first qString demonstrates what happens first: we upload every MAC address, regardless of what it is, into a table named wifiMAC_BD. An example of the entries can be seen below. Notice the MAC addresses, the timestamp at which they were inserted, and the vendor associated with each address. Null simply means our API did not have a registered vendor for that MAC address.
<br/>
[[File:WifiMacCreateString.png | 300px | center | WifiMacCreateString ]]
[[File:live.png | 300px | center | wifiMAC_BD table ]]

<br />
<source lang = "python">
qString2 = "SELECT count(*) FROM wifiMAC_BD where (timestampe > now() - interval '5' minute) AND ( vendor = 'Apple, Inc.' OR  vendor = 'Google, Inc.' Or  vendor ='Microsoft' Or  vendor ='Samsung Electronics Co.,Ltd' OR  vendor ='HUAWEI TECHNOLOGIES CO.,LTD');"
qString3 = "INSERT INTO historicalData_BD_Limited (numAddresses,helper)  VALUES (%s,%s)"
</source>

Next, using qString2, we query the table of all entries and record only the number of entries from the past 5 minutes (our code runs every 5 minutes) that have a valid vendor. Valid vendors are those that probably correspond to devices used by students, like Apple, Samsung, Microsoft, etc. This filters out noise like routers, printers, and other random devices present at any given time, which gives consistency to our design. qString3 then inserts this number into a separate table called historicalData_BD_Limited. This table is used to create the charts located on our website. An example of this table's entries can be seen below. The important piece is that each entry represents the total number of filtered devices in a given 5-minute interval.
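The filtering step can be sketched in plain Python with hypothetical sample rows (the real filtering happens inside mySQL via qString2; the vendor list below mirrors the one in that query):

```python
from datetime import datetime, timedelta

# Hypothetical sample rows: (vendor, timestamp) pairs mimicking wifiMAC_BD entries
now = datetime(2019, 4, 16, 18, 45)
rows = [
    ('Apple, Inc.',                 now - timedelta(minutes=2)),
    ('Null',                        now - timedelta(minutes=1)),   # unknown vendor: filtered out
    ('Samsung Electronics Co.,Ltd', now - timedelta(minutes=4)),
    ('Apple, Inc.',                 now - timedelta(minutes=30)),  # too old: filtered out
]

VALID_VENDORS = {'Apple, Inc.', 'Google, Inc.', 'Microsoft',
                 'Samsung Electronics Co.,Ltd', 'HUAWEI TECHNOLOGIES CO.,LTD'}

# Count entries from the past 5 minutes whose vendor looks like a student device
count = sum(1 for vendor, ts in rows
            if vendor in VALID_VENDORS and ts > now - timedelta(minutes=5))
print(count)  # 2
```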
<br/>
[[File:HistoricalDataLimitedCreateString.png | 300px | center | Historical Data Limited Create String]]
[[File:historical.png | 300px | center | historicalData_BD_Limited table ]]

<br/>
==Create a Website==

=== Website Design ===
We initially imagined that our website would flow like this:<br>
[[File:WebsiteFlowchart.png|400px|center|Website Design Flowchart]]<br>
Our plan remains to create a website very similar to this, but with direct access to the available location from the home page. Our current website functions as follows:
[[File:WebsiteDesign.png|400px|center|Website Design Flowchart]]<br>
<br>
The website is hosted by AWS on a Lightsail instance running LAMP (which comes preinstalled with apache2, PHP, and MySQL support), which made the setup significantly easier. Each page of the website is a PHP, CSS, HTML, and Javascript hybrid: PHP moves data to and from the database, while HTML, CSS, and Javascript display that information. Specifically, we use Javascript functions created by Google, as well as code from https://canvasjs.com/html5-javascript-bar-chart/, to display the relevant graphs. To make the website more professional, we opted to buy (rent) a domain name through Amazon Route 53. This required us to rework the DNS preferences of the domain, but ultimately the domain connected successfully to our Lightsail instance. Our final website can be found here: http://www.mybusybear.com
=== Creating Charts ===

Our measure of busyness is completely relative; it relies on having a large amount of reliable data. We achieve this by collecting many entries of the number of devices in a given location over a long period of time. From there, we find the maximum and minimum number of devices in the history of the location. This gives us a relative "empty" and "full" against which we can compare the current busyness. Similar thinking can be applied to the historical information (hour-long periods over a given day) to construct relative charts. Doing so actually gives a good estimate of how busy Bear's Den is at a given point in time. Full code can be seen in our BusyBear_BD.php file located on our GitHub.
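As a sketch of the idea (in Python rather than the PHP used on the site, with hypothetical numbers and a function name of our own choosing), the relative busyness is just the current device count scaled between the historical min and max:

```python
def busyness_percent(current, lo, hi):
    """Scale the current device count between the historical min (lo) and max (hi)."""
    if hi == lo:
        return 0.0  # not enough history to be meaningful
    # Clamp so outliers never show "less than empty" or "more than full"
    return max(0.0, min(1.0, (current - lo) / (hi - lo))) * 100

# Hypothetical values: historical min 3, historical max 43, 23 devices right now
print(busyness_percent(23, 3, 43))   # 50.0 -> half full
print(busyness_percent(100, 3, 43))  # 100.0 -> clamped to "full"
```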
<br/>

To display live busyness, we went with a pie chart because it gives an easy-to-read visualization of how free or busy the location is. A slice of a pie gives an instantaneous, intuitive sense of how free the location is, which offers useful insight on whether to go there or not. For historical busyness, we went with a bar graph over hour-long periods. This gives a clear visual trend over time; you can see spikes or periods of high busyness that make logical sense.

<br/>

Below is a brief snippet of the start of the code that gets the maxes and mins, so that you can locate it on our Git. Examples of the charts can also be seen in the Results section.
<source lang = "php">
$sql2 = "SELECT Max(numAddresses) AS max FROM historicalData_BD_Limited";
$sql3 = "SELECT Min(numAddresses) AS min FROM historicalData_BD_Limited";
$sql4 = "SELECT numAddresses AS current FROM historicalData_BD_Limited where timestampe > now() - interval '5' minute";
</source>
=== Mobile Display ===

If you're viewing this after the Spring 2019 semester, we've taken down the website so as not to incur charges. You can always recreate the website using our code on GitHub, but below are a few screenshots of the website taken from a phone to give you a better idea of how it looks.

<br/>
[[File:Website_Home.jpeg |  250px]]
[[File:Website_Live_Chart.jpeg | 250px]]
[[File:Website_Historical_Chart.jpeg |  250px]]
<br/>
  
 
=Results=
Data collection went very well and was generally reflective of busyness. We hit our goals almost exactly, except for extending the project to multiple locations so that people could reliably tell where to go to eat or study. We wanted to capture information for a week in one location at consistent times to construct good historical data and charts for at least one location. Since we were limited to one device, we could not record multiple locations at once. Additionally, since we did not have permission from any university staff and had no feasible way to hide the device while taking pictures, we needed to sit with the device; the risk of the Pi being stolen or something going wrong was too high.
<br/>
<br/>
For privacy reasons we will not show any of the pictures recorded, but we did self-assess the approximate busyness every 30 minutes or so while recording data to ensure accuracy. Taking pictures and passively intercepting packets posed an ethical dilemma, but in the end we felt we were not doing anything that any other device or person could not accomplish. However, most apps are required to tell you what information they collect before you download them, and there are probably some real ethical considerations to be made. For the scope of the project, no user data was used maliciously, and there was essentially no way to correlate a MAC address to a specific person.
<br/>
<br/>
Shown below are examples of live data from 5:10pm and 6:45pm on Tuesday, 4/16/19, and an example of our historical data from 5-11pm. As you can see, at 5:10pm it was not very busy, which reflected the real-life scenario of only a few booths being occupied by people working. Later, at 6:45pm, you can see busyness ramp up a lot, as it is the peak hour for people coming in and out to eat dinner.
<br/>
[[File:LiveFree.jpg|500px|center|thumb| BD at 5:10pm on Tuesday, 04/16/19]]
<br/>
[[File:LiveBusy.jpg|500px|center|thumb| BD at 6:45pm on Tuesday, 04/16/19]]
<br/>
Regarding the historical data, we can see a clear trend surrounding dinner time. BD starts quiet as it opens, peaks from around 6-7pm, and then slowly dies down throughout the night. This might vary day to day, but it is the overall trend we observed. Live data is more reflective of real-time busyness, whereas the historical data reflects the big trends.

[[File:HistoricalData.jpg|500px|center|thumb | Historical Trends based off the week of 04/15/19]]
<br/>

[[File:FinalPosterBusyBear.png|750px|center]]
  
 
=Next Steps=
There are several areas we can improve upon or explore.
# Increasing the capacity in which the camera is used: currently, the pictures captured by the Pi are strictly for reference. The system would be more robust if some form of human detection were implemented and automatically analyzed to refine our busyness level.
# Privacy and security concerns: nothing strictly illegal was done; however, that does not mean there are no worries. While sniffing packets is done regularly by numerous devices, apps, etc., there are still proper ethical considerations around tracking and monitoring people. Additionally, pictures being taken and stored could potentially capture damaging information. All of this, paired with the fact that we did not encrypt any information, is certainly a front we would strengthen.
# Make a more compact body and design for the finished product. Right now the case and network adapter are very clunky to work with. We settled on the first network adapter that worked, but in reality we could probably downgrade the network adapter and Raspberry Pi to be more cost-effective and smaller in overall size.
# Expand our system so that it works at multiple locations. This means more devices, more mySQL tables, and more website tabs. In a world where cost isn't a huge issue, the beauty of our design is that we can replicate the same process in every location we want, the only limitation being where/how we can stow away the device itself.
## A minor change to our database structure would probably be ideal. Instead of having a separate table of ALL MAC addresses for each location, we could have one big table with an added locationID column. This would allow us to query just the one main table for each location's data analysis.
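Following the qString convention used in uploader.py, the single-table idea might look like this (the table name, column name, and query strings here are hypothetical, not part of our current schema):

```python
# Hypothetical revised schema: one shared table with a locationID column,
# mirroring the qString convention from uploader.py (names are illustrative)
qStringAll = 'INSERT INTO wifiMAC_All (macAdd, vendor, locationID) VALUES (%s, %s, %s)'

# Per-location count query: same 5-minute filter idea as qString2,
# restricted to a single location via its locationID
qCountPerLoc = ("SELECT count(*) FROM wifiMAC_All "
                "WHERE locationID = %s AND timestampe > now() - interval '5' minute")

print(qStringAll.count('%s'))  # 3 placeholders: address, vendor, location
```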
 
=References=
===Not Quoted===
Past Projects
* [[https://www.crc.id.au/tracking-people-via-wifi-even-when-not-connected/ Tracking People/MAC]]
* [[https://www.raspberrypi.org/forums/viewtopic.php?t=47059 Pi Wifi/Bluetooth]]
* [[https://zone13.io/post/wifi-monitoring-using-raspberry-pi/ Pi WiFi Monitor]]

Pi Blinking LED (for tutorial's sake)
* [[https://raspberrypihq.com/making-a-led-blink-using-the-raspberry-pi-and-python/ Blinking LED]]

nmap (unused in the end)
* [[https://highon.coffee/blog/nmap-cheat-sheet/ Commands]]

fping (unused in the end)
* [[http://www.linuxscrew.com/2007/09/17/ping-range-of-ip-addresses-in-parallel-with-fping/ Example]]

openCV (unused in the end)
* [[https://www.learnopencv.com/install-opencv-4-on-raspberry-pi/ Installation tutorial]]

kismet & monitoring mode (referenced in our tutorial)
* [[https://medium.com/@aallan/adding-a-second-wireless-adaptor-to-a-raspberry-pi-for-network-monitoring-c37d7db7a9bd kismet & enabling monitor mode]]

Regex/Dictionary/API
* [[https://www.w3schools.com/python/python_dictionaries.asp Python Dictionary]] (unused in the end)
* [[https://snakify.org/en/lessons/dictionaries_dicts/ More Python Dictionary Documentation]] (unused in the end)
* [[https://www.wireshark.org/tools/oui-lookup.html Wireshark OUI lookup]] (API used instead)

===Quoted===
<references />
 
[[Category:Projects]]
[[Category:Spring 2019 Projects]]

Latest revision as of 11:44, 2 May 2019

Project Proposal

Overview

It always seems like an impossible task to find an open table to work at or a quick line for food across the WashU campus. BusyBear's goal is to create a database, accessible to WashU students, that shows the population and busyness trends of popular locations on campus, beginning with Bear's Den. By using a network adapter connected to the Raspberry Pi, we will obtain an approximate measurement of busyness based on the number of MAC addresses found in a specific region. By looking at pictures taken simultaneously with the MAC address collection, a historic trend between the number of found MAC addresses and relative busyness can be determined. We hope to store this information in a database hosted by AWS and display the data on a website. Our end goal is to gather information that allows the WashU community to make more educated decisions about where to go and when to go there.

Team Members

Thomas Emerson
Tom Goon
Allison Todd
David Tekien, TA
Jim Feher, Instructor

Links

[Project Log]
[Project Presentation]
[GitHub Repository]
[Network Adapter Monitoring Mode Tutorial]
[Poster]

Objectives

  • Learn and be able to code in Python as related to the Pi
  • Use sniffing/MAC tracking method in the analysis of busyness
  • Investigate the use of the camera in the analysis of busyness
  • Be able to monitor busyness fairly accurately by filtering detected devices
  • Compare busyness at different times of day and between buildings
  • Design a GUI for an aesthetically pleasing and useful website
  • Host a website displaying useful and relevant data through Amazon Web Services (AWS)

Challenges

  • Limited experience with working with WiFi receivers or anything to do with MAC Addresses
  • Limited knowledge of Python and Raspberry Pi
  • Connecting our data with a database, AWS, and a website
  • Privacy Concerns

Gantt Chart

GanttChart 1.png

Budget

{|
! Item !! Description !! Cost !! Link
|-
| AWS || Website Hosting || $5 / month || https://aws.amazon.com/pricing/?nc2=h_ql_pr
|-
| 2 x TL-WN722N || Network Adapter || returned: $7.21 || https://www.amazon.com/TP-Link-TL-WN722N-Wireless-network-Adapter/dp/B002SZEOLG
|-
| 1 x 5dBi Long Range WiFi for Raspberry Pi || Network Adapter || returned: $5.00 || https://www.amazon.com/5dBi-Long-Range-WiFi-Raspberry/dp/B00YI0AIRS/ref=lp_9026119011_1_1?srs=9026119011&ie=UTF8&qid=1550447401&sr=8-1
|-
| 1 x Alfa AWUSO36NH High Gain USB Wireless G/N Long-Range WiFi Network Adapter || Network Adapter || $31.99 || https://www.amazon.com/Alfa-AWUSO36NH-Wireless-Long-Rang-Network/dp/B0035APGP6/ref=sr_1_1_sspa?keywords=alfa+network+adapter&qid=1553045771&s=gateway&sr=8-1-spons&psc=1
|-
| mybusybear.com || Domain Name || $12.00 || [[DomainPrice.jpg]]
|-
| Total Cost ||  || $71.20 ||
|}

Design and Solutions

Basic Idea
http://www.mybusybear.com

Build the Device

We began by constructing a device to collect MAC addresses. Initially, we hoped that with the RaspberryPi's WiFi capabilities, we could simply use the base hardware for detection. We quickly determined that the RaspberryPi was not capable of entering a monitoring mode[1]; we would need external hardware to serve this purpose. We went through a variety of external network adapters, and ultimately found one with both monitoring mode capabilities and compatibility with the RaspberryPi [2]. Using the Network Adapter's functionality will be explored further in the Collect Information section and in the Network Adapter in Monitoring Mode tutorial[3].
We decided that a RaspberryPi camera should be added to the device to strengthen the validity of the data gathered from the network adapter. The Pi Camera is fairly simple to connect and the functionality is implemented through Pi commands[4]. By analyzing a combination of the number of addresses collected and the visual busyness found in the picture, more accurate trends over time can be determined.
This is our final project:
Device.jpeg

Collect Information

In the before mentioned tutorial [Network Adapter Monitoring Mode Tutorial] we established how setup the network adapter in monitoring mode and install kismet, the software we used to utilize monitoring mode. Once properly configured, simply calling kismet spews out text into the console as so:

INFO: 802.11 Wi-Fi device 00:A7:42:FC:6E:03 advertising SSID
      'wustl-guest-2.0'
INFO: Detected new 802.11 Wi-Fi access point 00:A7:42:FC:6E:00
INFO: 802.11 Wi-Fi device 00:A7:42:FC:6E:00 advertising SSID 'eduroam'
INFO: Detected new 802.11 Wi-Fi device 74:B5:87:C6:90:1E
INFO: Detected new 802.11 Wi-Fi device 00:08:E3:FF:FD:EC

We could collect data, but we needed a way to be able to consolidate the MAC Addresses, find out what device it belonged to, and upload that information to the database all periodically throughout a gap of time. Over a couple of months, we finalized our data collection design to utilize crontab [5] , a program used the schedule execution of programs at certain times. Crontab was also utilized to setup emailing the Pi's IP address on boot, as detailed in a class tutorial. [6] Crontab was used to schedule two tasks: Running kismet and dumping the output into a text file, and running a script to parse through the text file and upload the necessary information.

The crontab usage can be seen below. The first program runs kismet through the timeout modifier [7] such that it only runs for four minutes. All the contents of the output is written to the the kismetlog.txt

# m h  dom mon dow   command
*/5 *  *   *   *    /usr/bin/timeout 240 /usr/local/bin/kismet > kismetlog.txt
*/5 *  *   *   *    ./busybear2

The second task to run is an executable file named "busybear2" [8] whose contents is shown below.

sleep 242s
python3 uploader.py

This bash script's only purpose is to wait 4 minutes and 2 seconds (essentially waiting for the kismet task to terminate) before executing the uploader script. The contents of "uploader.py" can be seen below.

# Regular Expression, only gets MAC Addresses after it sees "device"
MAC_regex = re.compile(r"(?<=\bdevice\b\s)\w\w[:]\w\w[:]\w\w[:]\w\w[:]\w\w[:]\w\w")

# Loop through the lines of the file to find MAC Addresse
for line in testFile:
  MAC_addresses = MAC_regex.findall(line) # Compile all found mac addresses in var MAC_addresses
  for address in MAC_addresses: # Loop through the individual MAC Addresses
    req = requests.get(MAC_URL%address)
    obj = req.json()
    for key, value in obj.items():
      if('company' in value):
        values = (address,value['company'])
      else:
      	values = (address,'Null')
      cursor.execute(qString,values)

The uploader script uses regular expressions [9] [10] [11] in order to isolate the desired MAC addresses from the text file constructed in the first crontab task. From there, it utilizes a MAC Address Vendor Lookup API [12] in order to attach a vendor name to a MAC Address. From there, the MAC Address and its associated vendor is uploaded to our mySQL database [13] whereupon it is automatically assigned a timestamp and unique ID. More on how our database was created and structure in the next section. NOTE: Not all code is shown, full code can be accessed on our GitHub.

Managing a Database

RDS database through AWS named BusyBear. Which we can access through MySQL workbench. From the database, we created multiple tables to story current MAC addresses and historical information. It was important for us that the table storing MAC addresses auto populate timestamps with the current time. As far as externally connecting the database to both the Pi and the website, we found that using a combination of python and PHP was the most effective. It is important to note, however, to keep the database secure we had to escape all incoming data to prevent SQL injection attacks. As of now, our database isn't in 3rd normal form however this is a goal we have to store and access data most efficiently. One hurdle was attempting to store pictures in the database (as LONGBLOB) but we discovered that because a) the complexity caused instability, and b) picture's large size made queries take an exceptional amount of time. It was best to just upload images directly to the website and bypass the database.

Databases and Structures Used

Contained within our uploader.py program (see GitHub for full code), we upload ALL MAC addresses to a single table. From there, we sort this information into further tables. The basic idea can be seen in the strings used to access/alter the mySQL tables below:

qString = 'INSERT INTO wifiMAC_BD (macAdd, vendor) VALUES (%s, %s)'

The first qString demonstrates the first step: we upload every MAC address, regardless of what it is, into a table named wifiMAC_BD. An example of the entries can be seen below: the MAC addresses, the timestamps at which they were inserted, and the vendor associated with each address. Null simply means our API did not have a registered vendor for that MAC address.


[Image: WifiMacCreateString — CREATE statement for the table]
[Image: wifiMAC_BD table entries]


qString2 = "SELECT count(*) FROM wifiMAC_BD WHERE (timestampe > now() - interval '5' minute) AND (vendor = 'Apple, Inc.' OR vendor = 'Google, Inc.' OR vendor = 'Microsoft' OR vendor = 'Samsung Electronics Co.,Ltd' OR vendor = 'HUAWEI TECHNOLOGIES CO.,LTD');"
qString3 = "INSERT INTO historicalData_BD_Limited (numAddresses,helper)  VALUES (%s,%s)"

Next, using qString2, we query the table and record only the number of entries from the past 5 minutes (our code runs every 5 minutes) that have a valid vendor. Valid vendors are those that likely correspond to devices used by students, such as Apple, Samsung, and Microsoft. This filters out noise such as routers, printers, and other random devices present at any given time, which gives consistency to our design. qString3 then inserts this count into a separate table called historicalData_BD_Limited. This table is used to create the charts located on our website. An example of this table's entries can be seen below; the important piece is that each entry represents the total number of filtered devices in a given 5-minute interval.
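The vendor filter in qString2 can be expressed as a simple membership check. The sketch below mirrors the vendor names from the query; the function name and sample data are our illustration:

```python
# Vendors we treat as "probably a student's personal device",
# copied from the WHERE clause of qString2.
STUDENT_VENDORS = {
    'Apple, Inc.', 'Google, Inc.', 'Microsoft',
    'Samsung Electronics Co.,Ltd', 'HUAWEI TECHNOLOGIES CO.,LTD',
}

def count_devices(entries):
    """entries: (macAdd, vendor) pairs seen in the last 5 minutes."""
    return sum(1 for _, vendor in entries if vendor in STUDENT_VENDORS)

sample = [
    ('B8:27:EB:00:00:01', 'Apple, Inc.'),
    ('B8:27:EB:00:00:02', 'Null'),           # unknown vendor: filtered out
    ('B8:27:EB:00:00:03', 'Cisco Systems'),  # infrastructure: filtered out
    ('B8:27:EB:00:00:04', 'Microsoft'),
]
print(count_devices(sample))  # 2
```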


[Image: Historical Data Limited Create String — CREATE statement for the table]
[Image: historicalData_BD_Limited table entries]



Create a Website

Website Design

We initially imagined that our website would flow like this:

[Image: initial website design flowchart]


At the moment our plan is to create a website very similar to this, but with direct access to the available location from the home page. Our current website functions as follows:

[Image: current website design flowchart]



The website is hosted on AWS using a Lightsail instance running a LAMP stack (which comes preinstalled with Apache2, PHP, and MySQL support), which made setup significantly easier. Each page of the website is a hybrid of PHP, HTML, CSS, and JavaScript: PHP moves data to and from the database, while HTML, CSS, and JavaScript display that information. Specifically, we use JavaScript charting functions from Google as well as code from https://canvasjs.com/html5-javascript-bar-chart/ to display the relevant graphs. To make the website more professional, we opted to buy (rent) a domain name through Amazon Route 53. This required us to rework the domain's DNS preferences, but we ultimately succeeded in connecting it to our Lightsail instance. Our final website can be found here: http://www.mybusybear.com

Creating Charts

Our measure of busyness is completely relative; that is, it relies on having a large amount of reliable data. We get this by recording the number of devices in a given location over a long period of time. From there, we can find the maximum and minimum number of devices in the history of the location. This gives us a relative "empty" and "full" against which we can compare the current busyness. Similar thinking applied to the historical information (hour-long periods over a given day) lets us construct relative charts. Doing so gives a good estimate of how busy Bear's Den is at a given point in time. Full code can be seen in our BusyBear_BD.php file located on our GitHub.


To display live busyness, we went with a pie chart because it gives a clear, at-a-glance visualization of how free or busy the location is; a slice of a pie conveys an instantaneous, intuitive sense of how free the location is, which is useful when deciding whether to go. For historical busyness, we went with a bar graph over hour-long intervals. This shows the trend over time very clearly: you can see spikes and periods of high busyness that make logical sense.


Below is a brief snippet of the start of the code that gets the maxes and mins, so that you can locate it on our GitHub. Examples of the charts can also be seen in the Results tab.

$sql2 = "SELECT Max(numAddresses) AS max FROM historicalData_BD_Limited";
$sql3 = "SELECT Min(numAddresses) AS min FROM historicalData_BD_Limited";
$sql4 = "SELECT numAddresses AS current FROM historicalData_BD_Limited where timestampe > now() - interval '5' minute";
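From those three queries, the live busyness fraction follows directly: the current count scaled between the historical minimum and maximum. The sketch below is our Python illustration of that arithmetic (the site itself does this in PHP); the function name and the clamping to [0, 1] are our assumptions, not code from the repo:

```python
def busyness_percent(current, hist_min, hist_max):
    """Relative busyness: current count scaled between historical min/max."""
    if hist_max == hist_min:          # no spread in the data yet
        return 0.0                    # avoid divide-by-zero
    frac = (current - hist_min) / (hist_max - hist_min)
    return round(max(0.0, min(1.0, frac)) * 100, 1)

# e.g. 30 devices now, with a historical range of 5 to 55 devices:
print(busyness_percent(current=30, hist_min=5, hist_max=55))  # 50.0
```

This percentage is exactly the "slice" the live pie chart draws.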

Mobile Display

If you're viewing this after the Spring 2019 semester, we've taken down the website so as not to incur charges. You can always recreate the website using our code on GitHub, but below are a few screenshots of the website taken from a phone to give you a better idea of how it looks.


[Images: Website Home.jpeg, Website Live Chart.jpeg, Website Historical Chart.jpeg]

Results

Data collection went very well and was generally reflective of busyness. We hit our goals almost exactly, except for extending the project to multiple locations so that people could reliably tell where to go to eat or study. We wanted to capture information in one location at consistent times for a week, to construct good historical data and charts for at least one location. Since we were limited to one device, we could not record multiple locations at once. Additionally, since we did not have permission from any university staff and had no feasible way to hide the device while it took pictures, we needed to sit with the device; the risk of the Pi being stolen or something going wrong was too high.

For privacy reasons we will not show any of the pictures recorded, but we did self-assess the approximate busyness every 30 minutes or so while recording data to ensure accuracy. The pictures, along with passively intercepting packets, posed an ethical dilemma, but in the end we felt we were not doing anything that any other device or person could not already accomplish. That said, most apps are required to tell you what information they collect before you download them, and there are real ethical considerations to be made. Within the scope of the project, no user data was used maliciously, and there was essentially no way to correlate a MAC address to a specific person.

Shown below are examples of live data from 5:10pm and 6:45pm on Tuesday, 4/16/19, and an example of our historical data from 5-11pm. As you can see, at 5:10pm it was not very busy, which reflected the real-life scene: only a few booths were occupied with people working. By 6:45pm you can see busyness ramp up considerably, as it is the peak hour for people coming in and out to eat dinner.

BD at 5:10pm on Tuesday, 04/16/19


BD at 6:45pm on Tuesday, 04/16/19


Regarding the historical data, we can see a clear trend surrounding dinner time. BD starts quiet as it opens, peaks from around 6-7pm, then slowly dies down throughout the night. This may vary day to day, but it is the overall trend we observed. Live data is more reflective of real-time busyness, whereas the historical data reflects the big trends.

Historical Trends based off the week of 04/15/19


[Image: FinalPosterBusyBear.png — final project poster]

Next Steps

There are several areas where we can improve upon or explore.

  1. Increasing the capacity in which the camera is used: currently, the pictures captured by the Pi are strictly for reference. The system would be more robust if some form of human detection were implemented and pictures were automatically analyzed to refine our busyness level.
  2. Privacy and security concerns: nothing strictly illegal was done; however, that does not mean there are no worries. While sniffing packets is done regularly by numerous devices and apps, there remains a concern about proper ethical consideration in tracking and monitoring people. Additionally, the pictures being taken and stored could potentially capture damaging information. All of this, paired with the fact that we did not encrypt any information, is certainly an area we would strengthen.
  3. Make a more compact body and design for the finished product. Right now the case and network adapter are very clunky to work with. We settled on the first network adapter that worked, but in reality we could probably downgrade both the network adapter and the Raspberry Pi to be more cost-effective and smaller overall.
  4. Expand our system so that it works at multiple locations. This means more devices, more mySQL tables, and more website tabs. In a world where cost is not a huge issue, the beauty of our design is that we simply replicate the same process at every location we want, the only limitation being where/how we can stow away the device itself.
    1. A minor change to our database structure would probably be ideal: instead of a separate table of ALL MAC addresses for each location, we could have one big table with an added locationID column. This would let us query a single main table for each location's data analysis.
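The single-table redesign from the last point can be sketched in a few lines. Here sqlite3 stands in for MySQL so the example is self-contained, and the column and location names ('BD', 'DUC') are our hypothetical extrapolations from the existing wifiMAC_BD schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# One table for every location, with a locationID column,
# instead of one wifiMAC table per location.
conn.execute("""CREATE TABLE wifiMAC_all (
    macAdd TEXT, vendor TEXT, locationID TEXT)""")

rows = [
    ('B8:27:EB:00:00:01', 'Apple, Inc.', 'BD'),
    ('B8:27:EB:00:00:02', 'Microsoft',   'BD'),
    ('B8:27:EB:00:00:03', 'Apple, Inc.', 'DUC'),
]
conn.executemany(
    "INSERT INTO wifiMAC_all (macAdd, vendor, locationID) VALUES (?, ?, ?)",
    rows)

# Per-location analysis becomes one parameterized query instead of one table:
count = conn.execute(
    "SELECT count(*) FROM wifiMAC_all WHERE locationID = ?", ('BD',)
).fetchone()[0]
print(count)  # 2
```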

References

Not Quoted

Past Projects

Pi Blinking LED (for tutorial's sake)

nmap (unused in the end)

fping (unused in the end)

openCV (unused in the end)

kismet & monitoring mode (referenced in our tutorial)

Regex/Dictionary/API

Quoted

  1. Pi Network - [1]
  2. Network Adapter - [2]
  3. Network Adapter in Monitoring Mode Tutorial - [3]
  4. Pi Camera - [4]
  5. How to use Crontab - [5]
  6. SSHing into your Pi Tutorial - [6]
  7. Using timeout with crontab - [7]
  8. How to make a file executable - [8]
  9. CSE 330 Wiki: Regular Expressions - [9]
  10. Regular expressions look-ahead/behind - [10]
  11. Online Regular Expression tester - [11]
  12. MAC Address Vendor Lookup API - [12]
  13. Uploading data to a mySQL database - [13]