Laser Harp


Links

Weekly log: Laser_Harp_Weekly_Log

How-to tutorial: Concatenating_wave_files

GitHub: GitHub

Presentation slide show: File:Laser Harp.pdf

Poster:

Overview

Although instruments have come a long way from their origins, they still have room to grow. Inspired by the transition from acoustic to electronic instruments, the laser harp strives to introduce a new way of experiencing music: playing musical scales by touching rays of light. Since the harp will offer different keys and scales, the system's programming must be carried out properly so the user can operate the instrument easily. Moreover, for the project to be successful, the sensors must be properly installed and must respond reliably whenever a laser beam is interrupted.

Team Members

  • Taylor Howard
  • Jennifer Fleites
  • Yoojin Kim
  • TA: Chance Bayles
  • Instructor: Jim Feher

Objectives

  • Learn how to use the Raspberry Pi and Python.
  • Build a circuit connecting the laser diodes, photo-resistors, and LEDs to the Raspberry Pi.
  • Build a frame for the harp through woodworking.
  • Determine which notes or sounds are feasible based on execution of the code.
  • Create code to run the Raspberry Pi (this includes code for determining when a note is played, turning on LEDs and playing sounds as notes are played, increasing the volume as a note is held, playing back a composition that the player wishes to record, and uploading that composition to an AWS server).

Challenges

Software

  • Writing and understanding code that is executed in the different cases that occur when a laser is tripped

Hardware

  • Reliably aligning lasers to photo-resistors
  • Cord management

Budget

Supplied

  • Buttons
  • Wood
  • Digital-to-analog converter
  • Wires
  • Raspberry Pi

Purchased

Tax: $3.65

TOTAL Purchased: $18.45

Gantt Chart

LHGantt2.png

Proposal Presentation


Design and Solutions

Programming

Determining when a laser beam breaks

The amount of light that each photo-resistor senses is converted to a digital reading by an analog-to-digital converter. That reading is then converted to a voltage, and by checking whether the voltage lies within a certain range, the code determines whether the laser beam aimed at that photo-resistor has been interrupted. If a beam has been broken, the sound associated with that beam is played.
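A minimal sketch of this check, assuming the photo-resistors are read through an MCP3008 ADC with the gpiozero library (the channel count, wiring, and threshold voltage below are illustrative assumptions, not the project's actual values):

from gpiozero import MCP3008
import time

BEAM_THRESHOLD_V = 1.5   # hypothetical cutoff; with the laser hitting the sensor the voltage sits well above this

# one ADC channel per laser/photo-resistor pair
sensors = [MCP3008(channel=i) for i in range(3)]

def beam_broken(sensor):
    # When a hand blocks the laser, less light reaches the photo-resistor and the
    # divider voltage drops out of its normal range (the comparison flips if the
    # divider is wired the other way around).
    return sensor.voltage < BEAM_THRESHOLD_V

while True:
    for index, sensor in enumerate(sensors):
        if beam_broken(sensor):
            print("Beam", index, "triggered")   # the note for this beam would be played here
    time.sleep(0.01)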

Playing different notes concurrently

pygame.mixer.Channel() and pygame.mixer.Sound were implemented within the code so that multiple notes could be played at once. Using this method, each sound is sent to its own channel, where it can be played independently without interfering with the others.
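As a minimal sketch of the idea (the wave file names are placeholders), two sounds can be started on separate channels and overlap freely:

import pygame

pygame.mixer.init()

note_c = pygame.mixer.Sound("c4.wav")   # placeholder file names
note_e = pygame.mixer.Sound("e4.wav")

# Each note gets its own channel so it can start and stop independently.
channel_0 = pygame.mixer.Channel(0)
channel_1 = pygame.mixer.Channel(1)

channel_0.play(note_c)
channel_1.play(note_e)   # starts immediately, without cutting off note_c

while channel_0.get_busy() or channel_1.get_busy():
    pygame.time.wait(10)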

Recording the composition

The composition is created by recording the audio output. When the user presses the record button, the system captures whatever is sent to the headphones, so each note or collection of notes is added to the recording as it is played. When the user ends the recording, the finished file is uploaded to an AWS server, where it can be accessed from another device.
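The wiki does not name the recording tool, so the sketch below only illustrates the start/stop idea, assuming ALSA's arecord is available and that a loopback capture device exposes the audio being sent to the headphones (the device name and file name are placeholders):

import subprocess

recorder = None

def start_recording(path="composition.wav"):
    global recorder
    # -f cd records 16-bit, 44.1 kHz stereo; "hw:Loopback,1" is a hypothetical capture device
    recorder = subprocess.Popen(["arecord", "-D", "hw:Loopback,1", "-f", "cd", path])

def stop_recording():
    global recorder
    if recorder is not None:
        recorder.terminate()   # stopping arecord closes and finalizes the WAV file
        recorder.wait()
        recorder = None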

Uploading the composition to an AWS server

Using AWS Lightsail, an inexpensive and straightforward hosting service, we first created a local host to experiment with uploading files to different folders. Once we had code that could ask the server to find a file and upload it to a different folder, we transferred the code to the online host by connecting to the IP address associated with our AWS account.
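A minimal sketch of the upload step, assuming the Lightsail host exposes a simple HTTP upload endpoint (the IP address, path, and form field name are placeholders, not the project's actual values):

import requests

SERVER_URL = "http://203.0.113.10/upload"   # hypothetical address of the Lightsail instance

def upload_composition(path="composition.wav"):
    with open(path, "rb") as f:
        response = requests.post(SERVER_URL, files={"file": f})
    response.raise_for_status()   # raise an error if the server rejected the upload
    return response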


Building

The case was sketched in SolidWorks, where the lengths and angles needed to cut the plywood for the final design were worked out. After the wood was cut, the parts were drilled together and the wiring was installed. In wiring the lasers and photo-resistors to the Raspberry Pi, the lasers were connected to the power supply provided by the Pi so that they turn on, and the photo-resistors were connected to the analog-to-digital converter so that the amount of light hitting them can be read and used by the code programmed into the Pi.

Results

How Results Compare to Original Objectives

For the most part, all objectives were met; the case was sketched and created, the wireframe was properly executed, and the code was able to handle multiple notes being played at once. The objectives that were not met include: altering the volume as the user holds a note, a switch that allows the user to decide which scale they wish to play with, and LEDs that blink when a note is played.

Limitations that affected the result

Throughout the project, the main factor that delayed progress was figuring out how to play notes concurrently. Due to this delay, the recording of the composition and the altering of volume became harder to implement.

Playing Notes Concurrently

This part of the project took the most iterations. In getting multiple notes to play at once, pydub, pygame.audio, pygame.mixer.music, multiprocessing, multithreading, and swmixer were all attempted before settling on pygame.mixer.Channel() and pygame.mixer.Sound. Pydub and pygame.audio were somewhat successful in playing notes individually, but the run time needed to execute the lines of code responsible for playing each note introduced a lag in the sound produced, making these options less viable. To fix the lag, pygame.mixer.music was applied; however, even though the lag was resolved, only one note could be played at a time. pygame.mixer.Sound was found to support multiple simultaneous sounds better, so it was used in place of pygame.mixer.music when finalizing the code.

Working towards the goal of playing notes concurrently, multiprocessing and multithreading were attempted. Multiprocessing was tried first, since it would make sense for the sounds to play completely independently of each other. However, when executing the code, it was observed that multiprocessing would not produce the sounds being commanded because the sounds were overwriting each other. Multithreading was then used in lieu of multiprocessing, which allowed notes to be played individually. To get notes to play concurrently, swmixer was used within the functions. Although notes were then able to play concurrently, the code was still not functioning correctly: notes continued playing after the player removed their finger, and the sound that was produced was not clean.

Code using multiprocessing
Code using multithreading and swmixer

After looking at the pygame library in more depth, pygame.mixer.Channel() was discovered. Implementing pygame.mixer.Channel() allowed multiple sounds to be played at once by sending each sound to a new channel instead of forcing them to play in the same one, as was the case with multithreading. Implementing pygame.mixer.Channel() and pygame.mixer.Sound within if statements corresponding to each case where a beam is triggered allowed for the proper execution of the code.

Code using channels
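A minimal sketch of how the per-beam if statements and channels can fit together (not the project's actual code; the sensor objects, threshold, and wave files are placeholders):

import pygame

pygame.mixer.init()

notes = [pygame.mixer.Sound(name) for name in ("c4.wav", "d4.wav", "e4.wav")]
channels = [pygame.mixer.Channel(i) for i in range(len(notes))]

def update(sensors, threshold=1.5):
    # One case per beam: when a beam is broken and its channel is idle, start its note.
    for i, sensor in enumerate(sensors):
        if sensor.voltage < threshold and not channels[i].get_busy():
            channels[i].play(notes[i])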

Recording the Composition

To keep track of the notes being played, the composition was initially recorded by concatenating wave files as they were played. However, as different methods were implemented so that multiple notes could be played concurrently, concatenation became more complex. Therefore, after pygame.mixer.Channel() was implemented, the composition was created another way: by recording the audio output. The system records what is being sent to the headphones, and when the user ends the recording the composition is sent to an AWS server, where the user can access it from the device of their choice.

Next Steps

  • Having a switch that would allow the user to choose between different scales.
  • Applying a change of volume as the user holds a note.
  • Having LEDs that light up when a note is played.
  • Having multiple buttons where recordings could be saved and replayed as they are touched.

References

https://www.instructables.com/id/Arduino-Laser-Harp-1/

https://www.instructables.com/id/Quick-Arduino-MIDI-Laser-Harp/

https://hackaday.io/project/28159-laser-harp-cnc-pi-zero

https://projects.raspberrypi.org/en/projects/gpio-music-box

Libraries

http://pydub.com

https://www.pygame.org/news
