CMU 60-223 (Fall 2016) & 48-390 (Spring 2017)
Musical Turn Wheel
This project was completed in the Fall (2016) for Introduction to Physical Computing. I partnered with an architecture major to create a musical turntable, using an Arduino UNO, that children got to test out at the Pittsburgh Children's Museum. The theme of this project was 'Making Things Magic.' The goal was to use physical computing to evoke wonder and delight in children. We tried to demonstrate a correlation between visual notation and aural structure with notched dials. The project is a wheel with 3 notched dials, which produce 3 different tones. Each individual dial, along with the whole wheel, can be adjusted to make music with different rhythms and tempos.
Implementation - The objective was simplicity, with an emphasis on rotary motion producing sound. To achieve this, we created a single turn wheel with three dials. Each dial emits a different tone, triggered by photoresistors sensing light through the notches. The dials can be rotated individually, and as their orientation relative to each other changes, the musical structure changes. The wheel was pitched so that it is approachable to children of different heights, and it faces outward so that children across the room can notice it.
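The trigger logic amounts to a threshold test per dial. A minimal Python sketch of the idea only (the actual build ran on the Arduino UNO; the threshold value and the three tone frequencies here are illustrative assumptions):

```python
# Illustrative values -- the real sketch ran on an Arduino with its own
# calibrated threshold and tones.
LIGHT_THRESHOLD = 600            # analog reading above which a notch is "open"
DIAL_TONES_HZ = [262, 330, 392]  # one tone per dial (C4, E4, G4 as an example)

def tones_to_play(sensor_readings):
    """Return the frequencies whose dial notch currently passes light
    onto its photoresistor."""
    return [tone for reading, tone in zip(sensor_readings, DIAL_TONES_HZ)
            if reading > LIGHT_THRESHOLD]
```

As the wheel spins, each photoresistor is sampled every loop, so the notch spacing on a dial directly becomes the rhythm of its tone.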
Outcomes - We thought the interface was successful in inciting a sense of wonder in children, to varying degrees. Some children would be engrossed in the dials and their positioning for more than 5 minutes, while others were interested in the overall rotational motion, testing the spinning to its limits. Overall, they understood that there is a correlation between the noise emitted and the rotational movement. What could have been developed further were the notch distribution and the possibility for users to vary the notches themselves, so that the wheel could become a true instrument.
Object Tracing Robot
A project completed in the Spring (2017) for Physical Computing Studio. I partnered with a cognitive science major to create a robot that traces objects on top of a whiteboard table, using an Arduino UNO. The objective of the assignment was to make a wheeled robot that completes a challenge on a whiteboard table; our personal goal was to trace objects placed on the table. We tried to achieve this using an Arduino UNO, a standard push button switch, and various motors.
Implementation - The main idea behind our design was that the robot would go straight until the push button switch hit an object. Once the button was pushed, the left wheel would slow down as the right wheel sped up, and the servo motor would move the whiteboard marker to be perpendicular to the board, allowing it to trace. Speeding up the right wheel made the robot continually curve left, keeping it against the object's edge as it traced. I designed the encasing for the hardware using Rhinoceros 3D and laser cut it from acrylic.
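The intended behavior is a simple two-state controller: drive straight with the marker lifted, or curve left with the marker down. A Python sketch of that logic (the real robot ran Arduino code; the speed values and servo angles are illustrative assumptions):

```python
def wheel_speeds(button_pressed, base_speed=150, delta=40):
    """Differential drive: go straight until the bumper switch closes,
    then slow the left wheel and speed up the right to curve left
    around the object."""
    if not button_pressed:
        return base_speed, base_speed              # (left, right): straight
    return base_speed - delta, base_speed + delta  # curve left while tracing

def marker_angle(button_pressed):
    """Servo angle: lift the marker while roaming, drop it perpendicular
    to the board (90 degrees) while tracing."""
    return 90 if button_pressed else 0
```

With this split, the main loop only needs to read the switch and feed the results to the motor driver and servo each iteration.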
Outcomes - In the end, the robot did not successfully trace any objects. It continually went at one speed in one direction, and the push button switch did not work. While completing the project, my partner and I had communication issues and split up the work too abruptly: I mainly handled the physical design and assembly of the robot, while my partner handled the hardware and code. We also did not portion our time well, which ultimately led to our downfall. Personally, I learned that communication is very important. When working with a partner, I should be more open about offering my help and dividing the work more evenly.
Rain Notifying Umbrella
A project completed in the Spring (2017) for Physical Computing Studio. I created an umbrella handle that lights up when it is raining, using a Raspberry Pi. The theme of this project was to turn an ordinary device into something connected to the Internet of Things (IoT); my personal goal was an umbrella that lights up when it is going to rain. The project consists of multiple strings of LEDs connected to a Raspberry Pi, encased in an acrylic umbrella handle.
Implementation - I started the process by using the Weather Underground API to obtain the current precipitation in inches. I cut the handle off a normal umbrella and modeled a similar handle in Rhinoceros, then laser cut it and stacked and glued the sheets of acrylic together to create a clear umbrella handle. I coded the Raspberry Pi to constantly fetch weather updates and light up to three sets of LEDs depending on the amount of precipitation falling. If all three sets of LEDs are lit, it is raining more than 1 inch; if two sets are lit, between 0.5 inches and 1 inch; if one set is lit, below 0.5 inches; if no LEDs are lit, it is not raining.
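The precipitation-to-LED mapping above reduces to a small threshold function. A Python sketch of that step (how the exact 0.5 in and 1 in boundary cases were handled is an assumption):

```python
def led_sets_lit(precip_inches):
    """Map the current precipitation reading (in inches) to how many of
    the three LED sets in the handle should light."""
    if precip_inches > 1.0:
        return 3          # heavy rain: all three sets
    if precip_inches > 0.5:
        return 2          # between 0.5 and 1 inch
    if precip_inches > 0.0:
        return 1          # light rain, below 0.5 inches
    return 0              # not raining
```

On the Pi, this result would then drive the GPIO pins for each LED set after every weather poll.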
Outcomes - In the end, the umbrella handle worked properly. The only real failure was that it was not raining on the day of my critique, so I had to force the demonstration. I think I could have made it better and cleaner, perhaps using a different material for the physical handle. Due to the placement of the Raspberry Pi and the hardware, I could not physically close the umbrella. Still, I think it was successful in completing the base objective.
3D Tone Generator
A project completed in the Spring (2017) for Physical Computing Studio. I created an augmented virtual world that plays a specific tone when a specific box is selected by the user, using an Arduino UNO. The goal of the project was to create a custom Google Cardboard that fits our personal phone, a Node.js server able to execute actuation through an Arduino or Raspberry Pi, and a Three.js virtual reality world that could communicate with the server. An Arduino and speaker were used to create sound when a box was highlighted in the virtual world.
Implementation - I learned how to actuate a response from the 3-D world to the outside world by using Johnny-Five. I created my own server using Node.js and used Three.js to code my virtual world. I modeled the cube matrix using Rhinoceros, exporting and importing over 300 cubes into the virtual world. Each cube was then coded to correspond to a certain tone played through the Arduino. I laser cut the Google Cardboard and modeled a box to hold the speaker and Arduino.
Outcomes - I think the project was an overall success. I would have liked to write an algorithm that generated the matrix of cubes and mapped each to a sound instead of hardcoding them, but I ran out of time. I also think I could have made the virtual world more interactive: after around 3 minutes, users got bored of just selecting different cubes. To take the project further, varied sounds could be incorporated rather than monotonous 1-second tones.
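The generation step wished for above could look something like this. A hypothetical Python sketch, where the grid size, spacing, and frequency ramp are all assumed values (a 7×7×7 grid gives 343 cubes, in the spirit of the "over 300" used; the actual project hardcoded each cube in Three.js):

```python
def cube_grid(n=7, spacing=2.0, base_hz=220.0, step_hz=10.0):
    """Generate an n*n*n grid of cube positions, each paired with a tone.
    Frequencies rise with the cube's flattened index, so neighbouring
    cubes sound similar."""
    cubes = []
    for i in range(n):
        for j in range(n):
            for k in range(n):
                index = (i * n + j) * n + k
                cubes.append({
                    "position": (i * spacing, j * spacing, k * spacing),
                    "tone_hz": base_hz + index * step_hz,
                })
    return cubes
```

Each entry could then be turned into a Three.js mesh on the server side, with the tone sent to the Arduino when that cube is selected.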
A project completed in the Spring (2017) for Physical Computing Studio. I created a study of sound in relation to motion, using an Arduino UNO. The objective of this project was to create a physical and interactive object or experience. I've always been interested in actuating sound through interaction, which is why I chose this type of study; I tend to gravitate towards sound, which is why many of my projects incorporate speakers in one way or another.
Implementation - I utilized an Arduino UNO, two sonar sensors, and two speakers. Each sonar sensor drives sound on one speaker. I hardcoded a range of distances to correlate to a range of frequencies. I designed the physical encasing as a table-top object due to the size of the speakers and sensors; the speakers are elevated so that they sit somewhat at eye level, and the user's hands interact with the sonar sensors. I designed the encasing using Rhinoceros 3D and laser cut it from acrylic, designing the pieces to lock into place but securing the assembly with acrylic glue. Using this setup, I mainly studied the interaction of salt, water, and oobleck with the sound.
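The hardcoded distance-to-frequency mapping can be sketched as a band lookup. A Python sketch of the idea (the actual build ran on the Arduino, and these band boundaries and tones are illustrative assumptions):

```python
# Hypothetical (max_distance_cm, tone_hz) bands, nearest band first,
# so a closer hand produces a higher pitch.
BANDS = [(10, 880), (20, 660), (40, 440), (60, 330)]

def frequency_for(distance_cm):
    """Return the tone for one sonar reading, or None when the hand is
    out of range (that speaker stays silent)."""
    for max_cm, hz in BANDS:
        if distance_cm <= max_cm:
            return hz
    return None
```

With one such lookup per sonar sensor, each hand independently controls the pitch coming from its speaker.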
Outcomes - The main outcome of this project was examining how different liquids and solids reacted to sound waves. I wanted to focus less on a finished, complete product and more on the interaction of sound with different materials, and to examine how interactivity comes into play when interaction leads to visible movement. Overall, I thought the project was successful, and I did learn how different liquids and solids interact with different frequencies and pitches. I believe I could have produced a more complete end product, but I chose to focus on the interaction.