Project:

I came across a project called The Infinite Drum Machine, made by Google Creative Lab. I thought it was really cool because instead of using typical drum sounds, it lets you explore and play with thousands of everyday sounds (things like a door creaking or a glass breaking) and turn them into beats. The creators used a machine learning algorithm called t-SNE (t-distributed Stochastic Neighbor Embedding). I learned that t-SNE is a way of taking really complex data, in this case audio features like tone and rhythm, and mapping it into a simple 2D space. That’s why, when you look at the project, you see all the sounds laid out on a colorful map, with similar sounds grouped close together. The data behind it was basically a huge library of audio clips, including contributions from the London Philharmonia. I imagine the algorithm took those sounds, broke them down into their audio features, and then clustered them in a way that feels natural to our ears. I think the creators chose t-SNE because it makes relationships between sounds more intuitive and visual. It doesn’t require someone to manually label what “type” of sound something is; it just learns the similarities automatically. That makes the project feel more playful and exploratory, which is exactly the point: you get to wander around this sonic landscape and stumble on sounds you wouldn’t normally think to use.
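From what I read afterwards (this is just my summary of the standard t-SNE recipe, not anything specific about Google's implementation), the algorithm turns distances between the high-dimensional audio feature vectors $x_i$ into probabilities that say how likely two sounds are to be "neighbors," does the same for the 2D map positions $y_i$, and then nudges the map positions around until the two sets of probabilities match as closely as possible:

$$p_{j|i} = \frac{\exp\!\big(-\lVert x_i - x_j\rVert^2 / 2\sigma_i^2\big)}{\sum_{k \neq i} \exp\!\big(-\lVert x_i - x_k\rVert^2 / 2\sigma_i^2\big)}, \qquad q_{ij} = \frac{\big(1 + \lVert y_i - y_j\rVert^2\big)^{-1}}{\sum_{k \neq l} \big(1 + \lVert y_k - y_l\rVert^2\big)^{-1}}$$

$$C = \mathrm{KL}(P \,\Vert\, Q) = \sum_{i \neq j} p_{ij} \log\frac{p_{ij}}{q_{ij}}, \qquad \text{where } p_{ij} = \frac{p_{j|i} + p_{i|j}}{2N}$$

Minimizing $C$ is what pulls similar sounds close together on the map and pushes different ones apart, which is exactly the clustering effect you see when you zoom around the project.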


Documentation for the HandPose Assignment:

For this assignment, I used HandPose with p5.js to create an interactive sketch where my hand controls both visuals and sound. I started by referencing the example code we had for particles following the index finger. That gave me a base structure to detect hand landmarks and spawn visuals. From there, I experimented by adding new interactions, like mapping the wrist’s x-position to color, using pinch detection (thumb + index) to trigger a quick synth pop, and making the openness of my hand change the particle size and reverb effect.
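Here is a rough sketch of how I structured it, assuming the newer ml5.js handPose API (keypoint names like "wrist" and "index_finger_tip" come from the underlying MediaPipe model and might differ if you're on an older ml5 version). This isn't the exact class example or my final code, just the skeleton of the idea:

```js
// Skeleton of the sketch: video + handPose detection, wrist x-position -> hue,
// index fingertip -> new particles. Names and numbers here are placeholders.
let handPose, video;
let hands = [];
let particles = [];

function preload() {
  handPose = ml5.handPose(); // load the hand-tracking model
}

function setup() {
  createCanvas(640, 480);
  colorMode(HSB);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  handPose.detectStart(video, gotHands); // keep detecting every frame
}

function gotHands(results) {
  hands = results; // array of detected hands, each with a keypoints list
}

function draw() {
  image(video, 0, 0, width, height);

  if (hands.length > 0) {
    const keypoints = hands[0].keypoints;
    const wrist = keypoints.find(k => k.name === "wrist");
    const indexTip = keypoints.find(k => k.name === "index_finger_tip");

    // wrist x-position across the canvas controls the hue of new particles
    const hue = map(wrist.x, 0, width, 0, 360);

    // spawn a particle at the index fingertip each frame
    particles.push({ x: indexTip.x, y: indexTip.y, vx: 0, vy: 0, hue, life: 60 });
  }

  // move, draw, and age the particles
  for (let i = particles.length - 1; i >= 0; i--) {
    const p = particles[i];
    p.x += p.vx;
    p.y += p.vy;
    noStroke();
    fill(p.hue, 80, 100, p.life / 60);
    circle(p.x, p.y, 12);
    p.life--;
    if (p.life <= 0) particles.splice(i, 1);
  }
}
```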

One of the main challenges I faced was just getting HandPose to work smoothly with the video input. At first, I wasn’t sure if my results array was coming through correctly, and I had to log the hand keypoints to the console a few times to figure out which points matched the thumb, index, or wrist. Another challenge was connecting sound in a way that didn’t feel overwhelming: at first my synth would play continuously, which was kind of chaotic. I had to learn how to use an envelope so the sound only triggered when I pinched my fingers together.
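The pattern that eventually worked for me looks roughly like this (assuming p5.sound's Oscillator and Envelope, which is called p5.Env in older versions; the 30-pixel pinch threshold is just a number I'd tune by hand):

```js
// Pinch-triggered "pop": the oscillator runs silently, and the envelope
// only opens its amplitude for an instant on the frame a pinch starts.
let osc, env;
let wasPinched = false;

function setupSound() {
  osc = new p5.Oscillator('sine');
  osc.amp(0);   // silent until the envelope plays it
  osc.start();
  env = new p5.Envelope();
  env.setADSR(0.01, 0.1, 0, 0.1); // very short attack/decay = quick pop
  env.setRange(0.5, 0);           // peak level, release level
}

function checkPinch(keypoints) {
  const thumb = keypoints.find(k => k.name === "thumb_tip");
  const indexTip = keypoints.find(k => k.name === "index_finger_tip");
  const pinched = dist(thumb.x, thumb.y, indexTip.x, indexTip.y) < 30;

  // only fire when the pinch begins, otherwise it retriggers every frame
  if (pinched && !wasPinched) {
    env.play(osc);
  }
  wasPinched = pinched;
}
```

In practice setupSound() needs to run after some user interaction (p5 has userStartAudio() for this) so the browser allows the audio context to start, and checkPinch() gets called from draw() whenever a hand is detected.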

I also had some trouble balancing the visuals and the sound so that they felt connected. For example, the pinch gesture now not only plays a sound but also makes a burst of particles (sketched below), which gives a stronger feeling of “cause and effect.” Looking back, I’m glad I leaned on the previous particle system code as a reference point, because it gave me a solid foundation to build more surprising interactions on top of. Overall, this process helped me understand how to combine HandPose output with both visual mapping and sound design, and I felt like I was able to push it into a creative space rather than just repeating the demo.
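The burst itself is simple: it just pushes a handful of particles with random velocities into the same particles array from the earlier sketch, from the same spot and the same frame as env.play(), so the sound and the visual clearly come from one event. The count and speeds here are arbitrary choices:

```js
// Spawn a burst of particles at the pinch point.
function burst(x, y, hue) {
  for (let i = 0; i < 20; i++) {
    const angle = random(TWO_PI);
    const speed = random(1, 4);
    particles.push({
      x, y,
      vx: cos(angle) * speed,
      vy: sin(angle) * speed,
      hue,
      life: 60
    });
  }
}
// called right next to env.play(osc) inside the pinch check,
// passing in the current hue from the wrist mapping:
//   burst(indexTip.x, indexTip.y, hue);
```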

https://editor.p5js.org/Anna_Tang/sketches/NQBqfgyNo