Playing with sound, video and images

This week for Computational Media, we learned how to manipulate pixels and sound so that we can play with images, video, and sound files.

I really enjoyed playing with p5.FFT (part of the p5.sound library), so I decided to build some visualizers that use video as the visual input for their designs.  It took me a little while to understand the sound data, specifically what the output of FFT.spectrum meant. Once I understood the data, I just had to create visual effects and feed the data in as parameters for those effects.
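To give a sense of the general approach (not my actual sketches, which also pull pixels from video), here is a minimal bars-style example assuming the current p5.sound API, where fft.analyze() returns an array of amplitude values (0–255), one per frequency bin. The file name "song.mp3" is just a placeholder.

```javascript
// Minimal bars-style visualizer sketch using p5.sound's FFT.
// Assumes p5.js and p5.sound are loaded, and "song.mp3" (placeholder) sits next to the sketch.
let song;
let fft;

function preload() {
  song = loadSound('song.mp3');
}

function setup() {
  createCanvas(640, 360);
  fft = new p5.FFT(); // defaults: 0.8 smoothing, 1024 bins
  song.loop();
}

function draw() {
  background(0);

  // analyze() returns an array of amplitude values (0-255),
  // one per frequency bin, ordered from low to high frequencies.
  const spectrum = fft.analyze();

  noStroke();
  fill(255);
  const barWidth = width / spectrum.length;
  for (let i = 0; i < spectrum.length; i++) {
    // Map each bin's amplitude to a bar height.
    const h = map(spectrum[i], 0, 255, 0, height);
    rect(i * barWidth, height - h, barWidth, h);
  }
}
```

Each visualizer is basically this same loop with a different drawing routine wired to the spectrum values.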

In order, the visualizers are Particles, Bars, Lines, SlitScan, and Stars.

The Particles visualizer is inspired by a Processing sketch I found on OpenProcessing. The Bars and SlitScan visualizers are adapted from example code that we went over in class. Lines and Stars are evolutions of that code.

Thanks to Allison Parrish for helping me figure out the movement of the Stars visualizer. Props to Dominic Barrett for figuring out drag-and-drop file loading with p5.DOM. And thanks to Kanye for making dope music. I'm not using "Waves" with permission, only for the demo, and I'll take it down if need be.

You can play with it here.