My biggest takeaway from this project was learning to be okay with a super open-ended end result. A lot of our group's brainstorming sessions were just throwing around broad terms like "… and then it'll distort the video feed… the visuals respond to the audio." I'm sure each of us had a different image in our head at the time of what that would actually look like. The final result usually wasn't at all what I had pictured, due to a mix of not being able to make the computer do what we wanted and attempting to do everything remotely inside the confined space of Zoom video feeds.
At first, I was confused, since I simultaneously understood the assignment and had no idea how to actually make this "music video" work. I'm just so accustomed to music vids being plain, pre-recorded video, not something influenced live. The idea just seemed wayyy too broad. After going through the previous semester's performances a few times, I started getting a better handle on what the definition of the music video should be. The biggest realization was that not 100% of it had to be live. Some elements could be turned on and left to do their thing, and they'd be mixed with other aspects that were controlled in real time.
This project also helped hammer home the point that Max isn't the project, it's the tool. I pretty much didn't do anything inside of Max besides help brainstorm some effects. Everything I did was done in p5 and through OBS.