Video Playback System

The assignment for this week was to begin building a video playback system. My goals were to get more comfortable with the objects and processes we had gone over in class, but also to experiment and see what else was possible.

Most of the issues I had came from still not being totally sure how some objects work, such as jit.chromakey and jit.xfade, so understanding why the output acts the way it does is a bit difficult. I find the visual programming quite intuitive and fun, though. The major issue I had was that, even using jit.window instead of jit.pwindow, the framerate dropped significantly and there was a lot of lag whenever I loaded videos into the program. This is why I used live webcam footage for my documentation.

The patch that I created is broken up into four parts.

The first part of the patch is responsible for unpacking the video's values and sending the RGB planes out to the second portion. Once they return from part two, they are repacked and sent into a gswitch for toggling between color and black & white. The second portion of the patch manipulates the zoom level and anchor point of the first live video before sending it back; for this portion, Matt Romein's sample patches from week 2 were used. Part three uses chromakey and rota to manipulate a second video feed. Finally, the fourth part uses xfade to blend the two videos together however the user would like.

[Screenshot of the full patch: Screen Shot 2019-02-14 at 8.18.22 AM.png]

Part 1:

[Screenshot: Screen Shot 2019-02-14 at 8.26.58 AM.png]
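Outside of Max, the unpack → B&W → repack → gswitch step of part 1 could be sketched in Python with NumPy. This is just a hypothetical translation of the idea (the function names and the Rec. 601 luma weights are my own choices, not anything in the patch):

```python
import numpy as np

def to_grayscale(frame):
    """Mix the unpacked R, G, B planes into one luma value (Rec. 601 weights)."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    gray = 0.299 * r + 0.587 * g + 0.114 * b
    # Repack the single luma plane into three identical channels,
    # the way jit.pack rebuilds a full matrix after processing.
    return np.stack([gray, gray, gray], axis=-1)

def color_or_bw(frame, bw):
    """Toggle between the color and B&W paths, like a gswitch."""
    return to_grayscale(frame) if bw else frame
```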

Part 2:

[Screenshot: Screen Shot 2019-02-14 at 8.25.54 AM.png]
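The zoom-and-anchor manipulation in part 2 amounts to resampling the frame around a chosen point. A rough nearest-neighbour sketch of that idea (my own simplification, not how Jitter implements it internally):

```python
import numpy as np

def zoom_about_anchor(frame, zoom, anchor):
    """Zoom a H x W x 3 frame about an (anchor_y, anchor_x) point.

    For each output pixel, look up the source pixel at 1/zoom the
    distance from the anchor (nearest-neighbour sampling, clamped
    to the frame edges)."""
    h, w = frame.shape[:2]
    ay, ax = anchor
    ys = np.clip(((np.arange(h) - ay) / zoom + ay).astype(int), 0, h - 1)
    xs = np.clip(((np.arange(w) - ax) / zoom + ax).astype(int), 0, w - 1)
    return frame[np.ix_(ys, xs)]
```

With zoom = 1.0 the frame passes through unchanged; larger values magnify the region around the anchor.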

Part 3:

[Screenshot: Screen Shot 2019-02-14 at 8.27.05 AM.png]
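The core of part 3, keying one feed over another plus a rotation, can be sketched like this. The tolerance-based mask is a loose stand-in for what jit.chromakey's tol setting does, and I'm only doing 90-degree rotations here rather than rota's arbitrary angles:

```python
import numpy as np

def chromakey(fg, bg, key_color, tol):
    """Replace foreground pixels within tol of key_color with the background."""
    dist = np.linalg.norm(fg.astype(float) - np.array(key_color, dtype=float), axis=-1)
    mask = dist <= tol        # True where the pixel is "close enough" to the key
    out = fg.copy()
    out[mask] = bg[mask]
    return out

def rotate90(frame, times=1):
    """Rotate the frame in 90-degree steps (a crude stand-in for rota)."""
    return np.rot90(frame, k=times)
```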

Part 4:
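The fade in part 4 is just a weighted mix of the two feeds, which is essentially what xfade's crossfade value controls. A minimal sketch of that blend:

```python
import numpy as np

def xfade(a, b, x):
    """Linear crossfade between two frames: x=0 is all a, x=1 is all b."""
    blended = (1.0 - x) * a.astype(float) + x * b.astype(float)
    return blended.astype(a.dtype)
```

Sweeping x from 0 to 1 fades smoothly from the first video to the second, like dragging the crossfade control in the patch.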