Date: October 08, 2023
Click here for the MAX patch and related files
My approach to understanding Jitter was to find parallels with how I previously learned to use the OpenGL library. Cracking the relationship between pixel manipulation and the geometry used for graphics was crucial before I could ideate my final product for this challenge. My plan here is to:
I started to play around with the various attributes of the jit.world and jit.movie objects. First, I created an instance named mySpace in jit.world; without declaring this object, my movie refused to play. Next, I set up the jit.movie object to read my base and effects videos with the volume turned off, since I am going to try adding an external audio file later.
I also added jit.grab to later capture video feed from my camera and add it to the sequence of my movies. I just need to figure out how to time the sequence of videos first.
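To keep my mental model straight, here is a rough Python/OpenCV sketch (not Max) of what jit.movie and jit.grab hand me: a stream of frames as matrices that I can process before drawing them into one named window, much like jit.world. The filename and the window name are placeholders.

```python
# Rough OpenCV analogue of jit.movie + jit.grab feeding one output window.
import cv2

movie = cv2.VideoCapture("base.mov")   # stands in for jit.movie; filename is hypothetical
camera = cv2.VideoCapture(0)           # stands in for jit.grab on the default webcam

while True:
    ok_movie, movie_frame = movie.read()
    ok_cam, cam_frame = camera.read()   # cam_frame gets blended in later (see the chromakey sketch)
    if not (ok_movie and ok_cam):
        break
    cv2.imshow("mySpace", movie_frame)  # one named window, like the mySpace context
    if cv2.waitKey(30) & 0xFF == 27:    # Esc quits; OpenCV ignores the audio track anyway
        break

movie.release()
camera.release()
cv2.destroyAllWindows()
```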
With my first two videos, I am using jit.op to switch between different per-pixel operations, such as addition, multiplication, and average, to create different compositing effects for my movie.
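As a sanity check on what those operators do to the pixels, here is a small NumPy sketch of the per-pixel math. This is my own approximation, not jit.op itself; in particular, the scaled multiply is a choice I made so the result stays visible.

```python
import numpy as np

def combine(a, b, op):
    """Per-pixel combination of two equal-sized uint8 frames, roughly what jit.op does."""
    a, b = a.astype(np.int32), b.astype(np.int32)
    if op == "+":
        out = a + b                # additive blend, pushes bright areas toward white
    elif op == "*":
        out = (a * b) // 255       # scaled multiply so values stay in 0..255
    elif op == "avg":
        out = (a + b) // 2         # simple average of the two frames
    else:
        raise ValueError(f"unknown op: {op}")
    return np.clip(out, 0, 255).astype(np.uint8)
```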
I wanted to create a second video that takes the live feed from jit.grab, captures it as a matrix, and manipulates it. I wanted a slightly pixelated look for the live video, so I reduced the matrix dimensions to 100×100. Next, I wanted an effect of a person “glitching into the matrix world”, and the chromakey effect was the best option to play around with. I also added a suckah object so I can click around the first image to pick the color that determines where the second video fills in. Finally, I added a few presets to see which settings would give me the effect I desired for my final video.
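Both effects boil down to simple matrix operations. Here is a hedged NumPy sketch of the idea; it is my approximation, not the jit.chromakey algorithm, which exposes its own tolerance and fade attributes, and the 40-unit tolerance here is arbitrary.

```python
import numpy as np

def pixelate(frame, size=100):
    """Crude pixelation: sample the frame on a coarse grid, like dumping it into a 100x100 matrix."""
    h, w = frame.shape[:2]
    return frame[::max(1, h // size), ::max(1, w // size)]

def chromakey(fg, bg, key_rgb, tol=40.0):
    """Where fg is within tol of the key color (picked with suckah), show bg instead.

    fg and bg must be the same shape; key_rgb is an (r, g, b) triple.
    """
    dist = np.linalg.norm(fg[..., :3].astype(float) - np.asarray(key_rgb, dtype=float), axis=-1)
    mask = dist < tol
    out = fg.copy()
    out[mask] = bg[mask]
    return out
```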
The story for my movie is slowly coming together. It keeps changing as my knowledge of Jitter grows, but I am having fun with the different scenes I am building. I want to place the second video, with jit.grab and chromakey, at the beginning of my story, so I needed a transition video to glue it onto the first one.
I wanted to generate a matrix from some math functions, and with help from a few online tutorials I generated a wave function in a multislider object and visualized it through a matrix. Since my matrix dimensions are 16×16, I set an Uzi object to trigger 256 bangs along with an index. Finally, I used the jit.fill object to send the list into channel 2 of my matrix, named “meme”, so that the pixels appear green and match my matrix world.
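Conceptually, that patch just fills the green plane of a small matrix with 256 samples of a wave. A NumPy sketch of the same idea follows; the wave shape itself is arbitrary here, since in the patch it comes from the multislider.

```python
import numpy as np

SIZE = 16
idx = np.arange(SIZE * SIZE)                          # 256 indices, like the Uzi's 256 bangs
wave = 0.5 + 0.5 * np.sin(2 * np.pi * idx / 32.0)     # a wave scaled into 0..1 (shape is arbitrary)

meme = np.zeros((SIZE, SIZE, 4), dtype=np.uint8)      # 4-plane char matrix, ARGB like Jitter's
meme[..., 2] = (wave * 255).astype(np.uint8).reshape(SIZE, SIZE)  # plane 2 is green, so it reads green
```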
I threw in a second video mix to end my movie, made in a similar way to the first mix but with a crossfade (xfade) controlled by a slider. I also used the jit.scalebias object to scale and offset the pixel values so that the mix blends well with the previous video.
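For reference, the math behind both objects is small. This NumPy sketch uses my own parameter conventions (a 0..1 crossfade amount, and a bias where 1.0 means full white); the real objects expose these as attributes, so treat the exact scaling here as an assumption.

```python
import numpy as np

def crossfade(a, b, x):
    """Blend two frames: x = 0 shows a, x = 1 shows b (the slider drives x)."""
    mix = (1.0 - x) * a.astype(float) + x * b.astype(float)
    return np.clip(mix, 0, 255).astype(np.uint8)

def scale_bias(frame, scale=1.0, bias=0.0):
    """Multiply each channel by scale, then add an offset, clamped to 0..255."""
    return np.clip(frame.astype(float) * scale + bias * 255.0, 0, 255).astype(np.uint8)
```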
I wanted to control from the keyboard the sequence in which the videos play. The numeric keys 1, 2, 3, and 4 are matched by their ASCII codes (49 through 52) with a select object. Using the toggles I can also turn videos off so they do not overwhelm MAX memory.
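The routing is just a lookup from key codes to clips, which is what the select object does in the patch. Here is a sketch of the same logic; the clip names are hypothetical.

```python
# ASCII codes sent for the number row; [select 49 50 51 52] picks them apart in the patch.
KEY_TO_CLIP = {49: "grab_intro", 50: "wave_transition", 51: "mix_one", 52: "mix_two"}

def on_key(ascii_code, playing):
    """Toggle the clip bound to this key so unused movies stop decoding and eating memory."""
    clip = KEY_TO_CLIP.get(ascii_code)
    if clip is not None:
        playing[clip] = not playing[clip]
    return playing
```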
Finally, I created a mix of four videos that tells the story of a person getting sucked into the matrix and through a portal that ejects them into the cosmos. I set a preset for the numeric entries to simplify the process, but playing the sequence still requires user input to trigger the videos together. I also created a jit.window object named mySpace so the whole movie is projected in one window.
I wanted to incorporate audio with the videos and build a visualizer after tying it all together. However, when all the videos play, there is significant lag and the transitions between videos are not smooth. I could not finish debugging all the issues. Final points that need work: