November 6, 2023
(This post was updated on November 21, 2023. Scroll down for the updates added after the midterm critique.)
Click here to access all the patch and media files
My progress up until now has been documented in the following posts sequentially:
TLDR: This is how I envision my project.
So far my biggest issues have been:
With all these issues, I realized I had to stop chasing perfection in my interactions and instead split the viewer’s focus so that they don’t distinctly notice the glitches and still have an experience that conveys the message of the piece. So I worked on the second part of the project, which deals with the side-screen projection. Here a second camera tracks and captures the viewer’s progress from the side and shows different variations of the reality they are in.
Jit.glue really came in handy for this part. Right now I have three scenes in one window that change according to a person’s motion, tracked using the hot zones example we learned in class. I am still working through some bugs to complete the interaction and responses.
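The hot-zones logic can be sketched outside of Max. This is a hypothetical Python/NumPy illustration of the idea, not the actual patch: split each frame into vertical strips, measure frame-to-frame pixel change per strip, and let the strip with the most motion select the scene. The zone count and threshold here are made-up values.

```python
import numpy as np

# Illustrative constants, not taken from the patch
N_ZONES, THRESH = 3, 10.0

def active_zone(prev, curr):
    """Return the index of the zone with the most motion, or None if quiet."""
    diff = np.abs(curr.astype(float) - prev.astype(float))
    strips = np.array_split(diff, N_ZONES, axis=1)   # split the frame width-wise
    energy = [s.mean() for s in strips]              # mean pixel change per strip
    best = int(np.argmax(energy))
    return best if energy[best] > THRESH else None

# Toy grayscale frames: motion only in the rightmost third
prev = np.zeros((120, 160), dtype=np.uint8)
curr = prev.copy()
curr[:, 110:] = 200
print(active_zone(prev, curr))  # → 2
```

In the patch the equivalent decision routes which of the three scenes in the jit.glue window responds.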
While fixing the many bugs that kept cropping up, I had to improvise and deviate a little from my original plans. The 3 different effects I chose were either not showing up on their panels or were affecting the others. This may be due to my limited understanding of jit.glue and matrix manipulation. For example, I had to figure out pretty early on that all the video matrices feeding the same jit.glue object must have matching planes. So the final 3 effects are the following:
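The matching-planes constraint has a close analogue outside of Max. This NumPy sketch (an illustration, not Jitter code) shows why tiling matrices side by side fails when the inputs have different plane (channel) counts, and how converting everything to the same plane count first makes the composite valid:

```python
import numpy as np

gray  = np.zeros((120, 160, 1), dtype=np.uint8)   # 1-plane matrix
color = np.zeros((120, 160, 4), dtype=np.uint8)   # 4-plane ARGB-style matrix

try:
    np.concatenate([gray, color], axis=1)         # mismatched planes: fails
except ValueError as e:
    print("cannot tile:", e)

# Expanding the 1-plane matrix to 4 planes first makes the tiling work
gray4 = np.repeat(gray, 4, axis=2)
tiled = np.concatenate([gray4, color], axis=1)
print(tiled.shape)  # → (120, 320, 4)
```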
So as the viewer walks to the front of the screen showing the black hole interaction, the side view captures the viewer’s journey and their physical being breaking down, defying classical physics as they near and are eventually absorbed into the digital void.
When I initially set this up in the media commons space, I used a projector. However, adjusting it to the screen and making sure all the effects were captured without blurring was an issue. So I later set it up on a TV monitor and the result was much better. I could also more easily position the camera to capture the space the viewer walks past.
Switching to face tracking instead of blob tracking yielded a much better result. I had to use a different model provided within the openCV package to fine-tune the detection so that it remains stable. I subtracted the top coordinate from the bottom one to get the height of the box surrounding the face. For now, this has been calibrated to only one face detected within the frame. The change in height is scaled to a range of 0 to 1.0 and segmented into 3 sections to trigger the response based on the viewer’s distance from the screen. I finalized the calculation once I had set up the projectors and screens to scale.
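The distance mapping described above can be sketched in a few lines. This is a hypothetical Python illustration of the same idea, not the patch itself: the face box’s height is scaled to 0–1 and split into 3 zones. The MIN_H/MAX_H calibration values and the zone_from_face_height name are made up for the example.

```python
# Made-up calibration: face-box height in pixels at the far and near positions
MIN_H, MAX_H = 40.0, 300.0

def zone_from_face_height(h):
    """Map a face bounding-box height to a trigger zone: 0=far, 1=mid, 2=near."""
    t = (h - MIN_H) / (MAX_H - MIN_H)
    t = min(max(t, 0.0), 1.0)          # clamp to the 0..1 range
    return min(int(t * 3), 2)          # segment the range into 3 sections

# Example: heights from the top/bottom y-coordinates of a detected box
for top_y, bottom_y in [(100, 150), (80, 240), (20, 310)]:
    h = bottom_y - top_y               # box height, as described in the post
    print(zone_from_face_height(h))    # → 0, then 1, then 2
```

In the real setup the zone index would be recalibrated once the camera and screens are in their final positions, since the pixel heights depend on the camera placement.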
The following video is from when I set up the project again at the media commons after the midterm presentation. I used the large black screen in the 224 ballroom space and found it much easier to set up the projection. The only issue was: where do I set up the webcam? I taped the camera in the middle, and it seemed to blend in with the projection. A better way would be to cut a hole in the middle of the screen to hide the camera, but I would need a sturdy mount for it.
Next, I marked the positions on the floor where the 3 different effects are triggered. In the initial setup, I had 3 laminated cards on the floor indicating where viewers should step to see the changes on the front screen. For this setup I marked those spots with black tape.
Now comes testing. I set up both screens and adjusted camera angles and calculations to ensure the experience on both sides was working properly.
Finally, I added some ambient music that had no interactive response of its own; it was mainly to keep viewers fully immersed in the experience within the space. I took a remix of the Interstellar soundtrack and had it play when the interaction started.
Now comes the fun part. Even though we weren’t able to get videos on the day, I set this up again later (that’s when I got the videos). Still, it was amazing to see the whole thing come to life as I imagined it (with a few adjustments) and to watch the whole class interact with the piece. These were some of the observations I picked up from my classmates:
Some of the questions I got:
Suggestions and discussion: