Midterm Prototype

Approaching the Digital Black Hole

November 6, 2023

(This post was updated on November 21, 2023. Scroll down for updates from after the midterm critique.)

Click here to access all the patch and media files

My progress up until now has been documented in the following posts sequentially:

TLDR: This is how I envision my project.

So far my biggest issues have been:

  1. Blob detection. I was using the increasing size of the largest blob to track the viewer coming forward. However, the moment I set up the camera in the Media Commons space, it stopped working the way it had when I prototyped on my laptop. I tried changing the light settings and even considered setting everything up in front of a green screen. But blob detection was still glitchy over a big space, as it kept picking up more blobs the more the viewer moved. So I switched to face tracking so that OpenCV only responds to a human, and I now track the size of the bounding box around the face to control my interactions (a rough sketch of the abandoned blob logic follows this list). It's not very stable at the moment, but it is proving much more effective than blob detection.
  2. The space. When I work on my laptop with its integrated webcam, the motion is small enough to track and manage. But the moment the camera expands the view, I have to redo all my calculations, so no change I make is simple.
  3. Window sizing. Making my jit.window fullscreen throws off the ratios, and I have to redo all my calculations.
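For context, the blob logic I abandoned boils down to something like the Python/OpenCV sketch below (my patch did the equivalent inside Max/Jitter; the thresholds here are illustrative, not my calibrated values):

```python
import cv2

# Rough sketch of the abandoned blob approach: background-subtract,
# threshold, then watch the area of the largest contour grow as the
# viewer comes forward. In a large space, lighting kept splitting the
# viewer into several blobs, which made this value jump around.
backsub = cv2.createBackgroundSubtractorMOG2()
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = backsub.apply(frame)
    # Drop shadow pixels (marked 127) so only confident foreground remains
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        biggest = max(contours, key=cv2.contourArea)
        print(f"largest blob area: {cv2.contourArea(biggest):.0f}")
    cv2.imshow("blobs", mask)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```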

With all these issues, I realized I had to stop chasing perfection in my interactions and instead split the viewer's focus so that they don't distinctly notice the glitches and still have an experience that conveys the message of the piece. So I worked on the second part of the project, which deals with the side-screen projection. Here, a second camera tracks and captures planes of the viewer's progress from the side and shows the different variations of reality they are in.

Splitting screens: Side view interaction

Jit.glue really came in handy for this part. Right now I have three scenes in one window that change according to a person's motion, tracked using the hot-zones example we learned in class (a rough sketch of that logic is below). I am still working on some bugs to complete the interaction and responses.
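The hot-zones logic amounts to splitting the camera frame into regions and watching for motion in each. Here is a minimal Python/OpenCV sketch of the same idea (my patch does this in Max; the zone boundaries and thresholds are illustrative):

```python
import cv2

# Minimal sketch of the hot-zones idea: split the side camera's frame
# into three vertical strips and fire a trigger when enough pixels
# change inside one of them between frames.
cap = cv2.VideoCapture(0)
prev = None
PIXEL_THRESHOLD = 25   # per-pixel difference that counts as motion
TRIGGER_RATIO = 0.02   # fraction of a strip that must be "hot"

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    if prev is not None:
        diff = cv2.absdiff(gray, prev)
        _, motion = cv2.threshold(diff, PIXEL_THRESHOLD, 255,
                                  cv2.THRESH_BINARY)
        strip_w = motion.shape[1] // 3
        for zone in range(3):
            strip = motion[:, zone * strip_w:(zone + 1) * strip_w]
            if cv2.countNonZero(strip) > TRIGGER_RATIO * strip.size:
                print(f"zone {zone} triggered")  # swap the scene here
    prev = gray
    cv2.imshow("side camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```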

Final Side view interaction

While fixing the many bugs that kept cropping up, I had to improvise and deviate a little from my original plans. The 3 different effects I chose were either not showing up on their panels or affecting the others. This may be due to my limited understanding of jit.glue and matrix manipulation. For example, I had to figure out pretty early on that all the video matrices feeding the same jit.glue object must have matching planes. The final 3 effects are the following (a rough sketch of how they glue together follows the list):

  1. Glitchy live stream mixed with a video of a crowd, to demonstrate the viewer existing on a different plane/dimension than others
  2. Upside-down live stream, to show how classical physics no longer holds
  3. Live stream mixed with a cube matrix video, to show the viewer extending beyond the 3D physical space
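For anyone curious how the gluing constraint plays out, here is a rough Python/numpy analogue of my jit.glue setup. The file names crowd.mp4 and cube.mp4 are stand-ins for my actual clips, and the blend amounts are illustrative:

```python
import cv2
import numpy as np

# Three side-screen panels glued into one frame, a rough numpy analogue
# of jit.glue. Just like jit.glue's matching-planes rule, np.hstack only
# works when every panel shares the same height, dtype, and channel count.
live = cv2.VideoCapture(0)
crowd = cv2.VideoCapture("crowd.mp4")
cube = cv2.VideoCapture("cube.mp4")
SIZE = (320, 240)  # width, height shared by all three panels

def next_frame(cap):
    ok, frame = cap.read()
    if not ok:  # loop video files; fall back to black if nothing reads
        cap.set(cv2.CAP_PROP_POS_FRAMES, 0)
        ok, frame = cap.read()
    if not ok:
        return np.zeros((SIZE[1], SIZE[0], 3), np.uint8)
    return cv2.resize(frame, SIZE)

while True:
    cam = next_frame(live)
    panel1 = cv2.addWeighted(cam, 0.5, next_frame(crowd), 0.5, 0)  # crowd mix
    panel2 = cv2.flip(cam, 0)                                      # upside down
    panel3 = cv2.addWeighted(cam, 0.5, next_frame(cube), 0.5, 0)   # cube mix
    cv2.imshow("side screens", np.hstack([panel1, panel2, panel3]))
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

live.release()
crowd.release()
cube.release()
cv2.destroyAllWindows()
```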


So as the viewer walks toward the front screen showing the black hole interaction, the side view captures their journey: their physical being breaking down and defying classical physics as they near, and are eventually absorbed into, the digital void.

When I initially set this up in the Media Commons space, I used a projector. However, adjusting it to the screen and making sure all the effects were captured without blurring was an issue. So I later set this up on a TV monitor, and the result was much better. I could also more easily position the camera to capture the space the viewer walks past.

Final Front view interaction

Switching to face tracking instead of blob detection yielded a much better result. I had to use a different model provided within the OpenCV package to fine-tune the detection so that it remains stable. I subtract the top coordinate of the bounding box around the face from the bottom one to get its height. For now, this has been calibrated for only one face detected within the frame. The change in height is scaled to a range of 0 to 1.0 and segmented into 3 sections to trigger the responses to the viewer's distance from the screen. I had to finalize the calculation once I had set up the projectors and screens to scale.
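Outside of Max, the same logic looks roughly like the sketch below. The Haar cascade model and the MIN_H/MAX_H calibration values are illustrative stand-ins, not the exact model or numbers from my patch:

```python
import cv2

# Front-screen logic sketch: detect one face, take the bounding box
# height (bottom edge minus top edge), scale it to 0-1.0, and split
# that range into 3 zones that trigger the responses.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
MIN_H, MAX_H = 40.0, 300.0  # face height (px) when far away vs. up close

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1,
                                          minNeighbors=5)
    if len(faces) > 0:
        # Track only the largest face so a single viewer drives the piece
        x, y, w, h = max(faces, key=lambda f: f[3])
        height = (y + h) - y  # bottom minus top, i.e. the box height
        t = min(max((height - MIN_H) / (MAX_H - MIN_H), 0.0), 1.0)
        zone = min(int(t * 3), 2)  # 0 = far, 1 = mid, 2 = near
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        print(f"scale={t:.2f} zone={zone}")
    cv2.imshow("front camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```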

The following video is from when I set the project up again at the Media Commons after the midterm presentation. I used the large black screen in the 224 ballroom space and found it much easier to set up the projection. The only issue was: where do I put the webcam? I taped the camera to the middle of the screen, and it seemed to blend in with the projection. A better way would be to cut a hole in the middle of the screen to hide the camera, but I would need a sturdy mount for it.

Next, I marked the positions on the floor where the 3 different effects are triggered. In the initial setup, I had 3 laminated cards on the floor to show the viewer where to step to see the changes on the front screen. For this setup, I marked those spots with black tape.

Testing, testing...

Now comes testing. I set up both screens and adjusted camera angles and calculations to ensure the experience on both sides was working properly.

Finally, I added some music for ambiance that had no interactive response of its own. It was mainly there to keep viewers fully immersed in the experience within the space. I took a remix of the Interstellar soundtrack and had it play when the interaction started.

Notes from Midterm Critique

Now comes the fun part. Even though we weren't able to get videos on the day, I set everything up again later (that's when I got the videos). It was amazing to see the whole thing come to life as I imagined it (with a few adjustments) and to watch the whole class interact with the piece. These were some of the observations I picked up from my classmates:

  • “I feel like I am inside a music video”
  • “There is something interesting happening with the 3 screens on the side”
  • “When I am watching from here (away from the interaction) I see everything. But when you walk there’s a lot going on”
  • The audience experience seems different from the participants'
  • The side view seems more exciting (probably because there was more exploration there, with people trying to figure out what was happening since the responses were not as obvious)
  • The understanding of the 4D space was apparent. Someone even drew a connection to the tesseract bookshelf scene from the movie (even though I did not intentionally reference it)

Some of the questions I got:

  • What were the sensors that triggered all these responses?
  • Should I know where to step? 

Suggestions and discussion:

  • The music played a big role in the discussion. It's interesting how even just the choice of music added a whole unexpected layer to the experience. There were suggestions to make the music react to the interaction, such as having it fade as the viewer gets pulled into the black hole (a quick sketch of such a mapping follows this list). For some people it added a layer of emotion that I feel could have distracted from the visual experience, but it also made me think about how sound responses could be used to make the piece more accessible.
  • The other big suggestion was around extending the space. The front view could be an oval shape (like a black hole), projected in front of a curtain so that people walk through and out rather than turning back, completing the experience of being sucked into the black hole with no pathway to return. There were also suggestions of a tunneling effect and of bigger screens to make the experience more immersive. The idea is to tease out and prolong the experience, and to work with a bigger space.
  • Another key observation, from LadyK, was the idea of deviating from the inspiration and making the piece my own. Having too many elements of the movie ties the piece to the movie itself rather than making it a unique creation. It is not an Interstellar piece; rather, it is my own interpretation of time and memory in the digital space. I have to make this message more apparent.
  • I feel like having to explain what the piece is doing or what it means takes some of the magic out of the experience. Going forward, I am going to keep this in mind.
  • Lastly, one unexpected thing that kept happening was the glitching and flashing effect. It seemed to intensify the more I played with the room lights. This could be an issue for people with certain sensory sensitivities. Overall, lighting was the biggest problem I had to tackle when using camera-based sensors.
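As a note to self on the reactive-music suggestion: the mapping could be as simple as the sketch below, reusing the 0 to 1.0 proximity value from the face tracker. The curve here is purely an assumption; nothing like this is implemented yet.

```python
def gain_for_proximity(t: float) -> float:
    """Map the 0-1.0 proximity value from the face tracker to a music
    gain that fades as the viewer is pulled into the black hole. The
    squared curve (an assumption, not something I have built) keeps the
    fade subtle at first and steep near the screen."""
    t = min(max(t, 0.0), 1.0)
    return (1.0 - t) ** 2

# gain_for_proximity(0.0) -> 1.00 (far away, full volume)
# gain_for_proximity(0.5) -> 0.25
# gain_for_proximity(0.9) -> 0.01 (almost absorbed, near silence)
```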