Time to complete: 6 weeks, Fall of 2023
My Role: Engineer – Physical Computing, Fabrication
Team: An Engineer and a Visual Designer
Tools used: Microcontrollers, MAX/MSP, Hydra
Installation space: NYU 370 Jay Street
Social media has altered our behavior, shortening attention spans and creating a constant demand for creative works that serve as a universal language. We envisioned this digital phenomenon as having the potential to fuel artificial life. Modern generative AI tools, which have stirred such outrage recently, can only exist because of the vast amount of accessible content on the web. So just as moss, weeds, and wild plant life find a way to grow and sustain a foothold in the most unpredictable places, we decided to capture natural phenomena and make them “come to life” through the digital footprints people leave behind as they interact on social media.
This piece was the culmination of my teammate Fugue Xue’s and my shared passion for creating an immersive interactive experience using our respective skills in physical computing and live-coded visuals. To select a theme, we followed a card-sorting technique that let us play with words reflecting our interests. We wrote descriptors and enablers for the themes, then shuffled those two sets of cards to land on unexpected ideas.
We then voted on the ones we both favored, ending up with four ideas to present to peers at a roundtable discussion. The discussions focused on critiquing the interactive elements of the installation, which concept aligned best with the story, and which would be most intriguing to experience.
Peer reviews yielded insights such as SynthNature lacking a message and a motive to live, which we could provide through the concept of WayOut. It is a fitting metaphor for nature being affected by humans, but here we envisioned the interactions being translated from social media activity and our online existence to create a surreal interpretation of nature.
We realized some parts of this project were crucial:
We decided to represent the social media aspect with a projection of a digital screen displaying a post on a platform such as Instagram, with an animated visual and caption. The reactive physical elements take the form of mechatronic flowers that glow and bloom based on whether viewers like or dislike the post.
Finalizing the space for the installation directed much of our design. The space sat within a glass corridor, with a projection screen and an ultra-short-throw projector mounted to the ceiling, which provided us with a new set of possibilities and challenges.
We went the extra mile for inspiration by attending the LIVE! CODE! Workshop at CMU in Fall 2023. We incorporated Hydra visuals coded at the workshop into the social media posts, projecting surreal nature effects that react in real time to the viewer. We patched these visuals within Max/MSP so they appear when the audience presses the Like and Dislike buttons. The caption text for each interaction was also generated by a poetry AI generator plugin in Max.
The first step in creating the flowers was to build a mechanical structure for animating them. I built a wire base that mimics the flower opening and closing. The base was then attached to servo motors programmed to sweep from 0 to 180 degrees with every button press.
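A minimal Arduino sketch of that behavior might look like the following; the pin numbers and the toggle-per-press logic are illustrative assumptions, not the exact code we ran:

```cpp
#include <Servo.h>

const int BUTTON_PIN = 2;   // assumed pin for the button
const int SERVO_PIN  = 9;   // assumed pin for the flower servo

Servo flowerServo;
bool bloomed = false;        // track whether the flower is currently open
int lastButtonState = HIGH;

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);  // button wired to ground
  flowerServo.attach(SERVO_PIN);
  flowerServo.write(0);               // start with the flower closed
}

void loop() {
  int buttonState = digitalRead(BUTTON_PIN);

  // On each new press, toggle between closed (0 degrees) and fully open (180 degrees)
  if (buttonState == LOW && lastButtonState == HIGH) {
    bloomed = !bloomed;
    flowerServo.write(bloomed ? 180 : 0);
    delay(50);  // crude debounce
  }
  lastButtonState = buttonState;
}
```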
Next, we needed to construct the petals so they were lightweight, held their own structure, and supported light effects. We made them from dark organza since it is lightweight, easy to weave fiber optics into, and highlights the colors within the fiber optics.
The flowers were mounted to servos and positioned on boards that would later be installed on the walls.
After connecting all the servos to the buttons, we tested the movement of the flowers. We then rerouted the Arduino serial communication through Max so that each button press simultaneously triggers a change of digital content and the servos.
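A sketch of how the Arduino side of that serial link could work is below; the message tags, pins, and baud rate are assumptions for illustration, since the actual parsing happens inside our Max patch:

```cpp
#include <Servo.h>

const int LIKE_PIN    = 2;   // assumed pin for the Like button
const int DISLIKE_PIN = 3;   // assumed pin for the Dislike button
const int SERVO_PIN   = 9;

Servo flowerServo;

void setup() {
  pinMode(LIKE_PIN, INPUT_PULLUP);
  pinMode(DISLIKE_PIN, INPUT_PULLUP);
  flowerServo.attach(SERVO_PIN);
  Serial.begin(9600);  // Max listens on this port with its serial object
}

void loop() {
  // Like: bloom the flower and tell Max to swap in the "happy" content
  if (digitalRead(LIKE_PIN) == LOW) {
    flowerServo.write(180);
    Serial.println("like");   // Max parses this tag and updates the projection
    delay(200);               // crude debounce
  }

  // Dislike: close the flower and tell Max to swap in the "sad" content
  if (digitalRead(DISLIKE_PIN) == LOW) {
    flowerServo.write(0);
    Serial.println("dislike");
    delay(200);
  }
}
```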
More detailed documentation can be found here >> link
Since we mounted all the flowers to large pieces of foamcore board, we could place the entire arrangement easily with Command Strips (thankfully, no need to destroy any part of our hard work). The wiring was designed to simplify the assembly process, so all that was left was to set the scene with black fabric all around.
The most rewarding part of this whole process was not only seeing our vision come to life but also the way people responded to the piece. What really completed it was the sound. We combined nature sounds with noises we associate with technology, such as mouse clicks and notification tones, and the buttons gave sad and happy sound feedback that the audience really enjoyed.
Some interesting observations we had:
All files associated with this project can be found here