By Sauda Musharrat and Fuguo Xue
Date: December 20, 2023
We live in an age when we consume audio-visual content at an alarming rate. Social media has altered our behaviors, shortening attention spans and creating a demand for creative works that serve as a universal language for us. We envisioned this phenomenon in the digital world as having the potential to fuel artificial life. Modern AI generative tools, which have stirred such outrage recently, can exist only because of the vast amount of accessible content on the web. So just as moss, weeds, and wild plant life find a way to grow and sustain a foothold in the most unpredictable places, we decided to capture natural phenomena and make them “come to life” through the digital footprints people leave behind as they interact on social media.
What was interesting about this journey was that even though Fuguo and I found such different inspirations from the random prompts in the card sorting activity at the beginning, we were able to combine the ideas of nature and social media. There were some parts of this project we realized were crucial:
As we settled on a space on the IDM floor, our vision for the project kept changing. Furthermore, due to time constraints, we had to prioritize certain aspects of the project, which I am happy we were able to achieve within the final critique timeline. Read more about this process in my previous post: saudamusharrat.com/final-project-prototyping/
All associated files can be found in this link:
After bringing one flower to life, as detailed in the previous post, it was time to create more of them. With the time remaining, we realized it would be wise to keep building on the parts of the prototype that had been successful so far. So we created 4 more wireframes, soldered all of them together, and rigorously tested the mechanics so that they held up under pressure, stress, and gravity. Then we wove the organza petals onto the wireframes, using 4 optical fibers per petal to give each one the support to stand on its own. We also tested various methods to maximize the diffusion of light through the optic fibers. Using heat shrink tubing on the RGB LEDs and pushing the ends of the optic fibers through the opening gave the best results.
I had to attach the motor shaft arms to the flowers and secure them by soldering. The arms were attached loosely to allow flexibility in converting the rotational motion to linear motion. Next, we built little enclosures out of foamcore board and hot glue for the motors to be mounted on.
This was the part that took the longest, only because I had to extend the wiring over a large space. I also had to make sure the connections were secure, so I soldered them extra carefully and used black electrical tape to secure all the jumper wire connections.
This was the moment. This was when I got to see if all my long hours of meticulously making and checking all the connections had paid off. And it did!! I also used two push buttons to control the servos, which serve as the like and dislike buttons. The idea is that the like button makes the two flowers on the right react “happily” and change color to green, while the dislike button makes the 3 flowers on the left react “sadly” with a darker color change. Click here for the code
Since Fuguo created the social media content in MAX, we now had to program both buttons to trigger new content to be generated. We used the Arduino-to-MAX serial communication patch to read the digital pins that were connected to the switches. There were some delays at first, but the communication between Arduino and MAX became faster as we tested it more.
One issue we had with the projector display was that it was not bright enough. We then realized this actually served us well, since the optic fibers needed darkness to be noticeable. We tested the whole setup again in the space.
Since we had mounted all the flowers on large pieces of foamcore board, we used command strips to place the entire arrangement easily (no need to destroy any part of our hard work, thankfully). The wiring was designed to simplify the assembly process, so the only thing left was to create the scene with black fabric all around, which took us quite some time to tape and put together. The only issue was that the button bar cast a shadow on the text part of the content, so we did our best to adjust the screen.
The most rewarding part of this whole process was not only seeing our vision come to life but also the way people responded to the piece. What really completed it was the sound. We combined nature sounds with noises we associate with technology interaction, such as mouse clicks and notification tones. The buttons also gave sad and happy sound feedback, which the audience really enjoyed.
Some interesting observations we had:
Suggestions and discussion:
Lastly, I want to end on the note that this wouldn’t have been possible without my teammate Fuguo, who unfortunately had to go through the frustrations of physical computing troubleshooting for the first time under high stress. But her patience and perseverance helped us glue the piece together. I also want to thank LadyK for guiding us through this project and the entire semester. I am incredibly happy with all that I have learned and achieved with her help, and with the inspiration from my very creative classmates.