WildWeb

The creation of an artificial ecosystem from our digital footprints

By Sauda Musharrat and Fuguo Xue

Date: December 20, 2023

We live in an age when we consume audio-visual content at an alarming rate. Social media has altered our behaviors, shortening attention spans and creating a demand for creative works that serve as a universal language for us. We envisioned this phenomenon in the digital world as having the potential to fuel artificial life. Modern AI generative tools, which have stirred such outrage in recent days, can exist only because of the vast amount of accessible content on the web. So just as moss, weeds, and wild plant life find a way to grow and sustain a foothold in the most unpredictable places, we decided to capture natural phenomena and make them “come to life” through the digital footprints people leave behind as they interact on social media.

What was interesting about this journey was that even though Fuguo and I found very different inspirations from the random prompts in the card sorting activity at the beginning, we were able to combine the ideas of nature and social media. We realized some parts of this project were crucial:

  • An easily recognizable natural element – tree, flower, grass, etc.
  • Generating content like a social media post
  • Interactions that involved manipulating the natural elements
  • Sounds of nature and technology mixed in a way that goes with the interactions

As we fixed a space on the IDM floor, our vision for the project kept changing. Furthermore, due to time constraints, we had to prioritize certain aspects of the project, and I am happy we were able to achieve them within the final critique timeline. Read more about this process in my previous post: saudamusharrat.com/final-project-prototyping/

All associated files can be found in this link:

https://github.com/musharrat37/WildWeb/tree/main

From one flower to five

After bringing one flower to life, as detailed in the previous post, it was time to create more of them. With the time remaining, we realized it would be wise to keep building on the parts of the prototype that had been successful so far. So we created four more wireframes, soldered all of them together, and rigorously tested the mechanics so that they held up under pressure, stress, and gravity. Then we wove the organza petals onto the wireframes, using four optical fibers per petal to give each one the support to stand on its own. We also tested various methods to get the maximum diffusion of light through the optic fibers; using heat shrink tubes on the RGB LEDs and pushing the ends of the optic fibers through the opening gave the best results.

Weaving the optical fibers through the organza petals to attach with the wireframe and provide some structure
Preparing the LEDs for heat shrinking
After heat shrinking, I soldered three LEDs together and extended the wires outward. I also had to clip the LED legs to avoid any shorting
Preparing the servo mounts

I had to attach the motor shaft arms to the flower and secure them by soldering. The arms were attached loosely enough to allow flexibility in converting the rotational motion into linear motion. Next, we created little enclosures out of foamcore board and hot glue to hold the motors in place.

Attaching the servo arm to the moving part of the flower wireframe
Making the mount for the servo motor. It had to be tight enough so that the motor was secure
The enclosure around the servo mount through which the base of the flower got fixed
Oh...so much wiring...and soldering

This was the part that took the longest, only because I had to extend the wiring over a large space. I also had to make sure the connections were secure, so I soldered them extra carefully and wrapped all the jumper wire connections in black electrical tape.

Breathing life into the flowers

This was the moment. This was when I got to see if all my long hours of meticulously making and checking all the connections had paid off. And they had!! I also used two push buttons to control the servos, which would serve as the like and dislike buttons. The idea is that the like button makes the two flowers on the right react “happily” and change color to green, while the dislike button makes the three flowers on the left react “sadly” with a darker color change. Click here for the code.
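
To give a sense of how this logic fits together, here is a minimal Arduino-style sketch of the button, servo, and LED behavior described above. The pin numbers, servo angles, and color values are placeholders chosen for illustration only; the actual code we used is in the GitHub repository linked earlier.

#include <Servo.h>

// Minimal sketch of the like/dislike behavior. All pin numbers, angles,
// and colors are assumptions, not the project's real wiring.
const int LIKE_PIN    = 2;   // "like" push button
const int DISLIKE_PIN = 4;   // "dislike" push button

// One RGB channel shown for brevity (each flower had its own chained LEDs)
const int R_PIN = 3, G_PIN = 5, B_PIN = 6;

Servo rightFlowers[2];              // the two flowers on the right
Servo leftFlowers[3];               // the three flowers on the left
const int rightPins[2] = {7, 8};
const int leftPins[3]  = {9, 10, 11};

void setColor(int r, int g, int b) {
  analogWrite(R_PIN, r);
  analogWrite(G_PIN, g);
  analogWrite(B_PIN, b);
}

void setup() {
  pinMode(LIKE_PIN, INPUT_PULLUP);        // buttons wired to ground
  pinMode(DISLIKE_PIN, INPUT_PULLUP);
  for (int i = 0; i < 2; i++) rightFlowers[i].attach(rightPins[i]);
  for (int i = 0; i < 3; i++) leftFlowers[i].attach(leftPins[i]);
}

void loop() {
  if (digitalRead(LIKE_PIN) == LOW) {                        // like pressed
    for (int i = 0; i < 2; i++) rightFlowers[i].write(120);  // "happy" open pose
    setColor(0, 255, 0);                                     // shift toward green
  } else if (digitalRead(DISLIKE_PIN) == LOW) {              // dislike pressed
    for (int i = 0; i < 3; i++) leftFlowers[i].write(40);    // "sad" drooping pose
    setColor(60, 0, 80);                                     // darker color
  }
  delay(20);
}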

Connecting the whole thing to MAX

Since Fuguo had created the social media content in MAX, we now had to program both button presses to trigger new content to be generated. We used an Arduino-to-MAX serial communication patch to read the digital pins that were connected to the switches. There were some delays at first, but the communication between Arduino and MAX became faster as we tested it more.
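
For reference, the Arduino side of that serial link boils down to something like the sketch below: read the two buttons and send their states over serial so the MAX patch can pick them up with its [serial] object. The pins, baud rate, and message format here are assumptions for illustration, since we relied on an existing Arduino-to-MAX communication patch rather than writing this from scratch.

// Rough sketch of the Arduino side of the serial link to MAX.
const int LIKE_PIN    = 2;          // assumed pin for the like button
const int DISLIKE_PIN = 4;          // assumed pin for the dislike button

void setup() {
  pinMode(LIKE_PIN, INPUT_PULLUP);
  pinMode(DISLIKE_PIN, INPUT_PULLUP);
  Serial.begin(9600);               // must match the baud rate set in the MAX patch
}

void loop() {
  int like    = (digitalRead(LIKE_PIN) == LOW) ? 1 : 0;
  int dislike = (digitalRead(DISLIKE_PIN) == LOW) ? 1 : 0;
  Serial.print(like);
  Serial.print(' ');
  Serial.println(dislike);          // newline-terminated message for MAX to parse
  delay(30);                        // throttle so the patch is not flooded
}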

Testing in the space

One issue we had with the projector display was that it was not bright enough. We then realized this actually served us well, since the optic fibers needed darkness to be noticeable. We tested the whole setup again in the space.

We had to adjust the placement and size of the content box and the text so that we had space on either side for the flowers and the button bar in the middle
Assembling and finishing touches

Since we had mounted all the flowers to large pieces of foamcore board, we used Command strips to place the entire arrangement easily (no need to destroy any part of our hard work, thankfully). The wiring was designed to simplify the assembly process, so the only thing left for us was to create the scene with black fabric all around. Taping this together took us quite a bit of time. The only issue was that the button bar cast a shadow on the text part of the content, so we did our best to adjust the screen.

Reflections

The most rewarding part of this whole process was not only seeing our vision come to life but also the way people responded to the piece. What really completed the piece was the sound. We combined nature sounds with noises we associate with technology interaction, such as mouse clicks and notification tones. The button controls also gave sad and happy sound feedback that the audience really enjoyed.

Some interesting observations we had:

  • “I feel like I could just sit here and watch; there is a calming and soothing effect.”
  • The sound added to the cues we were trying to convey through the piece.
  • Many people were able to link digital content, artificial life, and machines to the interaction, and also read it as a commentary on nature, which made us very happy.
  • We initially intended the buttons to be like and dislike, but the dislike was not very obvious. Someone took it as a play button, which actually helped them relate the piece to social media.
  • The one major point to note was that it was not obvious to the audience that the buttons were interactive, which is crucial since they are the point of entry to the piece. We need to work more on the buttons to make them look like objects to press, maybe have them pop out more or light up like inviting round buttons. However, when people did interact with the buttons, they noted that the sound and feel were satisfying.
  • Viewers wanted to map the buttons to the flowers and kept trying to find new surprises in the interactive experience. One such surprise was the camera and how some of the visuals reacted to the viewers.
 

Suggestions and discussion:

  • LadyK pointed out that there is “Something machine like in this piece….but some softness there too.” How can we find a balance between the two and let it speak on its own? This made me think about how adding more elements to the social media aspect could help, or making the space bigger while keeping the subtlety of the movements of the natural elements.
  • “There is this feeling that it wants to jump out of the wall, the fixed plane, and surround us.” That was what we initially wanted: to make it an immersive experience. But I feel like the ambience created a different effect, one that was soothing for many people to witness together. Everyone still had the urge to see more, to be encapsulated, and to have a bigger space to play in.
  • People were very intrigued by the text and the story it was trying to tell. The only problem was that the button bar cast a shadow over the text and made it hard to read, so one suggestion was to highlight it better and put it above the image rather than below, or even give the text a separate LCD screen, which could be cool.
  • The effect of constantly wanting to push the button and not really seeing the whole thing: it is exactly the effect of social media and the demand for rapidly generated content that we were trying to demonstrate through the piece.
  • I liked hearing about the emotions it evoked in viewers, especially through the sound cues. The happy noise from the like button made them feel like they were “doing something right.” I think there is more to explore there, also with the visuals: making the content seem more personal so that viewers don’t just focus on pressing the buttons.
  • The other interesting suggestion was the idea of projecting light or visuals onto the buttons and adding more complexity to the interactions, so that as people keep interacting more and more, the natural elements gradually fade away or “die” in some sense.
 

Lastly, I want to end on the note that this wouldn’t have been possible without my teammate Fuguo, who unfortunately had to go through the frustrations of physical computing troubleshooting for the first time under high stress. But her patience and perseverance helped us glue the piece together. I also want to thank LadyK for guiding us through this project and the entire semester. I am incredibly happy with all that I have learned and achieved with her help, and with the inspiration from my fellow, very creative classmates.