For the final project, Fuguo and I decided to team up because
a) Two heads are faster than one, especially within the short timeline we have and
b) We wanted to integrate live visuals with physical computing
With Fuguo’s passion for live coding and visuals and mine for physical computing, we got to card sorting.
Fuguo and I decided to move forward by combining the ideas for SynthNature and WayOut. SynthNature lacked a message and a reason to exist, which the concept behind WayOut can provide. WayOut is a fitting metaphor for nature being shaped by human activity, but here we consider only the digital experiences that shape so much of our world and social interactions today. We envision the interactions in our project being translated from social media activity and our online existence to create a surreal interpretation of nature. The idea of synesthesia can be incorporated into this interpretation as we dictate the creation of new sensory experiences. Fuguo went ahead and researched the various ways we could go about this -> click here for her documentation
My focus was to research the technical aspects of the physical computing elements we are planning on incorporating. The following is my brainstorming process (which is a mess):
So after an initial explosion of ideas, I decided to try separating meaningful social media actions into interactive inputs and outputs. For example,
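To think through the separation, I sketched what such a mapping might look like in code. This is purely a brainstorming aid: the action names, output types, and parameters below are placeholders I made up, not our final design.

```python
# Hypothetical mapping from social media actions (inputs) to sensory
# outputs in the installation. All names and values are placeholders.
SOCIAL_TO_OUTPUT = {
    "like":    {"output": "led_pulse",  "color": (255, 105, 180), "duration_s": 0.5},
    "comment": {"output": "led_ripple", "color": (0, 200, 255),   "duration_s": 2.0},
    "share":   {"output": "projection", "effect": "bloom",        "duration_s": 3.0},
}

def translate(action: str) -> dict:
    """Look up the sensory output for a social media action.
    Unknown actions fall back to a neutral ambient state."""
    return SOCIAL_TO_OUTPUT.get(action, {"output": "ambient", "duration_s": 0.0})
```

Even a toy table like this helped us see that each input needs both a *what* (which output element reacts) and a *how long* (so reactions layer instead of flickering).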
So what will it look like? To make sure we don't have to take apart the fruit of our hard labor, we will construct a lightweight, movable frame, similar to privacy screens on wheels. The idea is to create 2-3 walls of different shapes, with roofs, to form an immersive space that can be rearranged anywhere. This approach lets us use prototyping materials such as cardboard or paper to quickly create forms and shapes. Instead of a screen, we will use fibre optic fabric, addressable LEDs, and rear projection to create our visuals.
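For the addressable LEDs, one idea is to map an overall "activity level" from the social media inputs onto the strip. Here is a minimal pure-Python sketch of that mapping (a stand-in for the per-pixel colors we would hand to an LED library on the microcontroller; the color scheme and pixel count are arbitrary assumptions for now):

```python
def activity_to_strip(level: float, n_pixels: int = 30) -> list:
    """Map an activity level in [0, 1] to a list of (r, g, b) tuples,
    one per pixel: more activity lights more of the strip and shifts
    the hue from calm green toward a hotter magenta. Placeholder logic."""
    level = max(0.0, min(1.0, level))       # clamp to the valid range
    lit = round(level * n_pixels)           # how many pixels turn on
    colors = []
    for i in range(n_pixels):
        if i < lit:
            t = i / max(1, n_pixels - 1)    # position along the strip
            colors.append((int(255 * t * level),
                           int(200 * (1 - t)),
                           int(150 * level)))
        else:
            colors.append((0, 0, 0))        # unlit pixel
    return colors
```

On real hardware the same list would be pushed out over one data pin per strip, so prototyping the mapping on a laptop first should save us soldering time.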
Now the challenges and next steps…