Time to complete: 4 Months
My Role: Engineer – Robotics, Installation, Fabrication
Team: Independent
Tools used: Arduinos, Motors, 3D Printing, Projection mapping
Installation space: NYU 370 Jay Street
Cybernetic Echoes is an interactive installation that flips the script on how we connect with technology. Instead of teaching our bodies to adapt to screens, what if tech adapted to us—our movements, instincts, and presence? As digital devices become extensions of ourselves, this piece explores what it means to design for our evolved, cyborg-like bodies. With wearable elements that “talk” to the environment, the space invites visitors to move, explore, and rediscover interaction through embodiment—not buttons.
This installation stemmed from my research into the creative process and our unease with generative AI in the arts. I asked creatives—artists, designers, musicians—what their dream tools would be like beyond the screen. Unsurprisingly, they all turned to their hands and bodies to ideate.
These findings brought to mind an article by Bret Victor, A Brief Rant on the Future of Interaction Design, where he reminds us that our amazing hands do more than just swipe screens. Inspired by that, I created a space where the rules of reality go a little sideways. It’s a peculiar little world that asks: what if physics got weird, and we had to re-learn how to create with what this new world gives us?
I envisioned the digital bleeding out into the physical: the limiting 2D screen breaking apart pixel by pixel and materializing into our reality in tangible form. I imagined each pixel communicating through a physical form, a body, that would respond to our own bodies. That is why the idea of creating a version of nature was appealing: the organic lifeforms around us, juxtaposed with and mimicking the digital, the calculated, the man-made.
The above picture is a sketch from my mental storyboard, which I fed into AI video generation prompts in Pika. I needed elements from the digital to be recreated as tangible interactive elements, to serve as an interpretation of Extended Reality (XR). The final installation, shown below, was designed based on this generated video loop. I also photographed real robotic flowers, fabrics, and fiber optics, then superimposed them into the projected visuals to blend the physical and digital worlds.
The audience journey laid the foundation for the interaction design in the space. Gestures made through the wearable control piece were mapped to the placement of the elements in the installation space.
The key technology here is a pair of BLE-enabled microcontrollers that communicate wirelessly between the user, who wears the control piece, and the interactive elements in the environment: the motors and lights of the robotic flowers, which respond to the user’s gestures. Each robotic element occupies a cell in a matrix of computational bits that represents the physical space in the wearable’s microcontroller program. The IMU sensor on the control piece activates each panel based on the user’s movement within the mapped zones.
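The mapping described above can be sketched in miniature. The snippet below is a simulation of the idea, not the installation’s actual firmware: it assumes a 3×3 grid of mapped zones, a ±30° usable tilt range, and a hypothetical 2-byte frame for the BLE link; the function names and constants are my own illustration.

```python
GRID_ROWS, GRID_COLS = 3, 3   # assumed layout of the mapped zones
TILT_RANGE = 60.0             # assumed usable tilt range in degrees (+/-30)

def tilt_to_cell(pitch_deg: float, roll_deg: float) -> tuple[int, int]:
    """Quantize the wearable's IMU pitch/roll into a grid cell addressing one panel."""
    def axis_to_index(angle: float, cells: int) -> int:
        # shift [-30, +30] to [0, 1], scale to the cell count, clamp to valid range
        normalized = (angle + TILT_RANGE / 2) / TILT_RANGE
        return min(max(int(normalized * cells), 0), cells - 1)
    return axis_to_index(pitch_deg, GRID_ROWS), axis_to_index(roll_deg, GRID_COLS)

def panel_command(cell: tuple[int, int]) -> bytes:
    """Encode the active cell as a 2-byte message for the BLE link (hypothetical frame)."""
    row, col = cell
    return bytes([row, col])

# Leaning forward and to the right selects the bottom-right zone;
# holding the wearable level selects the center zone.
print(tilt_to_cell(25.0, 25.0))   # -> (2, 2)
print(tilt_to_cell(0.0, 0.0))     # -> (1, 1)
```

On real hardware the same quantize-then-transmit step would run in the microcontroller’s loop, reading the IMU each frame and sending a frame only when the active cell changes.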
The wearable control piece took two forms to accommodate different kinds of users. The 3D-printed, Iron Man-style glove was the most effective at conveying the feel of a cyborg, while the fabricated cuff-style piece suited users with restricted fine motor movement. The two can be swapped easily to match the user’s preference, demonstrating how embodied technology can blend seamlessly with our personality and self-expression through clothing.
Snapshots (above) document the process of building the robotic flowers and the setup of the environment, along with a pneumatic system designed to give the illusion of a living, “breathing” environment (below). After finalizing the mounting mechanisms, the final step was to drape the entire setup with fabric to match the ambience of the environment and the projected visuals.
Space & Story (Perception) – Were people able to understand any part of the story? What did they get out of it?
Participants felt like they’d stepped into a sci-fi world—some even said putting on the wearable felt like gaining a superpower. Many created their own stories and goals in the space, showing how much people crave agency and personal meaning in interactive experiences. Interestingly, it wasn’t just the tech that left a mark—the textures, colors, and organic feel of the space really stuck with them. And while users enjoyed the sense of control, they also responded to the unexpected, which sparked reflection on our role in larger systems. It’s a good reminder: great experience design blends emotion, story, surprise, and sensory detail.
Using Tech for Creative Expression – How did people respond to the interaction and technology? Could they create something, explore, and play around?
People responded with curiosity, movement, and imagination. Even those unsure how the tech worked felt inspired to dance or explore, showing that the system tapped into something intuitive. In the installation’s cyber-dystopian-meets-cave-painting vibe, participants built personal connections with the interactive elements, revealing how creativity often comes from emotional engagement, not just functionality. From a UX research perspective, this highlights how embodied play and open-ended environments can spark genuine creative expression.
Exploration through Embodiment – Were they able to use their bodies to interact with the tech? To what extent?
Yes, participants used their bodies to interact, but it raised important questions about accessibility. I realized sensors like accelerometers may limit who can fully engage, while motion or sound-based inputs offer more inclusive options. One researcher asked, “How do we design for the depth already in our bodies?”—a powerful reminder that tech should amplify human potential, not define it. For UX, it’s about designing for all bodies, not just the most able ones.
Affordance in Experience Design – How did participants perceive the interaction and the interactive elements? What did they miss?
Participants perceived the interaction as mysterious but intriguing—while they didn’t always understand how it worked, that ambiguity sparked curiosity and playful exploration. The wearable glove, designed like a costume, made it easy to step into the experience without overthinking the tech; people cared more about feeling immersed than decoding LED feedback. Though some affordances were intentionally subtle, this openness let participants form their own relationships with the space. For HCI, it highlights how affordance doesn’t always need to be explicit—vague but inviting cues can still foster meaningful, embodied interaction.