EmbodyPaint
Wearables for creating real-time visualizations on a virtual canvas by mapping body movements
EmbodyPaint transforms choreography into calligraphy, enabling users to visualize their movements on a virtual canvas and create unique digital art pieces. Sensor-embedded shoes capture movement parameters such as rotation and speed, generating real-time visualizations in a web browser or in augmented reality (AR). The wearables are wireless, so users can move freely: sensor data is transmitted over Bluetooth or Wi-Fi and relayed to the visualization via a WebSocket server.
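A minimal sketch of what the shoe-to-browser pipeline described above could look like. The function names (`toPayload`, `handleMessage`) and the message fields are illustrative assumptions, not the project's actual schema:

```javascript
// Hypothetical message schema for one sensor reading (fields are assumed).
function toPayload(reading) {
  return JSON.stringify({
    shoeId: reading.shoeId,     // which wearable sent the sample
    t: reading.t,               // timestamp in milliseconds
    rotation: reading.rotation, // shoe rotation in degrees
    speed: reading.speed,       // movement speed in m/s
  });
}

// Browser side: parse each incoming frame and hand it to the renderer.
function handleMessage(data, draw) {
  const r = JSON.parse(data);
  draw(r.rotation, r.speed);
}

// In the real system the payload would travel over a WebSocket, e.g.:
//   const ws = new WebSocket("ws://server:8080");
//   ws.onmessage = (ev) => handleMessage(ev.data, drawStroke);
// Here we simply round-trip one sample locally to show the data flow.
const sample = { shoeId: "left", t: 1000, rotation: 45, speed: 1.2 };
let last = null;
handleMessage(toPayload(sample), (rot, spd) => { last = { rot, spd }; });
```

Serializing to JSON keeps the wire format human-readable and trivial to consume in the browser; a binary format could be swapped in later if sample rates demand it.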
Advisors: Neil Gershenfeld (Founder of the MIT Center for Bits and Atoms),
Nathan Melenbrink (Harvard GSD), Iulian Radu (Harvard GSE)
Background: How To Make (Almost) Anything
In the fall of 2021, I took the course How To Make (Almost) Anything at MIT, where I learned to integrate CAD, fabrication, electronics, and programming into a single project. Coming from a background in industrial design, I have since become comfortable with electronics (PCB design, fabrication, validation) and programming (input, output, interface, communication). This class was a catalyst in my transformation from a designer into a design engineer!
Idea: Visualizing body movement to create art pieces on a virtual canvas
The initial idea came from transforming choreography into calligraphy, allowing people to visualize their dance steps on a virtual canvas and create unique art pieces using wireless wearables. To enrich the visualization, additional parameters such as position, height from the ground, pressure applied to the ground, and shoe orientation can all contribute to the "painting." The ultimate goal is to enable multiple users to visualize their dance/art pieces in real time, creating a dynamic feedback loop where the visualizations influence their movements.
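One way the extra parameters could contribute to the "painting" is by mapping each one to a stroke property. The mapping below is a sketch under assumed ranges (rotation 0–360°, speed 0–3 m/s, pressure normalized to 0–1); the function name and the specific mappings are illustrative, not the project's final design:

```javascript
// Hypothetical mapping from sensor parameters to brush-stroke properties.
function strokeFrom(rotationDeg, speedMs, pressure) {
  const clamp = (v, lo, hi) => Math.min(hi, Math.max(lo, v));
  return {
    hue: ((rotationDeg % 360) + 360) % 360,   // shoe orientation -> color hue
    width: 1 + clamp(speedMs / 3, 0, 1) * 19, // faster motion -> thicker line (1-20 px)
    opacity: clamp(pressure, 0, 1),           // harder step -> more opaque stroke
  };
}

// Example: a quarter turn at moderate speed with firm ground contact.
const s = strokeFrom(90, 1.5, 0.8);
```

Keeping the mapping in one pure function makes it easy to swap in alternative aesthetics per dancer, which matters once multiple users share the same canvas in real time.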