Project-04 Movement Animation

Week01_Context_Concept

The overall context is the Internet of Things (IoT): an accelerometer sensor and the computer's built-in camera. Together, the data from these two sources will be projected on the screen to create a data visualization.

My concept is to track a person's movement through the camera, a machine learning library for body recognition, and a triple-axis accelerometer sensor, then translate those signals into code and visualize them through amazing graphic animation.

IMPETUS

One of the reasons is that I worked on visualizing audio in Project 3, and I thought: why not extend that kind of participation to body movements? So I decided to work on visualizing movement, so that people can have a fuller experience when they are dancing to music.

On the other hand, many people are interested in motion, whether it is gesture dance, K-pop dance, or just funny facial expressions with face filters on TikTok. If I can help them visualize these movements through attractive graphics, it will benefit both the creator and the audience. For the creator, it could become a new trend and make them feel more engaged during their creation; for the audience, it brings double the visual enjoyment.

Week02_Building Process

Goals

The goal I want to achieve is to track the user's body movements while they are in front of the camera. The system detects the user's different joints, reads them, and transfers them to the screen. It is also essential to match the actual physical movements with the visual graphics. In my concept, the user should wear at least one sensor on their body; the sensor adds extra information, and extra animation, to the visuals. The combination of software and hardware is what I am interested in. In this case, the hardware is the computer's built-in camera and an accelerometer sensor.
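
To make the joint-tracking part more concrete, here is a minimal sketch of how the camera side could work, assuming the page loads p5.js (global mode) and the ml5.js v0.x PoseNet model; the variable names and the 0.3 confidence threshold are my own placeholder choices, not taken from the project files.

```ts
// Minimal sketch: draw a dot on every joint PoseNet detects from the webcam.
// Assumes p5.js and ml5.js (v0.x) are loaded as plain <script> tags, so the
// p5 globals below exist at runtime; the declares only keep TypeScript happy.
declare const ml5: any;
declare const VIDEO: string;
declare function createCapture(type: string): any;
declare function createCanvas(w: number, h: number): void;
declare function background(gray: number): void;
declare function noStroke(): void;
declare function fill(r: number, g: number, b: number): void;
declare function ellipse(x: number, y: number, d: number): void;

let video: any;
let poses: any[] = [];          // latest PoseNet results

function setup(): void {
  createCanvas(640, 480);
  video = createCapture(VIDEO); // built-in camera
  video.hide();
  const poseNet = ml5.poseNet(video, () => console.log('PoseNet ready'));
  poseNet.on('pose', (results: any[]) => { poses = results; });
}

function draw(): void {
  background(0);
  if (poses.length === 0) return;
  // Each keypoint is one joint (nose, wrists, knees, ...) with a confidence score.
  for (const kp of poses[0].pose.keypoints) {
    if (kp.score > 0.3) {
      noStroke();
      fill(255, 0, 150);
      ellipse(kp.position.x, kp.position.y, 12);
    }
  }
}
```

Each detected keypoint could then drive whatever graphic replaces the plain dots.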

Precedents

Storyboard

I have created a simple storyboard to show my general idea and the steps needed to reach my final project. It tells a story about how a user gets inspired on TikTok and then creates the movement animation through the whole process.

Week_03
PROTOTYPING PROCESS-1

PROTOTYPING PROCESS-2

The breadboard was not necessary in Project 3. However, I did not buy any female-to-female (F-F) jumper wires, so this time I have to use the breadboard to connect the Photon board and the accelerometer sensor. After I set up the sensor following the tutorial, the green LED lights up. I then use wire and rope to tie the breadboard to my waist. I need to stand up and show my whole body to the computer's built-in camera; if I get too close, the system gets confused and cannot figure out my body position. The accelerometer sensor can only be activated by sliding it along the z-axis, so I plan to place it on my wrist, like a hip-hop wristband. I use a hairband to fix the sensor on my hand, and it turns out pretty well.
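
For the accelerometer side, one way the z-axis readings could reach the browser visualization is through the Particle Cloud event stream. The sketch below is only an illustration: it assumes the Photon firmware publishes each reading with Particle.publish("accel-z", ...), and both the event name "accel-z" and the access-token placeholder are hypothetical.

```ts
// Browser-side sketch: listen for z-axis values published by the Photon.
// Assumption: the firmware calls Particle.publish("accel-z", String(z)) on each
// reading; replace ACCESS_TOKEN with a real Particle access token.
const ACCESS_TOKEN = 'YOUR_PARTICLE_ACCESS_TOKEN';
const EVENT_NAME = 'accel-z';   // placeholder event name

let latestZ = 0;                // most recent z-axis reading, read by the animation loop

const stream = new EventSource(
  `https://api.particle.io/v1/devices/events/${EVENT_NAME}?access_token=${ACCESS_TOKEN}`
);

stream.addEventListener(EVENT_NAME, (event) => {
  // Particle wraps each published value in a small JSON envelope.
  const envelope = JSON.parse((event as MessageEvent).data) as {
    data: string;
    published_at: string;
  };
  latestZ = parseFloat(envelope.data);
  console.log('z-axis acceleration:', latestZ);
});

stream.onerror = () => console.warn('Lost connection to the Particle event stream');
```

The animation code can then read latestZ on every frame and blend it with the camera joints, so the graphics respond to both the built-in camera and the wristband sensor.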

Final Project_Visualization

Final Project_Movement Animation

Future development

In this project, I only visualize the body as a whole. In my next step, I want the machine to learn more detailed body parts, so I can visualize individual body parts for the dancer to use. Because of time constraints, I only utilized the z-axis to create the animation. In the future, I will definitely use the x- and y-axes as well, which will help me make 3D animation instead of flat graphic animation.
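
As a rough illustration of that 3D direction, here is a hypothetical sketch that assumes a p5.js canvas in WEBGL mode and that accelX, accelY, and accelZ already hold readings from the sensor stream, scaled to roughly -1 to 1; none of this is code from the project, just the idea.

```ts
// Hypothetical 3D sketch: let all three accelerometer axes shape the animation.
// Assumes p5.js is loaded as a plain <script>; the declares only satisfy TypeScript.
declare const WEBGL: string;
declare const PI: number;
declare function createCanvas(w: number, h: number, renderer?: string): void;
declare function background(gray: number): void;
declare function rotateX(angle: number): void;
declare function rotateY(angle: number): void;
declare function box(size: number): void;

let accelX = 0, accelY = 0, accelZ = 0; // fed by the sensor stream, scaled to -1..1

function setup(): void {
  createCanvas(640, 480, WEBGL);
}

function draw(): void {
  background(20);
  rotateX(accelX * PI);   // tilt forward/back with the x-axis
  rotateY(accelY * PI);   // spin left/right with the y-axis
  box(100 + accelZ * 50); // grow/shrink with the z-axis
}
```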

Through this project, I found that the combination of hardware and software can create miracles. This will be the future trend, and it will significantly extend the user's interaction experience beyond the screen to physical objects, or even their own bodies.