Motion Capture Project – Erin Lee

My initial visual inspiration came from the music video for Major Lazer's "Light It Up." The video's producers, Method Studios, used motion capture to record movement in physical space, converted the data into a digitally storable file, and attached 3D garments and characters to the original bone/spline structure. I especially appreciated the combination of human movement with visual elements that looked digitally generated; I wanted to challenge myself to animate a character from top to bottom, so I signed up for a training session at the IYA XR studio to use their motion-tracking bodysuit.

I visited the IYA XR lab and learned how to operate the Shogun motion capture station. I tried on the bodysuit and was trained to use the T-shaped lighted wand to calibrate the motion capture cameras in the station. I tried various poses and dynamic dances as an experimental way to test the quality and potential of the program. Lars, a graduate student in the Cinematic School's Animation department, helped me understand why certain motions need clean-up in editing: the software captures some of the movement incompletely.
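The clean-up problem Lars described can be illustrated with a toy sketch (this is not the lab's actual pipeline): when a marker is occluded, some frames of its coordinate track come back empty, and a common first-pass fix is to interpolate across the gap.

```python
# Illustrative sketch: filling gaps in a 1-D marker coordinate track
# where occluded frames were recorded as None. Real tools like Shogun
# use far more sophisticated reconstruction; this only shows the idea.

def fill_gaps(track):
    """Linearly interpolate over runs of None between known frames."""
    filled = list(track)
    n = len(filled)
    i = 0
    while i < n:
        if filled[i] is None:
            start = i - 1          # last known frame before the gap
            j = i
            while j < n and filled[j] is None:
                j += 1             # j = first known frame after the gap
            if start >= 0 and j < n:
                a, b = filled[start], filled[j]
                span = j - start
                for k in range(i, j):
                    t = (k - start) / span
                    filled[k] = a + (b - a) * t
            i = j
        else:
            i += 1
    return filled

print(fill_gaps([0.0, None, None, 3.0]))  # [0.0, 1.0, 2.0, 3.0]
```

Gaps at the very start or end of a take have no frame on one side to interpolate from, which is one reason manual clean-up is still needed.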
When I received my test files from Lars, I was bummed to realize that they were not compatible with the 3D software I was using (Blender, Cinema 4D). I searched online for ways to convert the files to FBX so the programs would recognize the data, but the only answer I found was that the file has to be exported as FBX directly from Shogun. Planning to visit the station again later for the preferred file type, I decided to fall back on Mixamo's library of human motion capture.

Here is my very first experimentation with Mixamo motion capture. I used < Breakdance Freeze Var 1 > and downloaded the .fbx file. I brought the data into Cinema 4D, where I replicated the look of the dancers in Major Lazer's music video by adding a simple hair material to the existing mesh of Mixamo's character.

I wanted to push this further by adding abstract elements around the dancing man so the whole scene would read more like a performance in a real music video. I modeled 3D assets to create organic shapes and patterns, added them to the scene, and assigned unique materials that complemented the dancing man's fur. After exporting this sequence, I decided to incorporate audio into the project and matched the movement to the music.
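Matching movement to music mostly comes down to simple frame arithmetic. As a hedged sketch (the 24 fps frame rate and 120 BPM tempo below are example values, not the project's actual settings), the animation frames that land on each beat can be computed like this:

```python
# Hedged sketch: which animation frames land on musical beats.
# fps and bpm values here are assumed examples, not the project's settings.

def beat_frames(bpm, fps, num_beats):
    """Return the frame index nearest to each beat."""
    frames_per_beat = fps * 60.0 / bpm
    return [round(i * frames_per_beat) for i in range(num_beats)]

print(beat_frames(120, 24, 4))  # [0, 12, 24, 36]
```

With these numbers, a pose hit every 12 frames lines up with the beat; keyframes placed at those indices stay in sync with the track.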

Since my motion capture data from the IYA XR lab was not transferable, I decided to capture my own motion with Rokoko Vision's browser-based AI feature. I found a plain background where I could be fully framed, and performed simple motions in front of the camera so it would record the movement. Right away I noticed a significant limitation of single-camera motion capture: the resulting movement looked rather two-dimensional. Not only were some joints jittery, but at times the AI would register motion that wasn't there, or miss important parts of the movement entirely.
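The jitter from single-camera AI capture is often tamed with a smoothing pass before the data is retargeted. As an illustrative sketch (not Rokoko's actual method), a centered moving average over one joint coordinate track looks like this:

```python
# Illustrative sketch: smoothing a jittery joint coordinate track
# with a centered moving average. Not Rokoko's actual pipeline;
# this only shows the general idea of a smoothing pass.

def smooth(track, window=3):
    """Centered moving average; edges use whatever neighbors exist."""
    half = window // 2
    out = []
    for i in range(len(track)):
        lo = max(0, i - half)
        hi = min(len(track), i + half + 1)
        out.append(sum(track[lo:hi]) / (hi - lo))
    return out

print(smooth([0.0, 10.0, 0.0, 10.0, 0.0]))
```

The trade-off is visible even in this toy version: the jitter shrinks, but sharp intentional movements are dulled along with it, which is partly why my Mixamo clips felt more dynamic.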

I brought the data into Cinema 4D, where I again assigned and experimented with different materials to give the character personality. This time, instead of fur (which made the character overly dynamic in motion), I used a glass material to create the eerie illusion of a fragile glass man dancing.

This is the fully rendered animation of the glass man. My Mixamo animations were definitely more dynamic, powerful, and natural in motion, but those capture files were recorded with multiple high-quality motion cameras and specialized bodysuits. Considering that the glass man was made from data captured with a single computer camera, I am pleased with how smooth the animation turned out, and with how much useful knowledge I gained about the motion capture process and the application of motion capture data in artistic work.