Posted on Sun, Jan 27, 2013 : 5:59 a.m.

From kinesiology class to Gangnam Style: motion capture technology at Michigan 3D Lab

By Ben Freed

The directions were coming fast and furious.

“Roll your head!”

“Twist your wrists! Forearms! Whole arms!”


I was participating in a motion capture demonstration at an open house for the University of Michigan 3-D Lab located in the Duderstadt Center on North Campus.

Wearing a skintight black suit with 53 nib-like reflective markers attached, I was standing in a circle of cameras ringed with bright red lights, trying to concentrate and follow the instructions.

“Now twist your whole body! Good! Now left leg, front back side! Now the right!”

I was being "guided" through this series of motions by the lab's motion capture and visualization hardware specialist Steffen Heise.

Each of the eight cameras around me was trained to pick up the red light reflecting off my markers. Each camera was using those points of light to take a “picture” of me that would be combined with the seven other images to create a three-dimensional image.
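The article doesn't describe the lab's actual software, but the idea behind combining several cameras' views of one reflective marker is classic triangulation. As a rough illustration only (the camera matrices and marker coordinates below are invented for the example), here is how a 3-D point can be recovered from its 2-D images using the direct linear transform:

```python
import numpy as np

def triangulate(projections, points_2d):
    """Recover a 3-D point from its 2-D images in several cameras.

    projections: list of 3x4 camera projection matrices
    points_2d:   list of (u, v) image coordinates of the same marker
    Builds the homogeneous system A x = 0 and solves it by SVD.
    """
    rows = []
    for P, (u, v) in zip(projections, points_2d):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    _, _, vt = np.linalg.svd(np.array(rows))
    X = vt[-1]            # null-space vector = homogeneous 3-D point
    return X[:3] / X[3]   # dehomogenize

def project(P, X):
    """Project a 3-D point through a camera matrix to image coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two toy cameras: one at the origin, one shifted one unit sideways
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

marker = np.array([1.0, 2.0, 10.0])           # "true" marker position
views = [project(P1, marker), project(P2, marker)]
print(triangulate([P1, P2], views))            # recovers [1. 2. 10.]
```

With eight cameras instead of two, the system is overdetermined and the least-squares solution averages out noise and handles markers hidden from some viewpoints.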

Once I was through with the light calisthenics, an animated character built from that image would appear on the screen, and I would be able to control his motions.

“Now squat! Aaaand back to T-Pose!”

Just like that, I was calibrated.

For the next 20 minutes I proceeded to shimmy, do the dance from “Gangnam Style,” do the running man, and otherwise make a fool of myself while watching my real-time avatar do the exact same thing.

The motion capture system is one of a number at U-M, but it’s the only one that, like everything else in the 3-D Lab, is available to any student or faculty member.

“It’s really open to the general student population,” Heise said.


I tried to look as sophisticated as possible while trying out the motion capture suit.

Melanie Maxwell

“We’ve had students do projects with sign language, we’ve had a Kinesiology class work with the technology. We’re a part of the library, so it works like the library. It’s shared resources that everyone can use.”

Uses for the technology at the university have ranged across disciplines; anyone attempting to capture a distinct movement could benefit from the system.

“We’ve had dancers in here before,” lab manager Eric Maslowski said.

“If there’s a certain dance move that someone wants to learn, right now they can either learn from video or from an older dancer. But that older dancer is going to have put their own little variation on it. By capturing exactly where each joint is supposed to go, we can help people learn exactly what they want to know.”

Maslowski said musical conductors also have used the technology to teach precise joint motion that can be difficult to examine using two-dimensional video.

For my purposes, I was more interested in seeing what kind of crazy things I could make my animated character do. The “person” representing me on screen, who happened to be wearing only underwear, was developed as part of Maslowski’s Universal Character System.

Interactive imaging and production specialist Stephanie O’Malley, who has a background in video game design, created my particular character right at the lab. You could tell he was developed at U-M because his briefs were maize and blue.

“We’re developing different body types so people can see how different bodies can be affected by different motion,” she said.

One of O’Malley’s next projects is putting more clothes on the characters. She has already created one model that wore high heels so a class could examine the shoes’ effect on the rest of the body’s joint movements.

The system wasn’t perfect. My animated self’s wrists were constantly cocked at odd angles, one of his/my feet was consistently flipped upside down (which looks really weird, trust me), and twice the computer became confused as to which side of my pelvis was the front and which was the back.

This mix-up morphed me into an odd ultimate limbo character, with my rear grotesquely sticking out in front of me and my body bent over backwards.

Even with the minor glitches, there is a wide range of uses for the technology, from perfecting my free-throw motion (keep that elbow in!) to interacting with virtual worlds.

For the time being, however, I was content to make my avatar extremely fat and dance around waving my arms, doing my best Tevye impression while singing “If I Were a Rich Man” a little too loudly.

Ben Freed covers business for You can sign up here to receive Business Review updates every week. Reach out to Ben at 734-623-2528 or email him at Follow him on Twitter @BFreedinA2


Kai Petainen

Sun, Jan 27, 2013 : 8:13 p.m.

cool stuff! somehow i'm thinking of Gollum, Tron and the Lawnmower Man ooo.... I should go and try it out.

Ben Freed

Sun, Jan 27, 2013 : 2:54 p.m.

Jimmy (and Tiny), thanks. It's been fixed. Ben


Sun, Jan 27, 2013 : 2:27 p.m.

I came to UM to get a Ph.D. in literature, and all you got was that lousy typo in the title.


Sun, Jan 27, 2013 : 1:17 p.m.

Hi Ben - before the PhD's in Literature attack, you've got a typo in the title.