There are many applications of motion capture. The most common are in video games, movies, and movement capture; the technology also has research applications, such as robotics development at Purdue University.
==Video games==
Video games often use motion capture to animate athletes,
martial artists, and other in-game characters. As early as 1988, a primitive form of motion capture was used to animate the
2D player characters of
Martech's video game
Vixen (performed by model
Corinne Russell) and
Magical Company's 2D arcade
fighting game Last Apostle Puppet Show (to animate digitized
sprites). Motion capture was later notably used to animate the
3D character models in the
Sega Model arcade games Virtua Fighter (1993) and
Virtua Fighter 2 (1994). In mid-1995, developer/publisher
Acclaim Entertainment had its own in-house motion capture studio built into its headquarters. Motion capture was also used to animate characters in games such as
Naughty Dog's
Crash Bandicoot,
Insomniac Games'
Spyro the Dragon, and
Rare's
Dinosaur Planet.
==Robotics==
Indoor positioning is another application of optical motion capture systems. Robotics researchers often use motion capture systems when developing and evaluating control, estimation, and perception algorithms and hardware. In outdoor spaces, it is possible to achieve centimeter-level accuracy by using the Global Navigation Satellite System (GNSS) together with Real-Time Kinematic (RTK) positioning. However, accuracy degrades significantly when there is no line-of-sight to the satellites, such as in indoor environments. The majority of vendors of commercial optical motion capture systems provide accessible open-source drivers that integrate with the popular Robot Operating System (ROS) framework, allowing researchers and developers to test their robots effectively during development. In the field of aerial robotics research, motion capture systems are also widely used for positioning. Regulations on airspace usage limit the feasibility of outdoor experiments with Unmanned Aerial Systems (UAS); indoor tests can circumvent such restrictions. Many labs and institutions around the world have built indoor motion capture volumes for this purpose. Purdue University houses the world's largest indoor motion capture system, inside the Purdue UAS Research and Test (PURT) facility. PURT is dedicated to UAS research and provides a tracking volume of 600,000 cubic feet using 60 motion capture cameras. The optical motion capture system can track targets in its volume with millimeter accuracy, effectively providing the true position of targets: the "ground truth" baseline in research and development. Results derived from other sensors and algorithms can then be compared to the ground truth data to evaluate their performance.
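In practice, such a comparison typically means time-aligning the trajectory estimated by the robot's own sensors with the motion capture track and computing an error metric such as the root-mean-square error (RMSE). A minimal sketch of that final step (function and variable names are illustrative, not taken from any particular mocap vendor's API or from ROS):

```python
import math

def rmse(ground_truth, estimate):
    """Root-mean-square error between two time-aligned 3D trajectories.

    ground_truth, estimate: equal-length lists of (x, y, z) tuples in
    metres, sampled at the same timestamps. Mocap systems typically run
    at hundreds of Hz, so in practice the onboard estimate is first
    interpolated onto the mocap clock; that step is omitted here.
    """
    assert len(ground_truth) == len(estimate)
    squared_errors = [
        (gx - ex) ** 2 + (gy - ey) ** 2 + (gz - ez) ** 2
        for (gx, gy, gz), (ex, ey, ez) in zip(ground_truth, estimate)
    ]
    return math.sqrt(sum(squared_errors) / len(squared_errors))

# Toy example: a drone flying a straight line at 1.5 m altitude; the
# onboard estimate drifts by a constant 2 mm offset on the x axis.
truth = [(0.1 * i, 0.0, 1.5) for i in range(100)]
drifted = [(x + 0.002, y, z) for (x, y, z) in truth]
print(rmse(truth, drifted))  # constant 2 mm offset -> RMSE of 0.002
```

Real evaluations usually go further (rigid alignment of coordinate frames, separate rotation error metrics), but the principle is the same: the mocap track is treated as truth and the estimator's output is scored against it.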
==Movies==
Movies use motion capture for CGI effects, in some cases replacing traditional cel animation, and for completely
CGI creatures, such as
Gollum,
The Mummy,
King Kong,
Davy Jones from
Pirates of the Caribbean, the
Na'vi from the film
Avatar, and Clu from
Tron: Legacy. The Great Goblin, the three
Stone-trolls, many of the orcs and goblins in the 2012 film
The Hobbit: An Unexpected Journey, and
Smaug were created using motion capture. In the 1990 movie
Total Recall, the sequence where
Arnold Schwarzenegger and other actors walk through an X-ray machine was shot using motion capture. The film
Batman Forever (1995) used some motion capture for certain visual effects.
Warner Bros. had acquired motion capture technology from
arcade video game company Acclaim Entertainment for use in the film's production. Acclaim's 1995
video game of the same name also used the same motion capture technology to animate the digitized
sprite graphics. The 1999 film
Star Wars: Episode I – The Phantom Menace was the first feature-length film to include a main character created using motion capture (Jar Jar Binks, played by Ahmed Best). The 2000
Indian-
American film
Sinbad: Beyond the Veil of Mists was the first feature-length film made primarily with motion capture, although many character animators also worked on the film, which had a very limited release. 2001's
Final Fantasy: The Spirits Within was the first widely released movie to be made with motion capture technology. Despite its poor box-office performance, supporters of motion capture technology took notice.
The Lord of the Rings: The Two Towers was the first feature film to utilize a real-time motion capture system. This method streamed the actions of actor
Andy Serkis into the computer-generated skin of Gollum / Sméagol as it was being performed. Storymind Entertainment, an independent Ukrainian studio, created a neo-noir third-person shooter called My Eyes On You, using motion capture to animate its main character, Jordan Adalien, along with non-playable characters. Of the three nominees for the 2006
Academy Award for Best Animated Feature, two (
Monster House and the winner
Happy Feet) used motion capture, and only
Disney·Pixar's
Cars was animated without it. In the ending credits of
Pixar's film
Ratatouille, a stamp appears labelling the film as "100% Genuine Animation – No Motion Capture!" Since 2001, motion capture has been used extensively to simulate or approximate the look of live-action cinema, with nearly
photorealistic digital character models.
The Polar Express used motion capture to allow
Tom Hanks to perform as several distinct digital characters (for which he also provided the voices). The 2007 adaptation of the saga
Beowulf animated digital characters whose appearances were based in part on the actors who provided their motions and voices. James Cameron's highly popular
Avatar used this technique to create the Na'vi that inhabit Pandora.
The Walt Disney Company has produced
Robert Zemeckis's
A Christmas Carol using this technique. In 2007, Disney acquired Zemeckis'
ImageMovers Digital (which produced motion capture films), but closed it in 2011 after the box-office failure of
Mars Needs Moms. Television series produced entirely with motion capture animation include
Laflaque in Canada, Sprookjesboom and Cafe de Wereld in the Netherlands, and Headcases in the UK.
==Movement capture==
Virtual reality and
augmented reality providers, such as
uSens and
Gestigon, allow users to interact with digital content in real time by capturing hand motions. This can be useful for training simulations, visual perception tests, or performing virtual walk-throughs in a 3D environment. Motion capture technology is frequently used in
digital puppetry systems to drive computer-generated characters in real time.
Gait analysis is one application of motion capture in
clinical medicine. Techniques allow clinicians to evaluate human motion across several biomechanical factors, often while streaming this information live into analytical software. One innovative use is pose detection, which can support patients during post-surgical recovery or rehabilitation after injuries. This approach enables continuous monitoring, real-time guidance, and individually tailored programs to enhance patient outcomes. Some physical therapy clinics utilize motion capture as an objective way to quantify patient progress.

During the filming of James Cameron's
Avatar all of the scenes involving motion capture were directed in real-time using
Autodesk MotionBuilder software to render a screen image which allowed the director and the actor to see what they would look like in the movie, making it easier to direct the movie as it would be seen by the viewer. This method allowed views and angles not possible from a pre-rendered animation. Cameron was so proud of his results that he invited
Steven Spielberg and
George Lucas on set to view the system in action. In Marvel's
The Avengers, Mark Ruffalo used motion capture to play his character
the Hulk, rather than having the character rendered purely in CGI as in previous films, making Ruffalo the first actor to play both the human and Hulk versions of Bruce Banner.
FaceRig software uses facial recognition technology from ULSee Inc. to map a player's facial expressions, and body-tracking technology from Perception Neuron to map body movement, onto a 2D or 3D character's motion on-screen. During the
Game Developers Conference 2016 in San Francisco,
Epic Games demonstrated full-body motion capture live in Unreal Engine. The whole scene, from the upcoming game
Hellblade about a woman warrior named Senua, was rendered in real-time. The keynote was a collaboration between
Unreal Engine,
Ninja Theory,
3Lateral,
Cubic Motion,
IKinema and
Xsens. In 2020, the
two-time Olympic figure skating champion Yuzuru Hanyu graduated from
Waseda University. In his thesis, he analysed his jumps using data from 31 sensors placed on his body, and evaluated the use of the technology both to improve the scoring system and to help skaters improve their jumping technique. In March 2021, a summary of the thesis was published in an academic journal.

==Methods and systems==