Translating body movements with artificially intelligent motion capture

Researchers created LEAP, a flexible motion-capture tool that can be trained in a matter of minutes to track body parts over millions of frames of existing video with high accuracy, without any physical markers or labels.



The tool, developed at Princeton University, is LEAP Estimates Animal Pose (LEAP). After a user hand-labels body parts in a small number of frames, LEAP trains in a matter of minutes and can then automatically track those parts over millions of frames of video with high accuracy, with no physical markers or labels attached to the animal.
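
Under the hood, pose estimators of this kind, LEAP included, use a convolutional neural network that turns each frame into one "confidence map" per body part; the part's pixel location is read off as the peak of its map. Below is a minimal sketch of that idea in Python with tf.keras; the layer sizes, frame shape, part count, and training details are illustrative assumptions, not LEAP's published architecture.

```python
# Sketch of confidence-map pose estimation: a fully convolutional network
# maps each frame to one heatmap per body part; the peak of each heatmap
# is that part's location. All sizes below are illustrative assumptions.
import numpy as np
import tensorflow as tf

N_PARTS = 6                   # e.g. head, thorax, abdomen, leg tips (assumed)
FRAME_SHAPE = (192, 192, 1)   # grayscale lab video cropped to the animal

def build_model():
    """Small encoder-decoder emitting one confidence map per body part."""
    inp = tf.keras.Input(shape=FRAME_SHAPE)
    x = tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu")(inp)
    x = tf.keras.layers.MaxPooling2D()(x)
    x = tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu")(x)
    x = tf.keras.layers.Conv2DTranspose(32, 3, strides=2, padding="same",
                                        activation="relu")(x)
    out = tf.keras.layers.Conv2D(N_PARTS, 1, activation="linear")(x)
    return tf.keras.Model(inp, out)

def gaussian_maps(coords, shape=FRAME_SHAPE[:2], sigma=5.0):
    """Training target: a 2D Gaussian bump centered on each labeled (x, y)."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    maps = np.stack(
        [np.exp(-((xx - x) ** 2 + (yy - y) ** 2) / (2 * sigma ** 2))
         for x, y in coords], axis=-1)
    return maps.astype("float32")

def peaks_to_coords(maps):
    """Inference: each part's location is the argmax of its confidence map."""
    h, w, n = maps.shape
    flat = maps.reshape(-1, n).argmax(axis=0)
    return [(int(i % w), int(i // w)) for i in flat]

# Training is just regression against the Gaussian targets, e.g.:
# model = build_model()
# model.compile(optimizer="adam", loss="mse")
# model.fit(labeled_frames, np.stack([gaussian_maps(c) for c in labels]))
```

Because the training loop is nothing more than labeling a handful of frames and fitting a small regression network, it is plausible that training completes in minutes on modern hardware, as the researchers report.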


When the researchers combine LEAP with other quantitative tools developed in their labs, they can study what they call "the language of behavior" by observing patterns in animal body movements.
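
The announcement does not spell out that downstream pipeline, but a common approach in this line of work is to convert the tracked trajectories into time-frequency features and cluster them, so that recurring movement patterns fall out as discrete behavioral "words". A minimal sketch, assuming NumPy, SciPy, and scikit-learn; the feature choices and cluster count are arbitrary illustrations, not the labs' actual code:

```python
# Turn tracked joint trajectories into discrete behavioral states:
# spectrograms capture how fast each coordinate oscillates over time,
# and clustering groups similar movement patterns together.
import numpy as np
from scipy.signal import spectrogram
from sklearn.cluster import KMeans

def behavior_states(tracks, fps=100.0, n_states=8):
    """tracks: (n_frames, n_parts, 2) array of x,y positions from the tracker."""
    n_frames, n_parts, _ = tracks.shape
    feats = []
    for p in range(n_parts):
        for d in range(2):
            # Time-frequency decomposition of one coordinate's trajectory.
            f, t, s = spectrogram(tracks[:, p, d], fs=fps,
                                  nperseg=64, noverlap=48)
            feats.append(s)
    # One feature vector per spectrogram time bin, log-scaled for stability.
    x = np.log1p(np.concatenate(feats, axis=0)).T
    labels = KMeans(n_clusters=n_states, n_init=10).fit_predict(x)
    return t, labels  # time of each bin and its behavioral state
```

Each time bin then carries a state label, and the sequence of labels over a recording becomes the raw material for studying how behaviors are strung together.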


Previous efforts to develop AI tools that could track human motion have relied on large training sets of manually annotated data, which allowed the software to work robustly on diverse kinds of footage with vastly different backgrounds and lighting conditions. LEAP trades away some of that generality: it is tuned to laboratory video, where backgrounds and lighting are consistent, which is why it can reach high accuracy after only minimal training.


"The algorithms presented in this paper have the potential to automate the labor-intensive part of our work more than has been possible previously, which could allow us to study a greater variety of animal locomotor behaviors," one researcher commented.


Once they have a database of motion and behaviors, the neuroscientists on the team can draw connections to the neural processes behind them, giving researchers a better understanding of how the brain produces behaviors.
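
As a toy example of what such a connection could look like, one can ask how a simultaneously recorded neural signal varies across the behavioral states produced by the clustering sketch above; the data and variable names here are entirely hypothetical:

```python
# Hypothetical illustration: mean neural activity during each behavioral
# state, given per-frame state labels and a recorded activity trace.
import numpy as np

def state_tuning(neural_trace, state_labels):
    """Mean neural activity within each behavioral state."""
    return {int(s): float(neural_trace[state_labels == s].mean())
            for s in np.unique(state_labels)}

# Toy data: a "neuron" that fires more during state 2.
rng = np.random.default_rng(0)
labels = rng.integers(0, 4, size=5000)
trace = rng.normal(1.0, 0.1, size=5000) + (labels == 2) * 0.5
print(state_tuning(trace, labels))  # state 2 stands out near 1.5
```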


Source


#Motion #Software #AI #Video

©2018 by Technoscious