Robots Learn to Track Human Body Language

Published July 24, 2017, 12:45 pm

(IEEE Spectrum)  Researchers at Carnegie Mellon University have developed a body-tracking system called OpenPose that can track body movement, including the hands and face, in real time. It uses computer vision and machine learning to process video frames, and it can keep track of multiple people simultaneously. This capability could ease human-robot interaction and pave the way for more interactive virtual and augmented reality as well as more intuitive user interfaces.
One notable feature of OpenPose is that it tracks not only a person's head, torso, and limbs but also individual fingers. This technology could be applied to a wide range of human-machine interactions. It could play a significant role in VR experiences, enabling finer detection of the user's physical movement without any additional hardware, such as stick-on sensors or gloves.
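To make the idea of frame-by-frame keypoint detection concrete, here is a minimal sketch, not the researchers' own pipeline, of running a pretrained OpenPose-style body model on a webcam feed through OpenCV's DNN module. The model file names, the 368x368 input size, and the 0.1 confidence threshold are assumptions for illustration, and this simplified version picks only the single most confident location per body part rather than grouping keypoints across multiple people as the full system does.

```python
import cv2

# Assumed local paths to a pretrained OpenPose-style Caffe body model.
PROTO_FILE = "pose_deploy_linevec.prototxt"
WEIGHTS_FILE = "pose_iter_440000.caffemodel"
N_KEYPOINTS = 18  # COCO body model: nose, neck, shoulders, elbows, wrists, hips, knees, ankles, eyes, ears

net = cv2.dnn.readNetFromCaffe(PROTO_FILE, WEIGHTS_FILE)

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]

    # Convert the frame into the network's expected input blob (assumed 368x368 input).
    blob = cv2.dnn.blobFromImage(frame, 1.0 / 255, (368, 368), (0, 0, 0), swapRB=False, crop=False)
    net.setInput(blob)
    out = net.forward()  # shape (1, channels, H', W'); first N_KEYPOINTS channels are confidence maps

    # Single-person simplification: take the most confident location per body part.
    # The full OpenPose system additionally uses Part Affinity Fields to group
    # keypoints into distinct people.
    for i in range(N_KEYPOINTS):
        heatmap = out[0, i, :, :]
        _, conf, _, point = cv2.minMaxLoc(heatmap)
        x = int(w * point[0] / out.shape[3])
        y = int(h * point[1] / out.shape[2])
        if conf > 0.1:  # assumed confidence threshold
            cv2.circle(frame, (x, y), 4, (0, 255, 0), -1)

    cv2.imshow("pose", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```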
