Google lets you test body tracking AI with "Move Mirror"
Google has shared one of its more casual projects with the world, releasing a site where you can play around with one of its AI experiments. The "experiment" is called Move Mirror, and it uses your webcam and AI algorithms to compare your current pose against a database of more than 80,000 images. Tracking happens in real time, and the "Mirror" switches between pictures multiple times a second, depending on how quickly you change poses.
According to Google, the goal of Move Mirror is to highlight the capabilities of pose estimation using only a webcam and a computer. The company hopes that after seeing how lightweight and easy to use the technology is, more developers will be willing to try it out and integrate it into their products.
The body-tracking part of the "mirror" is powered by PoseNet, a machine learning model that detects and tracks 17 points on a person's body: 12 joints and 5 points on the head. These keypoints are then matched against the picture database, and the closest result is displayed next to your webcam feed.
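To get a feel for how such matching could work, here is a hypothetical sketch in plain JavaScript. It is not Move Mirror's actual code: it assumes each pose has been flattened into a vector of keypoint coordinates, and it uses cosine similarity over L2-normalized vectors, a common way to compare poses regardless of how large the person appears in frame.

```javascript
// Hypothetical sketch: each pose is a flat array of keypoint
// coordinates [x0, y0, x1, y1, ...] (34 numbers for 17 keypoints).

// L2-normalize a vector so poses at different scales become comparable.
function normalize(v) {
  const len = Math.sqrt(v.reduce((sum, x) => sum + x * x, 0));
  return v.map((x) => x / len);
}

// Cosine similarity of two normalized vectors: 1 means identical pose.
function cosineSimilarity(a, b) {
  return a.reduce((sum, x, i) => sum + x * b[i], 0);
}

// Scan the database for the pose closest to the query pose.
function closestPose(query, database) {
  const q = normalize(query);
  let best = { index: -1, score: -Infinity };
  database.forEach((pose, index) => {
    const score = cosineSimilarity(q, normalize(pose));
    if (score > best.score) best = { index, score };
  });
  return best;
}
```

A linear scan like `closestPose` would be slow over 80,000 images; a real system would likely use a nearest-neighbor index instead, but the similarity measure stays the same.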
Now, before you say it's all part of Google's master plan to learn everything about us: Move Mirror uses TensorFlow.js, so all the processing happens in your device's browser. No data leaves your computer or smartphone, and nothing is sent to Google's servers. The model is open source, so if you're interested, you can peek inside to see how it's done.
We tested it, and despite it occasionally having a hard time detecting a wrist, it worked as advertised and provided multiple minutes of entertainment. The site has a handy "Make a GIF" button, so you can share the goofiness with your friends.