Microsoft recently released a new beta version of its SwiftKey keyboard application for Android. The new beta release introduces a feature called Puppets.
Puppets works similarly to the Animoji feature of Apple's iOS operating system. It gives SwiftKey users the option to create clips of virtual characters (puppets) animated by the recorded facial expressions of the person in front of the camera.
Microsoft introduced the feature in SwiftKey Beta for Android. The feature will roll out to all users once the beta reaches final release, but for now it is limited to users of the beta version of the keyboard application.
Interested users may download the beta APK from third-party sites like Softpedia. Note that beta versions should not be installed on production devices. An attempt to run SwiftKey Beta on a Google Pixel 3A resulted in a crash of the application; it ran fine on another Android device, however.
Here is a video by Microsoft that demonstrates the feature.
SwiftKey is a keyboard application for Android, which means it may be used in any application on the device that supports keyboard input. The Puppets feature works in any messaging application, provided that it supports the sharing of video files.
The first version of Puppets comes with five different avatars that SwiftKey users may select when they choose to create a new animation. The characters in question are a dinosaur, a panda, a cat, an owl, and a dog.
Puppets works by selecting the option in the SwiftKey application and recording one's own facial expressions, which the application uses to animate the selected avatar.
The created animation may then be shared using built-in sharing functionality.
Microsoft's SwiftKey team, which worked with the Microsoft Computer Vision and Microsoft Research Asia teams to bring Puppets to life, is especially proud of the fact that its solution relies on standard RGB cameras rather than cameras with built-in depth sensors. Unlike other facial tracking software, Puppets therefore runs on the RGB camera found in most Android smartphones; this lowers the hardware requirements and ensures that the feature can be used on nearly any Android device out there. Puppets is available on all devices running Android N or newer.
According to SwiftKey, data from "thousands of volunteers from around the world" was used to train a Deep Neural Network to "learn how to identify facial movements and transfer these onto an expressive animal character".
The Puppets algorithm worked surprisingly well during tests. While you should not expect it to mimic every facial detail, it does a good job of mimicking expressions. Android users who like to attach animated GIFs, videos, smileys, emojis, and other visuals to their messages will probably like this feature as well.
Now You: What is your take on Puppets? Gimmick or something useful?
Ghacks is a technology news blog that was founded in 2005 by Martin Brinkmann. It has since then become one of the most popular tech news sites on the Internet with five authors and regular contributions from freelance writers.