Using Apple’s ARKit and iPhone’s TrueDepth Camera
A recently released iOS app from Epic Games lets developers record facial expressions and import them directly into Unreal Engine, using little more than an iPhone’s front camera. The app, Live Link Face, which is available to download now from the App Store, can stream facial animation data in real time directly to characters in Unreal Engine — a workflow Epic hopes will make facial capture “easier and more accessible” to creators.
Epic’s app is built on a couple of existing Apple technologies: the ARKit augmented reality platform and the TrueDepth camera that Apple introduced with the iPhone X in 2017. It’s the same technology that powers Apple’s Animoji and Memoji, which map a user’s facial expressions onto cartoon avatars. Now it can also be used to animate characters in the engine that powers many of the world’s most popular games.
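To give a sense of what ARKit exposes to apps like this, here is a minimal Swift sketch (assuming a device with a TrueDepth camera). ARKit’s face tracking delivers a set of “blend shape” coefficients each frame — values from 0.0 to 1.0 describing individual facial movements — which a capture app can read and forward to an animation target. The `FaceTracker` class name is illustrative, not from Epic’s app:

```swift
import ARKit

// Sketch only: ARKit face tracking requires a device with a TrueDepth camera.
class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking is unavailable on devices without TrueDepth hardware.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // blendShapes maps named expressions to 0.0–1.0 coefficients,
            // e.g. .jawOpen, .eyeBlinkLeft, .browInnerUp.
            let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            let blinkLeft = faceAnchor.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
            // A capture app would stream these values to an animation rig
            // rather than print them.
            print("jawOpen: \(jawOpen), eyeBlinkLeft: \(blinkLeft)")
        }
    }
}
```

These per-expression coefficients are the same data Animoji and Memoji consume, which is why the approach translates so directly to driving a character rig in a game engine.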
Epic is not the first company to think of using Apple’s technology as an animation tool. It didn’t take long for developers to start making facial capture apps after the iPhone X launched, and we also saw it used to generate facial expressions for a Walking Dead augmented reality game in 2018. But having the functionality built directly into Unreal Engine, which is used by millions of developers around the world, could give it a much wider reach and make it easier for artists to incorporate into their work.
Epic says its Live Link Face app can work in settings ranging from an artist’s home office to a multi-actor soundstage where performers wear motion capture suits and head-mounted rigs, and that it scales between them. The app can also be controlled remotely, so Epic says you can configure multiple iPhones to start recording simultaneously with a single command. Once imported, the facial animation data can be adjusted in the engine.
If you’re a developer interested in testing the app for yourself, Epic’s documentation for the feature can be found here.