At Unreal Fest 2024, NVIDIA released new Unreal Engine 5 on-device plugins for NVIDIA ACE, making it easier to build and deploy AI-powered MetaHuman characters on Windows PCs. ACE is a suite of digital human technologies that provide speech, intelligence, and animation powered by generative AI.
Developers can also now access a new Audio2Face-3D plugin for AI-powered facial animation in Autodesk Maya. The plugin provides a simple, streamlined interface that makes developing avatars in Maya easier and faster. It ships with source code, so you can dive in and adapt it into a plugin for the digital content creation (DCC) tool of your choice.
NVIDIA has also built an Unreal Engine 5 renderer microservice that leverages Epic’s Unreal Pixel Streaming technology. This microservice now supports the NVIDIA ACE Animation Graph Microservice and the Linux operating system in early access. The Animation Graph Microservice enables realistic and responsive character movements, and with Unreal Pixel Streaming support, you can stream your MetaHuman creations to any device.
The NVIDIA ACE Unreal Engine 5 sample project serves as a guide for developers looking to integrate ACE into their games and applications. The sample project now includes an expanded set of on-device ACE plugins:
- Audio2Face-3D for lip sync and facial animation
- Nemotron Mini 4B Instruct for response generation
- Retrieval-augmented generation (RAG) for contextual information
As a developer, you can build a database for your intellectual property, generate relevant responses at low latency, and have those responses drive the corresponding MetaHuman facial animations seamlessly in Unreal Engine 5. Each of these microservices is optimized to run on Windows PCs with low latency and a minimal memory footprint.
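To make the retrieval step concrete, the following is a minimal, self-contained Python sketch of the RAG idea: embed a small lore database, pull the passages closest to a player's question, and assemble a prompt for the response model. It is only an illustration of the technique, not the ACE plugin API; the toy embed function, the sample passages, and the prompt format are placeholders, and in the sample project the on-device plugins handle retrieval and Nemotron Mini 4B Instruct inference for you.

```python
# Conceptual RAG sketch: retrieve the most relevant lore passages for a
# player question and build a prompt for a response-generation model.
# The embedding function and data below are illustrative placeholders.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Toy bag-of-characters embedding; replace with a real embedding model."""
    vec = np.zeros(256)
    for ch in text.lower():
        vec[ord(ch) % 256] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# Small knowledge base about a fictional game world (your IP would go here).
passages = [
    "Captain Mira guards the northern gate of Emberfall.",
    "The Emberfall forge reopens only during the Festival of Ash.",
    "Traders reach Emberfall by the old river road.",
]
passage_vecs = np.stack([embed(p) for p in passages])

def retrieve(question: str, k: int = 2) -> list[str]:
    # Cosine similarity via dot product, since all vectors are normalized.
    scores = passage_vecs @ embed(question)
    top = np.argsort(scores)[::-1][:k]
    return [passages[i] for i in top]

question = "How do I get into Emberfall?"
context = "\n".join(retrieve(question))
prompt = f"Context:\n{context}\n\nPlayer: {question}\nCharacter:"
print(prompt)  # This prompt would be passed to the response-generation model.
```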
These new plugins are coming soon. Once they are available, download the appropriate ACE plugin and the Unreal Engine sample project, along with a MetaHuman character, to get started.
Streamline 3D animation with the Maya ACE plugin
Autodesk Maya offers high-performance animation tools that game developers and technical artists use to create high-quality 3D graphics. With the Audio2Face-3D plugin, you can now generate high-quality, audio-driven facial animation for any character more easily. The streamlined user interface enables you to seamlessly transition to the Unreal Engine 5 environment, and the source code and scripts are highly customizable, so you can adapt them for use in other digital content creation tools.
To get started, generate an API key or download the Audio2Face-3D microservice. The Audio2Face-3D microservice is part of NVIDIA NIM, a set of easy-to-use microservices that speed up the deployment of foundation models on any cloud or data center.
Next, ensure you have Autodesk Maya 2023, 2024, or 2025. Access the NVIDIA/Maya-ACE GitHub repo, which includes the Maya plugin, gRPC client libraries, test assets, and a sample scene—everything you need to explore, learn, and innovate with Audio2Face-3D.
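As a quick orientation, here is a minimal Maya Python sketch showing how you might load the plugin and open the bundled sample scene before exploring the Audio2Face-3D setup. The plugin name and scene path below are assumptions for illustration; check the NVIDIA/Maya-ACE repo for the actual file names and setup instructions for your Maya version.

```python
# Minimal sketch: load the Maya ACE plugin and open the sample scene.
# PLUGIN_NAME and SAMPLE_SCENE are assumptions; see the NVIDIA/Maya-ACE
# repo for the real plugin module name and asset paths.
import maya.cmds as cmds

PLUGIN_NAME = "MayaACE"                                     # hypothetical plugin name
SAMPLE_SCENE = "C:/src/Maya-ACE/sample_scenes/a2f_demo.ma"  # hypothetical path in your checkout

# Load the plugin only if it is not already active.
if not cmds.pluginInfo(PLUGIN_NAME, query=True, loaded=True):
    cmds.loadPlugin(PLUGIN_NAME)

# Open the sample scene shipped with the repo to inspect the Audio2Face-3D setup.
cmds.file(SAMPLE_SCENE, open=True, force=True)
```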
Scale digital human technology deployment with UE5 Pixel Streaming
When deploying digital human technology through the cloud, the goal is to reach as many customers as possible at once, and streaming high-fidelity characters requires significant compute resources. The latest Unreal Engine 5 renderer microservice in NVIDIA ACE adds support for the NVIDIA Animation Graph Microservice and the Linux operating system in early access.
Animation Graph is a microservice that interacts with other AI models to create a conversational pipeline for characters. It connects to developer RAG architectures and maintains both context and conversational history. With the new UE5 Pixel Streaming compatibility, you can run a MetaHuman character on a server in the cloud and stream its rendered frames and audio to any browser or edge device over Web Real-Time Communication (WebRTC).
Get started
To get started with the Unreal Engine 5 renderer microservice, apply for early access. Learn more about NVIDIA ACE and download the NIM microservices to begin building game characters powered by generative AI.