Computex is awash with new products, but Nvidia ACE may be one of the hottest bits of news to come from the event so far – particularly for gaming. The graphics (and, increasingly, AI) behemoth is looking to use generative AI to improve game experiences. But what is ACE, and when might we see it being used in games?
What is Nvidia ACE for Games?
Nvidia ACE, or Avatar Cloud Engine, is a collection of models for creating non-playable characters (NPCs) using generative AI. For anyone unfamiliar with generative AI, it describes models that generate apparently new outputs in response to prompts, based on their training data. For example, ChatGPT provides ‘new’ and natural-sounding responses to questions this way. Similarly, DALL-E, Midjourney, and Stable Diffusion can create new images based on requests. ACE comprises three models:
- Nvidia NeMo – which focuses on character backstories and personalities
- Nvidia Riva – for automatic speech recognition (ASR) and text-to-speech (TTS)
- Nvidia Omniverse Audio2Face – instant generation of expressions and facial animations, triggered by audio sources
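Conceptually, the three models chain into a single NPC interaction loop: Riva transcribes the player’s speech, NeMo generates an in-character reply, Riva speaks it aloud, and Audio2Face animates the character’s face from that audio. The sketch below illustrates that flow only – every function in it is a hypothetical stand-in stub, not Nvidia’s actual API:

```python
# Hypothetical sketch of how the ACE models could chain together.
# Every function here is a placeholder stub, NOT a real Nvidia API call.

def speech_to_text(audio: bytes) -> str:
    # Stand-in for Riva ASR: player voice audio -> text.
    return "Have you seen any wolves around here?"

def generate_reply(player_text: str, backstory: str) -> str:
    # Stand-in for a NeMo language model conditioned on the
    # character's backstory and personality.
    return f"({backstory}) Aye, the woods are thick with them tonight."

def text_to_speech(reply: str) -> bytes:
    # Stand-in for Riva TTS: reply text -> synthesized voice audio.
    return reply.encode("utf-8")

def animate_face(audio: bytes) -> list[str]:
    # Stand-in for Audio2Face: audio -> per-frame facial animation data.
    return [f"frame_{i}" for i in range(3)]

def npc_interaction(player_audio: bytes, backstory: str) -> tuple[bytes, list[str]]:
    """One unscripted NPC exchange: hear, think, speak, emote."""
    heard = speech_to_text(player_audio)
    reply = generate_reply(heard, backstory)
    voice = text_to_speech(reply)
    frames = animate_face(voice)
    return voice, frames

voice, frames = npc_interaction(b"...", "gruff hunter")
```

The point of the ordering is that the facial animation is driven by the generated audio rather than by pre-authored animation clips, which is what lets responses stay unscripted end to end.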
Together, these models can create dynamic NPCs with which, Nvidia suggests, players can have “intelligent, unscripted and dynamic conversations”. Nvidia ACE is a custom service built on “AI-powered natural language interactions”, and it has huge gaming implications. In announcing it, Nvidia’s John Spitzer stated, “Generative AI has the potential to revolutionize the interactivity players can have with game characters and dramatically increase immersion in games”.
He’s not wrong. The long and short of this is that games could feature realistic non-playable characters who interact with players in a natural and unscripted way, while also having personalities, facial animations, and expressions of their own. It’s exciting stuff, and it’s not too far away either.
We can only hope Nvidia ACE helps see the end of wooden NPCs – and adds more variety to the “go and collect me… even more wolf pelts” dialogs of yore. In fact, we’ll see it in action in the not-too-distant future, as S.T.A.L.K.E.R. 2: Heart of Chornobyl will make use of Audio2Face.
Nvidia ACE tools are already finding their way into studios: GSC Game World is using Audio2Face in Heart of Chornobyl, as is indie developer Fallen Leaf in its title Fort Solis. Beyond that, access to ACE currently runs through a limited early access program for developers wanting to work with Nvidia on the models. It is invite-only, but developers can apply for access, so in theory it’s not completely restricted.
Nvidia ACE release date prediction
Given that some studios are already working with Nvidia ACE, pinning down when it will be fully released as a usable suite of models is tricky – not least because Nvidia will want to do due diligence on who has access to its generative AI tech. We’d expect the current early access program to serve as an initial preview period, meaning the full Nvidia ACE release could come in December – which happens to be when S.T.A.L.K.E.R. 2 is due to release.