NVIDIA reveals ACE for Games to give NPCs ChatGPT-like chat features with matching animation
During today's keynote address at Computex in Taiwan, NVIDIA CEO Jensen Huang, now some $7 billion richer, showed off a new technology for game developers. Naturally, it involves AI and cloud servers, which have become the company's big new focus.
The new technology is called NVIDIA ACE (Avatar Cloud Engine) for Games. It lets developers create NPCs that converse with player characters in real time, with unscripted dialogue generated on the fly by AI chatbots similar to ChatGPT, Bing Chat, and Bard. The technology also matches the NPCs' facial animations to that unscripted dialogue.
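To make the idea concrete, here is a minimal sketch of what unscripted NPC dialogue means in practice: the player's line and a short persona prompt go to a hosted chat model, and the reply comes back as the character's next line. NVIDIA has not published ACE's developer-facing API, so the OpenAI client, model name, and persona (loosely borrowed from the ramen-shop character in NVIDIA's demo) are purely illustrative assumptions.

```python
# Illustrative only: NOT the ACE API. Uses the OpenAI Python SDK (>=1.0)
# as a stand-in for "an AI chatbot similar to ChatGPT".
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

NPC_PERSONA = (
    "You are Jin, the owner of a ramen shop in a cyberpunk city. "
    "Stay in character and answer in one or two short sentences."
)

def npc_reply(player_line: str) -> str:
    """Generate the NPC's next line from the player's input."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # any chat-capable model would do
        messages=[
            {"role": "system", "content": NPC_PERSONA},
            {"role": "user", "content": player_line},
        ],
    )
    return response.choices[0].message.content

print(npc_reply("Heard any rumors around the neighborhood lately?"))
```

In a full pipeline, speech recognition would feed `npc_reply`, and text-to-speech plus facial animation (see Audio2Face below) would play the result back to the player.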
NVIDIA stated:
NVIDIA also said these AI NPCs are constrained by NeMo Guardrails, which should keep them from saying weird or even offensive things to gamers.
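NeMo Guardrails is an open-source NVIDIA library, so the general shape of such a rail can be sketched. The Colang rules and model configuration below are our own illustrative assumptions, not NVIDIA's actual ACE setup:

```python
# A minimal sketch of NeMo Guardrails steering an NPC away from an
# off-limits topic. Rules and model choice are illustrative assumptions.
from nemoguardrails import LLMRails, RailsConfig

colang = """
define user ask about real-world politics
  "who should I vote for?"
  "what do you think of the president?"

define bot deflect in character
  "I just make ramen, friend. Politics is bad for the broth."

define flow politics
  user ask about real-world politics
  bot deflect in character
"""

yaml = """
models:
  - type: main
    engine: openai
    model: gpt-3.5-turbo-instruct
"""

config = RailsConfig.from_content(colang_content=colang, yaml_content=yaml)
rails = LLMRails(config)

result = rails.generate(messages=[
    {"role": "user", "content": "Who should I vote for?"}
])
print(result["content"])
```

When the player's input matches a guarded topic, the canned in-character deflection is returned instead of whatever the underlying model might otherwise have said.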
The company showed off a brief demo of ACE, posted on YouTube, built in Unreal Engine 5 with ray tracing enabled and MetaHuman tech for the NPC character model. NVIDIA also used technology from Convai, a startup building AI characters for games. NVIDIA added:
The AI NPC shown in the demo is definitely not perfect. His speech pattern seemed stilted and, dare we say, artificial. However, delivery will more than likely improve and sound more natural in the months and years ahead.
NVIDIA did not state when ACE for Games will be available to game developers. However, it did mention that its Audio2Face technology, which matches the facial animation to a game character's speech, is being used in two upcoming games: the third-person sci-fi game Fort Solis, and the long-awaited post-apocalyptic FPS sequel S.T.A.L.K.E.R. 2: Heart of Chornobyl.
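Audio2Face itself is proprietary Omniverse technology with no public source, but the underlying idea, deriving facial animation from speech, can be illustrated with a toy phoneme-to-viseme mapping. Every name below is hypothetical:

```python
# A toy illustration of speech-driven facial animation (NOT Audio2Face's
# API). The idea: a speech analyzer yields timed phonemes, which map to
# viseme blendshape weights that the engine blends per frame.

# Phoneme -> (blendshape name, weight); a real rig has dozens of shapes.
VISEME_MAP = {
    "AA": ("jaw_open", 0.8),
    "M":  ("lips_closed", 1.0),
    "F":  ("lower_lip_bite", 0.7),
    "OW": ("lips_round", 0.9),
}

def blendshapes_at(time_s, timed_phonemes):
    """Return blendshape weights for the phoneme active at `time_s`.

    `timed_phonemes` is a list of (start_s, end_s, phoneme) tuples, as a
    speech-to-phoneme stage might produce.
    """
    for start, end, phoneme in timed_phonemes:
        if start <= time_s < end:
            shape, weight = VISEME_MAP.get(phoneme, ("neutral", 0.0))
            return {shape: weight}
    return {"neutral": 0.0}

# Example: the NPC says "mow" -> phonemes M, OW.
phonemes = [(0.00, 0.12, "M"), (0.12, 0.40, "OW")]
print(blendshapes_at(0.05, phonemes))  # {'lips_closed': 1.0}
print(blendshapes_at(0.20, phonemes))  # {'lips_round': 0.9}
```

Roughly speaking, Audio2Face does this with a neural network over raw audio rather than a lookup table, but the output driving the character's face is the same kind of per-frame blendshape data.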