Enter the fantastical realm of Skyrim, where a revolutionary new technology is about to transform the way you experience this iconic open-world RPG.
Nvidia has unveiled the Avatar Cloud Engine (ACE) for Games, which could bring a new level of intelligence and interactivity to the realm of dragons and adventure.
Imagine that, with ACE for Games, the non-playable characters (NPCs) of Skyrim are no longer merely scripted but dynamic entities capable of natural language interactions and immersive conversations with players.
Picture yourself stepping into the shoes of the Dragonborn and engaging in lifelike dialogue with the inhabitants of Skyrim. With ACE for Games, the NPCs come alive, responding dynamically to your queries, displaying contextual awareness and embodying their own rich lore and backstory.
Whether you seek information, companionship or assistance on your epic quests, the NPCs of Skyrim would feel more realistic and interactive than ever before, making your journey through this mythical land an unforgettable adventure.
This all looks set to become reality with Nvidia ACE for Games, a custom AI model foundry service aimed at transforming the gaming industry, announced by Nvidia CEO Jensen Huang at Computex 2023.
By leveraging the power of generative AI, ACE for Games brings intelligence and natural language interactions to NPCs, promising players a new level of immersion and interactivity within virtual game worlds.
ACE for Games enables developers of middleware, tools and games to build and deploy customised speech, conversation and animation AI models in their software and games. With ACE, developers can create engaging in-game characters that respond to player queries and interact convincingly with other characters.
John Spitzer, vice president of developer and performance technology at Nvidia, said generative AI has the potential to “revolutionise the interactivity players can have with game characters and dramatically increase immersion in games”.
Leveraging Nvidia Omniverse as the foundation, ACE for Games provides optimised AI models for speech, conversation and character animation. The key components of ACE for Games include:
- Nvidia NeMo, which allows developers to build and customise language models using proprietary data;
- Nvidia Riva for automatic speech recognition and text-to-speech capabilities;
- Nvidia Omniverse Audio2Face for creating expressive facial animation that matches speech tracks.
NeMo allows developers to customise large language models with lore and character backstories while ensuring conversations remain safe and productive through NeMo Guardrails. Riva, meanwhile, enables live speech conversations by providing automatic speech recognition and text-to-speech functionality. Audio2Face, with its Unreal Engine 5 integration, allows developers to create lifelike facial animations that synchronise seamlessly with speech, enhancing the realism and immersion of NPCs.
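To make the division of labour between these components concrete, here is a minimal Python sketch of how one turn of NPC dialogue might flow through them. The function names and data structures are hypothetical placeholders standing in for calls to the Riva, NeMo and Audio2Face services; they are not Nvidia's actual SDK APIs.

```python
# Hypothetical sketch of one turn of ACE-style NPC dialogue.
# Every function below is an illustrative placeholder, not a real Nvidia API:
# each stands in for a call to the corresponding service (Riva ASR/TTS,
# a NeMo-customised language model with Guardrails, and Omniverse Audio2Face).

from dataclasses import dataclass


@dataclass
class NpcPersona:
    """Lore and backstory a developer would bake into the language model via NeMo."""
    name: str
    backstory: str


def transcribe_with_riva(player_audio: bytes) -> str:
    """Placeholder for Riva automatic speech recognition (player speech -> text)."""
    raise NotImplementedError("stand-in for the Riva ASR service")


def query_npc_llm(persona: NpcPersona, player_text: str) -> str:
    """Placeholder for a NeMo-customised LLM, constrained by NeMo Guardrails."""
    raise NotImplementedError("stand-in for the NeMo inference service")


def synthesize_with_riva(npc_text: str) -> bytes:
    """Placeholder for Riva text-to-speech (NPC text -> audio)."""
    raise NotImplementedError("stand-in for the Riva TTS service")


def animate_with_audio2face(npc_audio: bytes) -> dict:
    """Placeholder for Audio2Face, which derives facial animation from the speech track."""
    raise NotImplementedError("stand-in for the Audio2Face service")


def npc_dialogue_turn(persona: NpcPersona, player_audio: bytes) -> tuple[bytes, dict]:
    """One conversational turn: player speech in, NPC speech and facial animation out."""
    player_text = transcribe_with_riva(player_audio)   # Riva ASR
    npc_text = query_npc_llm(persona, player_text)      # NeMo LLM + Guardrails
    npc_audio = synthesize_with_riva(npc_text)          # Riva TTS
    animation = animate_with_audio2face(npc_audio)      # Audio2Face
    return npc_audio, animation
```

In a real integration, each placeholder would be replaced by the corresponding Nvidia service call, with the resulting animation data streamed into the game engine, for example via Audio2Face's Unreal Engine 5 integration.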
During a collaboration with Convai, an Nvidia Inception start-up focused on conversational AI for virtual game worlds, Nvidia showcased the capabilities of ACE for Games through a demo called “Kairos”. In the demo, the player interacted with an NPC named Jin, who runs a ramen shop. Jin responded realistically to natural language queries, maintaining consistency with the narrative backstory thanks to the power of generative AI.
True marvel
From the single video released to showcase the conversation, it can be difficult to perceive the significant improvements over traditional NPC dialogue trees. The true marvel, however, is that the generative AI responds to natural speech, a remarkable achievement in itself. It is hoped that Nvidia will make the demo publicly available, so people can experience it first hand and witness the potential for diverse and unexpected outcomes.
The neural networks powering Nvidia ACE for Games are optimised for various capabilities, offering trade-offs between file size, performance and quality. Developers can fine-tune the models for their specific games and deploy them in real time using Nvidia DGX Cloud, GeForce RTX PCs or on-premises systems.
GSC Game World, one of Europe’s top game developers, plans to adopt Audio2Face in its highly anticipated game, S.T.A.L.K.E.R. 2: Heart of Chornobyl. Indie game developer Fallen Leaf is utilising Audio2Face for character facial animation in its upcoming third-person sci-fi thriller, Fort Solis, set on Mars. Charisma.ai, a company specialising in virtual characters powered by AI, is leveraging Audio2Face to enhance the animation in its conversation engine. – © 2023 NewsCentral Media