AI is coming to games, whether you like it or not. Last night’s Nvidia keynote showed just how powerful—and devastating—that’s going to be. The company’s CEO, Jensen Huang, showed off how its freshly announced “Omniverse Avatar Cloud Engine” (ACE) can create real-time interactive AI NPCs, complete with improvised voiced dialogue and facial animation.

While the focus of AI’s incursion into gaming spaces has so far mostly been on the effects for artists, it’s writers who should already have the most to fear. Given how mediocre the standards are for NPC dialogue and quest text in games, it’s absolutely inevitable that the majority of such content will be AI-written in the near future, despite the fury and protests that will come in its wake. But Nvidia’s reveal last night suggests that the consequences could be far more far-reaching, soon replacing voice actors, animators, lighting teams, the lot.

ACE is Nvidia’s “suite of real-time solutions” for in-game avatars, using AI to create characters who can respond to unique player interaction, in character, voiced, and with facial expressions and lip-syncing to match. To see it in action (or at least, in purported action; we’ve no way of verifying the footage the company played during the Computex 2023 keynote), take a look at the video below. It should start at the 25-minute mark, with the relevant clip beginning at around 27 minutes:

Video: NVIDIA Taiwan / YouTube

So what you’re seeing here is an in-game character responding in real time to words the player says out loud, tailored to how they phrased the question, with bespoke dialogue and animation. The character has a backstory, and a mission it’s compelled to impart, but beyond that the rest is “improvisation,” based on the words the player says to it.

This is the most immediately obvious use of ChatGPT-like AI as we currently understand it, which is essentially a predictive text model writ large. It’s ideal for creating characters that can deliver coherent, relevant conversational dialogue based on player input.
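To make that concrete, here’s a minimal sketch of how a chat-model-driven NPC could be wired up in principle. To be clear, this is an assumption-laden illustration, not Nvidia’s actual ACE interface: the generate() placeholder, the character sheet, and the prompt format are all invented for the example.

```python
# Illustrative sketch only: this is NOT Nvidia's ACE API or any real library.
# generate() stands in for whatever text-generation model a game would
# actually call, and the character sheet below is invented for the example.

def generate(prompt: str) -> str:
    """Placeholder for a call to a large language model."""
    raise NotImplementedError("Wire up a real LLM client here.")


NPC_BACKSTORY = (
    "You are a shopkeeper in a rain-soaked cyberpunk city. "
    "Crime in the district is hurting your business."
)
NPC_MISSION = (
    "If the player offers to help, send them to confront the local crime boss."
)


def npc_reply(player_utterance: str, history: list[str]) -> str:
    """Combine the fixed character sheet with the running conversation,
    then let the model improvise the character's next line."""
    prompt = "\n".join([
        NPC_BACKSTORY,
        NPC_MISSION,
        "Conversation so far:",
        *history,
        f"Player: {player_utterance}",
        "Shopkeeper:",
    ])
    reply = generate(prompt)
    history.append(f"Player: {player_utterance}")
    history.append(f"Shopkeeper: {reply}")
    return reply
```

The point is simply that the “backstory” and “mission” the demo describes are fixed text prepended to whatever the player says; everything else is filled in by the model on the fly.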

Now, there are two very obvious issues to mention straight away, the first being how awful and flat the character’s performance is in this clip. But remember, this is the first iteration of this tech, and then put it in the context of how, until about ten minutes ago, computer-generated voices all sounded like Stephen Hawking. This’ll advance fast, as AI models better learn to simulate the finer nuances of human speech.

The second issue is that absolutely no one playing a game like this would stick to the script as happens in this clip. In fact, the first thing just about everyone would say to such an NPC would be something about fucking. For reference, see all text adventure players ever in the early 1980s. That’s going to be the more difficult aspect for games to overcome.

Screenshot: Nvidia / YouTube / Kotaku

Of course, the application of the tech is going to be viewed as far less important in the face of just how many jobs ACE is looking to replace. Huang nonchalantly mentions that the AI is not only providing the words and voice, but doing the animation too. And this comes in the wake of his explaining how AI is being used to generate the lighting in the scene, and indeed to improve the performance of the graphics technology that’s rendering it all.

There’s no version of reality where this doesn’t see a huge number of people in games development losing jobs, albeit most likely those who haven’t gotten said jobs yet. Why hire new animators for your project when the AI will do it for you, backed up by the dwindling team you’ve already got? Who’s going to look for new lighting experts when there’s a lighting expert living inside your software? Let alone the writers who generate all the dialogue you currently skip past.

And this isn’t futuristic stuff to concern ourselves with somewhere down the line: it already exists, and it’s going to be appearing in games that release this year. With the announcement of ACE, this is all going to happen a lot faster than perhaps anyone was expecting.

For game studios, this is great news! The potential for such technology is incredible. Games that are currently only achievable by teams of hundreds will become realistically achievable by teams of tens, even by individuals. We, as players, will soon be playing games where we can genuinely roleplay and talk directly to in-game characters in ways the likes of Douglas Adams fantasized about and failed to achieve forty years ago.

But when it comes to specialist jobs in the industry, it’s going to be carnage. And this will happen, as certainly as automated textile equipment makes all our clothes.


John Walker
