Nvidia Shows Dynamic NPCs

Manuel Olumorin
May 31, 2023
GM GM. It’s Lootbox, your very own sidekick, one you may not need for much longer at the rate AI is going.
Today’s gems
Nvidia shows off dynamic NPCs
Building a gaming MVP with AI
The Sneakies - Apple joining the metaverse race?
Nvidia Shows Off Future Of Gaming
Nvidia, the latest Wall Street darling, just blew the roof off with their new demo of what the future of gaming could look like.
And it’s insane. Here's the demo…
So what’s this black magic?
The tech they are using is called Avatar Cloud Engine (ACE), a generative AI engine for game development.
What we know about ACE
The big idea here is making non-player characters (NPCs) smarter with AI-powered conversations. No more robotic NPCs.
To make this happen, Nvidia incorporated NeMo, Riva, and Omniverse Audio2Face.
NeMo pulls the strings behind NPC backstories and personalities, allowing you to customize them. Think of it like a real-time scriptwriter for our NPC friends.
Riva makes live speech conversations with NPCs possible.
Omniverse Audio2Face generates facial animations from audio inputs, making the NPCs more lifelike.
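Here’s a rough sketch of how those three pieces could chain together in a single conversation turn. To be clear, this is not Nvidia’s actual API: every function below is a hypothetical stub we made up to mark where each component would slot in.

# Hypothetical sketch of one ACE-style conversation turn.
# None of these functions are real Nvidia APIs; each stub
# stands in for the role one component plays.

def transcribe(audio: bytes) -> str:
    # Riva's role on the input side: speech-to-text.
    return "hey jin, how's the ramen business?"

def generate_reply(persona: str, player_text: str) -> str:
    # NeMo's role: an in-character reply, conditioned on the
    # NPC's backstory and personality.
    return f"[{persona}] Business is good, but this crime wave has me worried."

def synthesize_speech(text: str) -> bytes:
    # Riva's role on the output side: text-to-speech.
    return text.encode("utf-8")

def drive_face(audio: bytes) -> None:
    # Audio2Face's role: facial animation generated from the audio.
    print(f"animating {len(audio)} bytes of NPC speech")

def npc_turn(player_audio: bytes, persona: str) -> bytes:
    player_text = transcribe(player_audio)
    reply_text = generate_reply(persona, player_text)
    reply_audio = synthesize_speech(reply_text)
    drive_face(reply_audio)
    return reply_audio

npc_turn(b"<mic input>", persona="Jin, a ramen-shop owner")

The point is the shape of the loop: speech in, language model in the middle, speech and animation out, all fast enough to feel like conversation.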
This demo, called Kairos, was built in partnership with Convai, a startup that specializes in automated intelligent conversation systems for virtual characters.
ACE doesn’t look like it’s out yet, but you can subscribe to updates.
Lootbringer’s take
This development affects the metaverse and gaming in some major ways. Here’s how we see it playing out if the tech delivers.
The metaverse has been dealing with an emptiness problem, aka the cold start problem. Tech like this could help solve it by enabling human-feeling NPCs that are hard to distinguish from real players.
Game development just got a lot cheaper, and the bottleneck for creating highly engaging experiences shifts from development to storytelling.
We may see more infinite games like EVE Online, but without relying solely on human interaction. NPCs will be able to actively influence the direction and history of a game.
Building A Game With AI
This builder created a 2.5D point-and-click style game prototype in 18 hours.
I used the new @Photoshop beta generative fill and @Blender to blend together multiple @midjourney generated #AIArtworks to create a 2.5D location for a point and click adventure game. 🧵 #buildinpublic #gamedev
1/8
— Jussi Kemppainen (@JussiKemppainen)
May 30, 2023
Here’s how they did it
The prototype started with Midjourney to whip up some character concepts, then Blender to turn them into 3D models.
The scene was likewise generated with Midjourney, then edited in the Photoshop beta to extend and colour-grade the images.
Once the character and scene images were set, they used Blender and fSpy to reverse-engineer the camera data from the 2D image and create a 3D version of the location.
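A side note on what fSpy is doing under the hood: camera matching of this kind boils down to classic projective geometry, where marking the vanishing points of two perpendicular directions in the image is enough to recover the focal length. Here’s a minimal sketch of that core calculation (our illustration, not fSpy’s actual code), assuming an undistorted image with the principal point at its centre.

import math

def focal_length_px(v1, v2, principal_point):
    # For the vanishing points v1, v2 of two orthogonal directions,
    # a pinhole camera satisfies (v1 - p) . (v2 - p) = -f^2,
    # where p is the principal point and f the focal length in pixels.
    d1 = (v1[0] - principal_point[0], v1[1] - principal_point[1])
    d2 = (v2[0] - principal_point[0], v2[1] - principal_point[1])
    dot = d1[0] * d2[0] + d1[1] * d2[1]
    if dot >= 0:
        raise ValueError("vanishing points can't come from orthogonal directions")
    return math.sqrt(-dot)

# Made-up vanishing points picked off a 1920x1080 image.
f = focal_length_px((2400.0, 560.0), (-300.0, 500.0), (960.0, 540.0))
print(f"estimated focal length: {f:.0f} px")  # ~1347 px

With the focal length and camera orientation recovered, the flat Midjourney image can be projected onto simple 3D geometry that lines up with it, which is what makes the 2.5D effect work.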
Then they used Unity to put all the pieces together: a simple animator for the character's movements, the 3D location scaled to match the character, and lights set up to match the scene lighting.
You can get more details on how it was done from the devblog.
The Sneakies
Apple may be joining the metaverse race with its WWDC announcement themed “Code new worlds.” Even the image hints at the release of the rumoured VR headset. 👀

Telecom giant T-Mobile becomes a validator on the Polygon network, helping to secure both the Polygon PoS sidechain and Supernets.
Avalanche continues its growth with two new marketplaces coming soon: Superchief Gallery NFT and Zeroone.
Tweet of the Day
The next Axie, is Axie
— The Jiho.eth (@Jihoz_Axie)
May 30, 2023
Love what you just read?
Forward it to a friend or share it using the social media buttons at the top of this email.
Follow us on Twitter via the bird icon in the footer.
What do you think of today's edition?