This article is part of a VB special issue. Read the full series here: Data centers in 2023: How to do more with less.

The metaverse was once pure science fiction, an idea of a sprawling online universe born 30 years ago in Neal Stephenson’s Snow Crash novel. But now it has been reborn as a realistic destination for many industries. So I asked a number of people how the metaverse will change data centers in the future.

First, it helps to reach an understanding of what the metaverse will be. Some see the metaverse as the next version of the internet, or the spatial web, or the 3D web, with a 3D animated foundation that resembles sci-fi movies like Steven Spielberg’s Ready Player One.

In the last few years, the metaverse went through a hype cycle, spurred by Mark Zuckerberg’s decision to rename Facebook as Meta in a bid to make virtual reality and mixed reality headsets into the windows of the metaverse. Others see it extending far beyond that to smartphones, PCs and just about any gadget.

Games such as World of Warcraft and virtual worlds like Second Life took early steps in the direction of the metaverse, while others say that Fortnite, Minecraft and Roblox are its true forerunners, with their emphasis on user-generated content and daily online multiplayer gaming. And still others believe it will be something much bigger than that.


Matthew Ball, author of The Metaverse: And How It Will Revolutionize Everything, refers to it as a persistent and interconnected network of 3D virtual worlds that will eventually serve as the gateway to most online experiences, and also underpin much of the physical world.

“When they’re having discussions about the metaverse, people always focus on upper levels of the stack,” said Rev Lebaredian, VP of Omniverse and Simulation at Nvidia, in an interview with VentureBeat. “There has been almost no discussion about what the infrastructure underneath is, and we care a lot about that. We’re building that stuff.”

More precisely, he said it is a “massively scaled and interoperable network of real-time rendered 3D virtual worlds that can be experienced synchronously and persistently by an effectively unlimited number of users with an individual sense of presence, and with continuity of data, such as identity, history, entitlements, objects, communications and payments.”

I think of it as a real-time internet where we can have experiences like the Star Trek holodeck, immersing ourselves in worlds that can be indistinguishable from reality. And you should be able to use this metaverse to switch from one such world to another instantaneously.

When you get to that kind of definition of the metaverse, it’s clear that it isn’t here yet. And the question becomes: How long will it take to get there, and will technologists ever be able to build it? Raja Koduri, the chief architect at Intel, predicted in 2021 that the metaverse will need 1,000 times more computing power than was available back then. That’s obviously going to take a while.

Lower down the stack

How Roblox sees the tech stack changing for the metaverse

Lebaredian said that there has been a lot of discussion about the applications that everyone wants to see for the metaverse. But he noted there has been very little discussion of the technology lower down in the stack for the data centers. He wants to see more attention to what it will take to really build the metaverse and the data centers that support it.

He said, “I believe that the infrastructure that we need to build out is going to be different for the metaverse as a whole compared to the data centers we have today. There are going to be differences even within these two classes of the metaverse, the consumer one and the industrial metaverse.”

When I asked Jon Peddie, president of Jon Peddie Research and author of the three-book series The History of the GPU, he answered with a question of his own: What is the metaverse? He believes it will be a single entity that ties together all of the networks, something like today’s internet but evolved. And where it actually resides, he said, is a good question.

“Should a metaverse ever be created, where will it be? It will be everywhere and nowhere,” Peddie said in an email to VentureBeat. “It won’t have a central location at NSA’s headquarters or Google. One of the basic tenets of a metaverse is that all tokens from any subverse will be frictionlessly exchangeable with any other subverse via the metaverse.”

He added, “If I buy a dress in Ubisoft’s subverse and sell it in Nvidia’s Omniverse, the tokens exchanged (Bitcoin or Euros) will flow to my digital wallet without me having to do anything more than let the computer look at my beautiful blue eyes. I may or may not be wearing a suffocating VR headset, and I may or may not be almost fainting at the enthralling aspects of blockchain transactions on Web3 or Web4, it will just happen — that is a metaverse, and for that to work, it has to be on every machine, much like a browser is on your laptop, tablet, TV, smartphone and car. So don’t look for a zip code for the metaverse.”

Omid Rahmat, an analyst who works with Peddie and a speaker at our GamesBeat Summit 2023 event, noted that digital twins, which are digital copies of real-world things such as a BMW factory built in order to simulate them, could be a big part of the metaverse.

Under that notion, he said, “the metaverse is the total sum of digital data that mirrors the real world as well as extending into new ones that connect to this one.” He also believes that the revolution in generative AI will lead to a vast expansion in conversational man-machine interfaces.

Rahmat thinks that “younger generations are going to be happy to move to conversational man-machine interfaces because they’re probably going to be fed up [with] getting neck-aches constantly looking down at their phones.”

He said these younger generations are going to then be more amenable to the use of heads-up displays because they are mobile, can manage their digital environments with conversational AI, and will probably demand mixed-reality experiences.

“All of these assumptions will extend the amount of data and computational demands placed on data centers because no matter how powerful mobiles become, they will never be powerful enough to handle the vast amounts of computing power needed to support this sea change in user behavior,” Rahmat said.

“Into this mix, you add the vast amount of data that is going to be used when we move to the internet of sensors, a natural extension of our need to model, simulate, measure and interact with the metaverse to mitigate costly behaviors in the real world,” he added. “Just the sheer volume of data and servers that will be needed to enable a semi-autonomous automotive experience, not even [fully] autonomous, is beyond existing datacenter capacities.”

This isn’t just about fun and avatars, either.

“The companies that want to control the metaverse, the big tech giants that are investing in all of the above, will need to own vast amounts of data and will have to devote ever more resources to adapt that information into viable products for businesses and consumers. All of this is happening, today, but we’ve only scratched the surface of demand,” Rahmat said.

He believes the metaverse is ultimately going to be a reinvention of our shared reality, a way to create a digital transformation of real-world interactions. That means we are going to need to create infrastructure for a 10 billion-user client-server model by 2050. We are nowhere near having the resources in place to support that kind of expansion, and are only at the beginning of the road to finding more energy-efficient, recyclable approaches to building out the infrastructure, he said.

These are different strategic views of the metaverse that I’ve come across, and there are counter-arguments being made by folks who want the metaverse to be open and decentralized. We’ll see what others foresee as well over the course of this story.

Roblox’s view of data centers of the future

Roblox shows how text prompts can change a car instantly.

Roblox prefers to run its own data centers, with only a small amount [of data] handled by outsiders, said Dan Sturman, CTO of the company, which has 67 million daily active users. With its focus on user-generated content, Roblox is one of the leading metaverse companies today.

That allows the company to save money and give more money back to creators. For the metaverse, Sturman sees changes ahead for data centers.

To enable that, the company has to create custom solutions that deliver the functions that Roblox and its developers need. It also has to deploy data centers around the world while respecting local networking requirements and national restrictions on storing private user data. Roblox keeps its data in its core data centers and is also pushing a lot of processing out to the edge of the network.

Gamers with powerful computers can handle much of the processing required to run Roblox on their own machines at the edge of the network, while those with older computers do less of that work locally. In that sense, Roblox takes advantage of infrastructure at the edge.

“Pushing compute to the edge is even more important. The frames per second for interactivity is just a whole other level compared to what we’ve had on traditional web apps,” Sturman said. “If you start looking at it, we want to do at least 30 frames a second. Interactivity is really important. We share the load with the client devices.”

By contrast, with virtual reality, much of the computing load is handled by a standalone VR device.

“One thing we’re learning is we need to be ready to kind of shift [computing] work based on the client device we’re talking to,” Sturman said. “That’s the direction we’re heading. And I think it’s important. So that takes me into GPUs because with most devices out there today, most graphics can be done on the end device.”
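To make that shifting concrete, here is a minimal, hypothetical sketch in Python of capability-based work placement. The profile fields, thresholds and workload names are invented for illustration and are not Roblox’s actual system; the 30 frames-per-second target is the only figure taken from Sturman’s comments.

```python
# Hypothetical sketch of capability-based work placement, not Roblox's real code.
# Idea: profile the client device, then decide which workloads run at the edge
# (on the player's machine) and which run in the core data center, so that the
# 30-frames-per-second interactivity target cited above can still be met.

from dataclasses import dataclass


@dataclass
class ClientProfile:
    gpu_score: float       # relative GPU benchmark score (assumed 0.0 to 1.0)
    battery_powered: bool  # phones and standalone VR headsets are constrained


def place_work(profile: ClientProfile) -> dict:
    """Return a rough split of workloads between the client and the data center."""
    plan = {
        "rendering": "client",        # most graphics can run on the end device
        "physics": "client",
        "voice_processing": "cloud",  # voice runs on GPUs in the cloud
        "ml_inference": "cloud",
    }
    # Weaker or battery-constrained devices push more work back to the core.
    if profile.gpu_score < 0.3 or profile.battery_powered:
        plan["physics"] = "cloud"
    if profile.gpu_score < 0.15:
        plan["rendering"] = "cloud_stream"  # fall back to streaming rendered frames
    return plan


print(place_work(ClientProfile(gpu_score=0.1, battery_powered=True)))
```

The design choice Sturman describes is exactly this kind of per-device decision: the weaker the client, the more the data center has to carry.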

For voice processing, Roblox does a lot of that in the cloud using GPUs. Roblox is also doing a lot more machine learning inference processing with more algorithms running. Some of that happens on the CPU, but GPUs are also likely to be used in the data center for that purpose.

“Interpreting voice into our facial expressions is something we want to do at the edge, not at the core data center,” Sturman said. “We want to take what you’re saying and put that on your avatar. So your lips move accordingly. I think all good data center design comes down to total cost of ownership. What is my workload? And how do I assemble the tech available to execute that workload efficiently as possible?”

Roblox is also exploring large experiences, like rock concerts with tens of thousands of people. Those could benefit from advances in networking technology like Nvidia’s Mellanox technology. Sturman thinks it would be “incredible” to do a 50,000-person concert. But that’s likely to require changes in both software and hardware architecture, he said. It’s hard to imagine that networking between servers will ever be faster than the memory bus within a server, he said. But it’s worth looking at.

Sturman said his company uses the Lua programming language because it makes it easy to run an app on any device, and the app has to run anywhere in the world. Making that happen, and building the data centers for all of it, takes a lot of focus on the game engine, the data centers and the infrastructure that supports them.

“It doesn’t just happen by itself,” Sturman said.

Generative AI will be a revolution for many industries, and in gaming it will lead to better user-generated content. Creators will be able to craft things much faster and with less help. Roblox has already launched a generative AI coding-assist feature on its platform. Over time, that could lead to a lot more user-generated content and, as a result, the need for more data center infrastructure.

Pushing the problem to the cloud

Matthew Ball explains the significance of the metaverse.

Lisa Orlandi, CEO of 8agora, said in a message to VentureBeat that we’ll see an early push of metaverse processing and applications into the cloud.

“If you look at metaverse companies today, the heavy compute requirements and rendering are downloaded onto the user device, but this model does not scale well to billions of people across the globe,” Orlandi said. “This will need to be pushed to a multicloud infrastructure (similar to what Amazon or Netflix are doing to stream in the cloud). This also means that data centers will need to ramp up their compute capacity and continue to build out their infrastructure to support the high-speed, high-compute requirements.”

But she noted it will be a challenge to do this kind of processing in the cloud sustainably, as both the power consumption of these compute-intensive environments and the bandwidth requirements to the user will increase significantly.

“Even when you look at Nvidia’s streaming, where they render in the cloud, the user still needs to download an app (it’s not web-based),” Orlandi said. “They don’t support bidirectional audio and the bandwidth requirements are high. For instance, GeForce Now requires at least 15Mbps for 720p at 60FPS.”

The high density of avatars in Yuga Labs’ 2nd Trip in Otherside.

That goes up to 25Mbps for 1080p at 60FPS and 35Mbps for streaming up to 2560×1440/2560×1600/3480×1800 at 120FPS. These higher bandwidths equate to a higher power requirement and higher cost to the consumer, and will require data centers to increase their capacity exponentially while maintaining sustainability, Orlandi said.
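To put those bitrates in perspective, a quick back-of-the-envelope calculation (mine, not Orlandi’s) shows how much data one continuously streaming user pulls per hour at each tier:

```python
# Rough data volumes per hour of streaming at the GeForce Now bitrates cited above.
# 1 Mbps = 1e6 bits per second; 1 GB = 1e9 bytes.

tiers = {
    "720p at 60FPS": 15,         # Mbps
    "1080p at 60FPS": 25,
    "up to 120FPS (high-res)": 35,
}

for name, mbps in tiers.items():
    gb_per_hour = mbps * 1e6 * 3600 / 8 / 1e9
    print(f"{name}: {mbps} Mbps is roughly {gb_per_hour:.1f} GB per hour per user")

# 720p at 60FPS: 15 Mbps is roughly 6.8 GB per hour per user
# 1080p at 60FPS: 25 Mbps is roughly 11.2 GB per hour per user
# up to 120FPS (high-res): 35 Mbps is roughly 15.8 GB per hour per user
```

Multiplied across millions of concurrent players, that per-user volume is what drives the capacity build-out Orlandi is describing.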

“In Europe, they’ve adopted green energy requirements in their data centers and we believe that this will soon be the case in the U.S. as well,” she added. “This means that new technologies will need to be implemented that can scale to billions of people across the globe. It won’t be just a matter of lowering the component power, but a new strategy to enable an end-to-end multicloud solution that can handle the increase in users, lower their cost and power footprint, and also lower the cost and power footprint for the data centers.”

This is the problem 8agora has focused on: moving the client app to the cloud and integrating it with the streaming app, which allows the use of green-energy data centers and high-quality rendering that can scale across any use case, Orlandi said.

“Multiple sessions can be rendered (20) with a single GPU card rated at 70 watts while encapsulating the audio/video data stream back to the user down to 1Mbps,” Orlandi said. “This allows the optimizations needed by the data centers to build out and scale high-performance environments at low power. Because bandwidth requirements are lowered, this means they can support much higher capacity across a multicloud infrastructure in a sustainable way across billions of people.”
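Taking those figures at face value, a rough illustrative calculation shows what the approach implies at scale. The one-million concurrent-user target below is an assumption made for the sake of the arithmetic, not a number from 8agora:

```python
# Illustrative arithmetic from the figures cited above: 20 concurrent sessions
# per 70-watt GPU, with roughly 1 Mbps streamed back to each user.

sessions_per_gpu = 20
watts_per_gpu = 70
mbps_per_session = 1

concurrent_users = 1_000_000  # hypothetical target, not from the article

gpus_needed = concurrent_users // sessions_per_gpu
power_kw = gpus_needed * watts_per_gpu / 1000
egress_gbps = concurrent_users * mbps_per_session / 1000

print(f"GPUs needed: {gpus_needed:,}")               # 50,000
print(f"GPU power draw: {power_kw:,.0f} kW")         # 3,500 kW (GPUs only, no cooling)
print(f"Aggregate egress: {egress_gbps:,.0f} Gbps")  # 1,000 Gbps
```

The appeal of the low per-session wattage and bitrate is that both totals grow linearly with users rather than exploding, which is the sustainability argument Orlandi is making.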

The industrial metaverse will drive data centers

BMW Group is using Nvidia’s Omniverse to build a digital twin factory that will mirror a real-world place.

The thing about the metaverse, as Peddie noted, is that processing will take place in the cloud for some applications, like real-time games with massive numbers of players. But much of the processing will also take place at the edge, Nvidia’s Lebaredian said. You may need to access the metaverse from your smartphone when you’re at a location where you can capture data about a scene but don’t have access to a supercomputer.

Some people legitimately wonder whether the metaverse craze, which rose out of the pandemic when we were forced to communicate digitally, has waned now that the hype cycle has moved on to AI and people are free to go out in public again.

It’s natural for some of the interest on the consumer side to subside, as mixed reality technology is still a long way from fruition as a consumer product. But Nvidia sees a huge amount of metaverse activity on the industrial and enterprise side, said Lebaredian.

“On the industrial side, the parts we’ve been focused on, the metaverse is alive and kicking. And everybody wants it from all the customers that we’re working with,” Lebaredian said. “The enterprise is more like the lead horse of the metaverse.”

The industrial metaverse is a business-to-business ecosystem. The parts of the metaverse, or virtual environments, that connect back to the real world are where Nvidia is focused. That means things like digital twins, where a company designs a factory and makes it perfect in the digital world before it builds the real thing in the physical world. And it will outfit that physical factory with sensors that can feed data back to the digital twin so the company can have a data loop that improves the design over time.
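As a minimal sketch of that data loop, the toy Python below ingests fake sensor readings, runs a stand-in simulation and nudges a design parameter. Every name and formula here is invented for illustration and bears no relation to Omniverse’s actual APIs.

```python
# Toy digital-twin feedback loop: invented names and formulas, for illustration only.

import random


class DigitalTwin:
    def __init__(self):
        self.design = {"conveyor_speed": 1.0}  # simplified design parameter
        self.telemetry = {}

    def ingest(self, sensor_reading: dict):
        # Real-world sensors feed data back into the twin.
        self.telemetry = sensor_reading

    def simulate(self) -> float:
        # Stand-in simulation: estimate throughput for the current design.
        vibration = self.telemetry.get("vibration", 0.0)
        return self.design["conveyor_speed"] * 100 * (1 - vibration)

    def refine(self, throughput: float):
        # Close the loop: adjust the design when simulated throughput falls short.
        if throughput < 95:
            self.design["conveyor_speed"] *= 1.05


twin = DigitalTwin()
for _ in range(3):  # stand-in for a continuous feedback loop in production
    twin.ingest({"vibration": random.uniform(0.0, 0.1)})  # fake sensor data
    twin.refine(twin.simulate())

print(twin.design)
```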

It follows that enterprise data centers are going to be the ones that will evolve to serve the customers of the metaverse. New technologies like the metaverse will start out expensive — note the $3,300 cost of the Magic Leap 2 mixed reality headset — and only enterprises will be able to afford it.

“When you get to the market, you have to have the perfect confluence of conditions” like low costs and seamless user experiences, Lebaredian said. “In industry, you have less of those constraints. If you simulate things that help you design products in the metaverse for things that cost billions of dollars and then can save you millions of dollars, then your price sensitivity is different. We are building systems that let you scale at high fidelity with extreme scale.”

The result is likely to be that data centers will adapt to meet the needs of enterprise metaverses first, Lebaredian said. We will likely see an explosion of technologies to serve the metaverse and its infrastructure, just like search engines and accompanying businesses like Akamai served the needs of the fledgling internet.

“Once people figured out a business model around the internet, then that’s what it took to make the internet really grow,” he said. “Then look what happened. Google made search work. They built a business model around it. We’ve seen this before.”

Lebaredian isn’t sure the metaverse term itself will stick. He remembered how Al Gore referred to the internet as the information superhighway, but that buzzword didn’t last. But he thinks the technology itself will absolutely be necessary and useful in the long run.

“Somebody has to keep the ball moving forward, and it makes sense that Nvidia would be one of those companies investing in this particular set of technologies,” Lebaredian said. “We’ve done computer graphics. We’ve done gaming. We continue to do supercomputing, AI — all of this stuff. It all comes together right here in the metaverse.”

The hardware underneath

Nvidia OVX SuperPods can drive the metaverse.

Right now, Nvidia’s lead system for data centers running metaverse applications is OVX, a SuperPod architecture that offers scalable performance for operating real-time simulations and AI-enabled digital twins on a factory, city or planetary scale.

“OVX systems are designed for the industrial metaverse for digital twins and designed to scale into data centers, networking, low latency and high bandwidth,” Lebaredian said. “That is the foundation we are building Omniverse Cloud on. And Microsoft Azure is about to stand up a whole bunch of OVX systems.”

BMW demonstrated how factory planners from around the world can come together in a digital twin — a factory that is ready in the virtual sense now and will be built physically in 2025 — and walk through it together virtually to figure out what is right or wrong about the design. In Japan, this is known as a “gemba walk.” Those people have to see what the others are modifying in real time as they interact with a factory that has something like 20,000 robots. During the walk, they can make agile decisions.

“Getting that factory to simulate in real time, that’s a major challenge,” said Lebaredian. “Gaming systems like a PlayStation can’t do that. But the OVX has GPUs, CPUs and enough memory to handle something like that. The physical factory for BMW will be in Hungary and be miles long. It’s so big the curvature of the earth matters in the design. But it will exist as a simulation in the Omniverse.”

BMW and Nvidia have to make thousands of GPUs available to run that digital-twin simulation. That’s essentially going to be running in a Microsoft Azure datacenter. With this infrastructure in place, an engineer can make a change in one part of the factory and it can immediately be visible to everyone.

The problems that enterprise designers run into — and the need for access to massive amounts of real-time data — have some parallels in the game world. You could make a game that is so realistic that a building can collapse and produce a pile of rubble. That rubble has to be calculated with care since there are so many pieces of data that must be accessed in real time by different players in the game. If one player’s PC at the edge calculates the rubble faster than another’s, the rubble will look different to different players based on how fast their machines are.

That doesn’t work. But if you put the game in the cloud and do all of the calculations in hardware inside a data center, then the calculations can be done quickly and shared among all of the users whose game data is in the data center itself.

“If we change it up, instead of doing computation at the edge, so that it all happens in the same data center, like with GeForce Now, we can ensure almost zero latency between the players,” Lebaredian said.
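A toy example makes the divergence problem concrete. The snippet below is not any studio’s netcode, just a stand-in physics integrator: two clients stepping the same second of “rubble” physics at different frame rates get slightly different answers, while a single authoritative pass in the data center produces one result that every client receives.

```python
# Toy illustration of client-side divergence versus server-authoritative physics.

def simulate_rubble(steps: int, dt: float) -> float:
    """Stand-in physics: position of one piece of rubble after `steps` updates."""
    pos, vel = 0.0, 1.0
    for _ in range(steps):
        vel -= 9.8 * dt   # gravity
        pos += vel * dt
    return pos

# Two clients integrate the same second of physics at different frame rates...
fast_client = simulate_rubble(steps=120, dt=1 / 120)
slow_client = simulate_rubble(steps=30, dt=1 / 30)
print(fast_client, slow_client)  # slightly different answers, so divergent worlds

# ...whereas one authoritative pass in the data center is computed once and
# broadcast, so every player sees the identical pile of rubble.
server_state = simulate_rubble(steps=120, dt=1 / 120)
clients_see = [server_state, server_state]  # the same bytes for everyone
```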

The sniper and the metaverse

Let’s hope that sniper can’t see that far in Fortnite.

Another problem was summarized as the “sniper and the metaverse.” Most players in a multiplayer game are limited to participating in a single server, with maybe something like a maximum of 100 players in a server. But if you’re a sniper on high ground, you might be able to see beyond the borders of a single server. You might see another soldier that you can snipe a mile or two away. But if the bullet crosses a server border, then it might slow down and the simulation might be out of sync. And so most of the time, the game makers have to limit such games so the sniper can’t see that far.

Inside the data center, one of the key differences compared to the edge is the speed of networking. Nvidia’s Mellanox acquisition gave it the kind of networking used in supercomputers, with bandwidth so high and latency so low that server-to-server communication can be faster than communication within a server.

“When that happens, it becomes blurry what is the actual computer, and there is no limit to the scaling that you can do,” Lebaredian said. “The problem of snipers and distance goes away.”

That is, when the networking is so fast within the data center, the notion of shards — or different servers with distinct borders between them — can go away.
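To illustrate the shard-border problem in miniature, here is a toy sketch. The shard width, bullet speed and handoff mechanics are all invented; the point is only that a sharded design must stop for a handoff exactly where a unified simulation on a fast interconnect would not:

```python
# Toy illustration of shards: the world is split into fixed regions, each owned
# by a different server. An object that crosses a border (the sniper's bullet)
# must be handed off between servers, which adds latency and can desynchronize
# the two simulations. The numbers below are invented for illustration.

SHARD_WIDTH = 1000.0  # meters of world space owned by each server (assumed)

def owning_shard(x: float) -> int:
    return int(x // SHARD_WIDTH)

def advance_bullet(x: float, velocity: float, dt: float):
    """Move the bullet one tick and report whether it crossed a shard border."""
    new_x = x + velocity * dt
    return new_x, owning_shard(new_x) != owning_shard(x)

x = 990.0  # the sniper fires from near the edge of shard 0
for tick in range(5):
    x, handoff = advance_bullet(x, velocity=900.0, dt=0.016)
    if handoff:
        # In a sharded design, this is where serialization, a network hop and a
        # resync happen; in a unified simulation it is just another tick.
        print(f"tick {tick}: bullet crossed a shard border at x = {x:.1f}")
```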

“Supercomputing is all about how fast you can move data between the nodes. Because once you have that interconnect, it’s actually kind of blurry what is the computer. What we’re doing at Nvidia in general is that we are democratizing supercomputing,” Lebaredian said.

The same problem with industrial robots

BMW is building a digital twin of a factory that will open for real in 2025.

It turns out the sniper and the metaverse are not so different from the robot and the metaverse.

“In a factory, there are thousands and thousands of robots, and they all are doing their own thing, and they need to run their simulations locally,” said Lebaredian. “And it needs to be done in one unified space. As far as the infrastructure to do this goes, one of the things we believe is a key technology [is] to allow more players to be in the same space and be more interactive together in a physically consistent way.”

He added, “The big problem you have on the internet is not bandwidth. It’s latency. If I have a 200-millisecond round trip time from where I am to the game server, and yours is only 10 milliseconds round trip time, then the world you and I are experiencing is slightly different. If we are in a shooter, and you’re going to shoot me and I duck behind a wall, then what you see when you are targeting me is not in actual physical time. I’m not behind the wall yet.”

In that situation, the players see different things, and the game usually favors the shooter. But the person who is shot probably thinks there was something wrong with the timing. If the calculation happens inside the data center, via cloud gaming, then the positions of both players can be evaluated at the same time by the same computer, resulting in a better adjudication of whether one player shot the other or whether the other player got behind the wall in time.
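A toy timeline shows why a single authoritative clock helps. The 200-millisecond and 10-millisecond round-trip times come from Lebaredian’s example above; everything else is illustrative:

```python
# Toy model of the shooter/ducker disagreement described above. Each client sees
# the world as it was one network delay ago; the authoritative server compares
# both events on a single clock instead.

shot_fired_at = 1.000          # seconds, on the server's clock
target_ducked_at = 0.950       # the target was already behind the wall

shooter_one_way_delay = 0.100  # half of a 200 ms round trip
target_one_way_delay = 0.005   # half of a 10 ms round trip

# The shooter's screen showed the world one network delay earlier, which is
# before the target ducked, so the shot looks like a clean hit on that client.
world_seen_by_shooter = shot_fired_at - shooter_one_way_delay  # 0.900
shooter_thinks_hit = world_seen_by_shooter < target_ducked_at  # True

# The server hears about both events (each delayed by its own path) but places
# them on one timeline: the duck happened before the shot, so it rules a miss.
duck_reached_server = target_ducked_at + target_one_way_delay  # 0.955
shot_reached_server = shot_fired_at + shooter_one_way_delay    # 1.100
server_rules_hit = shot_fired_at < target_ducked_at            # False

print(shooter_thinks_hit, server_rules_hit)  # True False: the dispute described above
```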

“Once we move all of this computing to the cloud, a lot of those problems become easier,” Lebaredian said. “You can do a lot of computation in a distributed manner, across different nodes and computers. But between those computers, you have enough bandwidth, and the stability of connections and latency is low enough. And the actual players can be in different parts of the world. They’re just further from the computer doing the calculation. They experience the latency of the video and the clicking, but not the massive problem of distributing the simulation across the globe. That’s what we’re essentially doing right now when you play Fortnite.”

Earth 2 and The Lord of the Rings

Nvidia’s big project for the leading edge of supercomputers is Earth 2. Jensen Huang, CEO of Nvidia, said that Earth 2 is a simulation, a digital twin, of the entire planet. Nvidia will simulate all of the weather formations on Earth and make it accurate to a meter level. If it can pull this off, using all the supercomputers of the world, then it could predict climate change for decades to come.

But the only way to tackle that is to break the problem down into local parts, calculating the weather in one region and then passing the impacts along to other regions through fast interconnects such as the Mellanox technology. If the network is slow and becomes a bottleneck, the computing can be subdivided further so that as much of the processing as possible stays within one region.

Supercomputers concentrate on computing the fluid dynamics of water in the clouds and the environment and how they change second to second.

Nvidia’s Earth 2 simulation.

“You spatially subdivide the problem and chop it up so each computing unit can communicate on an island of its own and communicate quickly when it needs to do that,” he said.
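As a minimal sketch of that spatial subdivision, the toy code below splits a one-dimensional diffusion problem, a stand-in for a vastly more complex weather model, across two slabs that exchange only their edge cells each step. It assumes NumPy and is purely illustrative, not anything resembling Earth 2’s actual code:

```python
# Toy domain decomposition: each "node" owns a slab of cells and only exchanges
# its edge (halo) cells with its neighbor, so most communication stays on the
# node's own island.

import numpy as np

def diffusion_step(u: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    """One explicit update of the interior cells; the two end cells stay put."""
    new = u.copy()
    new[1:-1] = u[1:-1] + alpha * (u[2:] - 2 * u[1:-1] + u[:-2])
    return new

# A global field of 16 cells split into two slabs, each holding one halo cell
# borrowed from its neighbor (left owns cells 0-7, right owns cells 8-15).
field = np.linspace(0.0, 1.0, 16)
left, right = field[:9].copy(), field[7:].copy()

for _ in range(10):
    # Halo exchange: each slab sends only its boundary value to its neighbor.
    left[-1], right[0] = right[1], left[-2]
    left, right = diffusion_step(left), diffusion_step(right)

# Drop the halo cells and stitch the owned cells back together for inspection.
global_field = np.concatenate([left[:-1], right[1:]])
print(global_field.round(3))
```

The same pattern, scaled up to three dimensions and spread across thousands of GPUs linked by a fast interconnect, is essentially the subdivision Lebaredian is describing.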

Some of this is not so different from what occurs in a giant Lord of the Rings battle, like the Ride of the Rohirrim in the Battle of the Pelennor Fields in Peter Jackson’s The Lord of the Rings: The Return of the King. If you tried to replicate this battle in 3D in the metaverse, it would be a huge undertaking.

But rather than try to render it all at once in edge-based player machines, you could do it in the cloud and subdivide the problem. A supercomputer could handle a chunk of the battle where the Orcs and Riders are swirling around in combat.

Getting tens of thousands of humans and Orcs into the same scene isn’t so different from getting tens of thousands of robots into an industrial factory, as far as computing problems go, Lebaredian said.

“Instead of calculating molecules in fluid simulation for water, it’s simulating humans and Orcs,” Lebaredian said.

Will the metaverse be decentralized?

Arrivant is unveiling its blockchain gaming franchise Project: Eluüne: StarGarden. The science fiction title is a tribe-based, socially focused creature crafting and battling game.

Not every metaverse application is going to require massive centralized cloud computers. In fact, much of the blockchain gaming world may want the computation to happen in a decentralized way via blockchain and Web3 technologies.

Ultimately, as the metaverse grows, assuming it grows in a decentralized fashion, its dependence on centralized hosting and data centers should decrease as it is pushed across more peer-to-peer networks enabled by blockchain-based ecosystems, said Jason Brink, president of blockchain at game maker Gala Games, in an email to GamesBeat.

“At its core, ‘metaverse’ is indistinguishable from ‘gaming’ in form, if not function. Decentralized hardware, hosting and image processing can revolutionize the gaming industry,” Brink said. “By leveraging the power of distributed networks and edge computing, these technologies can bring about transformative changes in performance, accessibility and innovation within the gaming landscape. This goal of driving innovation and decentralization is central to what we do at Gala Games.”

One of the most significant advantages of decentralized hardware and hosting is the enhanced performance it offers for both gaming and metaverse applications, he said.

Traditional gaming relies heavily on centralized servers and high-powered personal computers or consoles. However, decentralized systems utilize a vast network of devices, distributing the computational load across multiple nodes, Brink said.

The nodes of a computing network could be run by external companies that are incentivized to supply the computing power for the decentralized network, said Robby Yung, CEO of Animoca Brands, in an email to GamesBeat.

“This broader distribution enables superior processing power, resulting in faster load times and smoother gameplay experiences, even for the most graphically-intensive titles,” Brink said. “Furthermore, the inherently scalable nature of decentralized networks allows for accommodating ever-evolving technological demands.”

Amy LaMeyer, managing director of the WXR Fund, said in an email to GamesBeat, “With the amount of data needed for high-resolution, real-time multiplayer games, I think caching content near the end user will continue to be necessary, as well as more optimized streaming capabilities and edge computing. I expect the visibility on eco-computing to rise with data centers looking at more climate-friendly ways to compute.”

Another critical aspect of decentralization is its potential to improve accessibility to the metaverse. By shifting the computational burden from individual devices to a distributed network, players with lower-end hardware or mobile devices can also enjoy high-quality gaming experiences. This democratization of access removes barriers to entry and encourages a more diverse and inclusive community of users and players. Of everything that the decentralized future of the metaverse brings, this is probably the most important from both a social and a business perspective, Brink said.

Not every person in Web3 expects decentralization to be so sweeping. Ryan Mullins, CEO of Aglet, believes that the metaverse “would require even more expansive data center growth and strategic proximity to end users.”

Bitcoin and Ethereum took ideological jabs and uppercuts for the amount of energy consumed by the intensity of proof-of-work.

“My hypothesis would be that, should the metaverse emerge as anything more than a pandemic trend, something similar will happen to the metaverse,” Mullins said. “For example, Meta’s proposed data center in the Netherlands (to serve ‘the metaverse’ to European users) would consume half as much energy (1,380 gigawatt-hours per year) as all other data centers in the Netherlands combined.”

Mullins believes the metaverse will be accessed by mobile people using mobile devices, which would be a lot more energy-efficient than jacking into centralized data centers. Of course, mobile devices rely on plenty of data center technology as well, but Mullins thinks we’re going to need more data centers and more ways to enable computing and movement at the edge.

As for blockchain, Roblox’s Sturman said he isn’t saying no yet, but he is waiting for someone to show him the user benefits of blockchain technology, beyond perhaps coupling the purchase of virtual Nike shoes with the purchase of Nike shoes in the real world. Sturman said Roblox’s North Star is that the game platforms of the future will focus on user-generated content.

“It is very important to have as many options open to allow ideas, flowers to bloom that you never expected,” he said.

As for the infrastructure for that, Sturman said, “I have to be ready for anything. And I always want to make it as easy for [users] as possible to do what they want to do.”

Everything Everywhere All at Once

Everything Everywhere All at Once is about the multiverse.

All of this means that the technology used to build the metaverse will likely involve the cloud, and much of it will be shared across enterprises and game worlds. It will be a new kind of distributed computing, and the configurations of data centers will depend heavily on the kind of processing their applications require. We don’t know exactly what architecture will evolve to meet the demands of the metaverse, but we know it will be different from today’s.

At a time when Moore’s Law has stalled and data centers consume a great deal of the planet’s energy, there are huge problems to solve with the metaverse. Let’s hope these problems will be solved.

As Lebaredian said, if networking technology comes along that can make server-to-server communication faster than communication within servers, we can solve a lot of the problems of huge battles in a tight space.

Then it becomes easier to do massive real-time simulations of games that involve a lot of people and a lot of territories all at once, regardless of whether there are snipers trying to draw a bead on one individual within a huge crowd. This definitely makes me think of the film Everything Everywhere All at Once.

If you’re Nvidia, you see the solution as bigger GPUs and more memory. But companies like Intel could see the solution in a much different way.

“With 600 million daily players, we’re already in the metaverse in some form every day,” Lebaredian said. “That’s a lot of people. If the metaverse has to be one planet, where all seven billion people on Earth can go crowded into a stadium together, I guess that’s one definition of it. I’m not sure how useful that would be. But every day, there are people having multiplayer experiences in virtual worlds that are meaningful to them. It’s just going to get bigger, more complex, and have more players in the same place at the same time.”


Dean Takahashi

