Intel Says Metaverse Processing ‘NOT ENOUGH TO ENABLE THIS VISION’


For many of us, the metaverse remains a mysterious, somewhat unimaginable labyrinth. We aren’t comfortable pretending we understand what it is, but we know it is coming. For others, it is already here, and those people are already experiencing it.

As the days, weeks, and months pass, the metaverse will entice more and more people to give it a go. People will begin coexisting as virtual parallels of themselves. They may just browse the world, or meet up with long-distance friends and relatives. Maybe they’ll take their bed and breakfast virtual.

This is all going to happen whether we admit it or like it or not. And one chipmaker is finally stating the obvious concern out loud: do we have enough computing power for all this metaversing?

Intel was clear that it supports the idea and the limitless possibilities offered by an incoming metaverse, but it questions how all of that data and graphics will be processed on the hardware side.

“The metaverse may be the next major platform in computing after the world wide web and mobile,” Raja Koduri, a senior vice president and head of Intel’s Accelerated Computing Systems and Graphics Group, said in a release.

In essence, Koduri believes that the metaverse could stall out for lack of computing capacity.

“our computing, storage and networking infrastructure today is simply not enough to enable this vision,” he writes.


We live in an age where people expect anything and everything virtual to work flawlessly. Ultra-high-end gaming, streaming TV, and in-car electronics have conditioned us to believe there are no limits to our virtual capacities.

But there are, in fact, limits. And the metaverse intends to push us to all of them in short order, because more people will flock to the metaverse over time than today's computing can handle. The more people open stores, buy land, and attend events, the more processing is required, and the faster we approach unsustainable capacity.

When we think of what the metaverse will look like from a technical perspective, we think of Oculus headsets and cryptocurrency. But in reality, metaverse structure comes down to end-to-end computing. Vast amounts of graphics and sound will need to be processed on someone’s home device. Beyond that, advanced servers will need to connect our world to a metaverse world, and that will require enormous processing power. And that is the biggest current issue: servers capable of handling that mass capacity don’t yet exist. Sure, a handful of people at a virtual venue is fine, but what happens when thousands or millions want to attend?

The metaverse will connect the world, so postulating millions of people in a single metaverse environment is not hyperbole by any stretch. In fact, it is likely. And somewhere, servers will need to bear the brunt of all that load.

And this comes down to devices, as well, as Koduri notes.

“Consider what is required to put two individuals in a social setting in an entirely virtual environment: convincing and detailed avatars with realistic clothing, hair and skin tones – all rendered in real time and based on sensor data capturing real world 3D objects, gestures, audio and much more; data transfer at super high bandwidths and extremely low latencies; and a persistent model of the environment, which may contain both real and simulated elements.”