Intel Jumps Head-First Into the Metaverse

Intel has finally spoken publicly about the Metaverse, the latest push from Meta (previously known as Facebook) toward a digitally interconnected world. As the world’s largest semiconductor manufacturer, Intel is a natural player in this science-fiction-meets-reality story. Its technological solutions are part of the backbone of computing itself, whether local (in your PC) or distributed (in the cloud). However, Intel seems to think its greatest efforts toward enabling and supporting the vision for an eventual Metaverse space won’t come from the hardware realm. Instead, Intel is focusing on the elegant hell of software.

The basic gist is this: Intel has taken giant steps since the introduction of the semiconductor in increasing available computing performance. These chip-level advances have been matched by parallel advances in cloud infrastructure, which can now stream interactive experiences (i.e., games) to low-power local devices. But even those devices are now powerful enough to drive their own experiences: the CPUs in our mobile phones outperform the ones employed in the Xbox 360 and PS3 consoles, and GPUs are heading in the same direction.

In an interview with Quartz, Raja Koduri, head of Intel’s accelerated computing systems and graphics group, said that “the [personal computers] are getting better, the phone is amazing these days, you’ve got a two-teraflop GPU in the phone… and then you have cloud. There’s lots of progress made, but it is not enough.” While it may sound impractical today, companies are working on bringing enough graphics performance to phones to support ray tracing, and it will come in time.

These devices and others with embedded processing power are mostly left idle. The question then becomes: What if we could build an infrastructure that would allow for available computing resources in a network to be pooled together irrespective of manufacturer, and get them seamlessly and transparently working for the same goal? Koduri seems to think that this is an essential element in enabling a true Metaverse experience.

“One foundational thing we always knew is that for what we imagined in Snow Crash, what we imagined in Ready Player One, for those experiences to be delivered, the computational infrastructure that is needed is 1000 times more than what we currently have.”

As reported by Reuters, Intel is currently developing a software solution that would enable computing resources to be pooled together according to usage requirements. Of course, that resource pooling across networks, and across vast stretches of physical space between your home and Microsoft’s Xbox Cloud render farms, for example, requires many pieces to work in lockstep. Imagine powering up your laptop in your bedroom, starting a game, and your system automatically powers up other devices on your network, like a game console or a PC packing one of the best graphics cards.
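To make the idea concrete, here is a minimal sketch of what pooling idle devices on a local network might look like conceptually. Everything here is hypothetical: the device names, the greedy dispatch strategy, and the TFLOPS figures (apart from the two-teraflop phone GPU Koduri mentions) are illustrative assumptions, not Intel’s actual design.

```python
from dataclasses import dataclass

# Hypothetical illustration of network-wide compute pooling. None of
# these names or APIs come from Intel; they only sketch the idea of
# ranking idle devices and dispatching work to the most capable ones.

@dataclass
class Device:
    name: str       # e.g. "phone", "laptop", "game console"
    tflops: float   # advertised peak compute
    idle: bool      # is the device currently available to the pool?

def pool_capacity(devices):
    """Total TFLOPS contributed by idle devices on the network."""
    return sum(d.tflops for d in devices if d.idle)

def dispatch(devices, task_tflops):
    """Greedily assign a task to idle devices, most capable first.
    Returns the list of devices used, or None if the pool is too small."""
    assigned = []
    remaining = task_tflops
    for d in sorted(devices, key=lambda d: d.tflops, reverse=True):
        if d.idle and remaining > 0:
            assigned.append(d.name)
            remaining -= d.tflops
    return assigned if remaining <= 0 else None

devices = [
    Device("phone", 2.0, True),          # the "two-teraflop GPU" Koduri cites
    Device("laptop", 5.0, True),
    Device("game console", 12.0, True),
    Device("desktop GPU", 30.0, False),  # busy, excluded from the pool
]

print(pool_capacity(devices))      # 19.0 TFLOPS available
print(dispatch(devices, 15.0))     # ['game console', 'laptop']
```

The hard part, of course, is everything this toy omits: discovering devices across vendors, moving data between them, and keeping latency low enough for real-time rendering, which is exactly the orchestration problem Koduri describes next.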

“The compute that you need to render a photo-realistic you of me or your environment needs to be continued anywhere,” added Raja Koduri. “That means that your PCs, your phones, your edge networks, your cell stations that have some compute, and your cloud computing needs to be kind of working in conjunction like an orchestra—between all of these three elements that deliver that kind of beautiful metaverse. It’ll take time.”

Just how much time isn’t exactly clear, but Intel is already working on the problem. And while software will undoubtedly play a big role, don’t count out the hardware aspect. Something tells me we’re not going to be joining the Metaverse on current-generation smartphones and PCs, no matter how bullish Meta and Intel might seem.
