It’s approaching 20 years since the release of Second Life, an early stab at an immersive multiplayer universe from Linden Lab, in which people began living and working—as well as making plenty of cash along the way. Two decades on, the promise first hinted at in Second Life is nearing reality, as the persistent digital world of the metaverse starts to make inroads into the mainstream.

The breathless coverage and endless hype of the metaverse would have the average person convinced that they’ll need to start planning for a life permanently attached to a VR headset.

A billion of us are due to enter the metaverse by the end of the decade, if Mark Zuckerberg has his way, while investment bank Citi says the metaverse could underpin an economy worth anywhere from $8 trillion to $13 trillion by the same date. It's eye-popping figures like these that have attracted more than $177 billion in metaverse investment since the start of 2021, according to McKinsey.


There’s just one problem: the graphics of the platforms heralded as the forefront of that future look about the same as—if not worse than—the 20-year-old Second Life.

When Meta announced the launch of its metaverse platform Horizon Worlds in France and Spain this week, it was greeted with widespread mockery. The brunt of the criticism was borne by CEO Mark Zuckerberg’s “dead-eyed,” legless cartoon avatar, forcing a hasty redesign.

Meta hastily rolled out an updated Mark Zuckerberg metaverse avatar. Image: Meta

It’s not just big tech’s legacy players that are afflicted. Web3 metaverse platforms like Decentraland have come in for criticism for their graphical stylings, too.

Decentraland's "relentlessly flat" terrain. Image: Decentraland

Decrypt’s own review of Decentraland took aim at its “relentlessly flat” terrain and pop-in. “Even on the highest settings,” said our reviewer, “it’s too limited graphically to be a particularly engrossing virtual reality experience.” CryptoVoxels and The Sandbox fare little better: both are rendered in blocky, cartoonish visuals reminiscent of a 2000s-vintage game.

Big empty

It all raises the question: why are the graphics so terrible in the metaverse?


There are plenty of possible reasons, and different platforms offer different explanations depending on the level of graphical fidelity they deliver.

One major issue metaverse platforms currently face is that rendering graphics in real time takes a lot of processing power—and superfast internet speeds that aren’t always available to users. Graphics cards and broadband connection speeds limit how much detail a metaverse can present, so platforms often fall back on broader-brush visuals instead.

The Sandbox. Image: Decrypt

Metaverse platforms often have worse graphics than MMO games because they are, by design, far more open-ended. Rather than confining players to a pre-programmed set of actions, as games do, the metaverse theoretically allows an infinite number of possibilities that can’t be pre-rendered and called up when needed.

There’s also the suggestion that a totally cartoonish metaverse is better than the alternative: a mostly lifelike environment with a few fatal flaws.

The concept of the uncanny valley, where graphics are almost lifelike but one flaw unnerves users, is already familiar from video games. And in an environment where everything is rendered in real time and users have almost limitless options, there are simply too many variables that could go wrong and push people into the uncanny valley.

A problem with legs

The issue is particularly vexed when it comes to legs.

For metaverses built around virtual reality interfaces, legs are “super hard and basically not workable just from a physics standpoint with existing headsets,” Andrew Bosworth, Meta’s then-vice president of Reality Labs and now its chief technology officer, told CNN Business in February.

“It’s a hardware problem,” says Gijs Den Butter of SenseGlove, a Dutch company that develops haptic feedback gloves and devices that will be a major part of the metaverse—should we eventually fully inhabit it. “Manufacturers on this occasion have a headset, which has controllers or hand tracking, and that’s what our computer is for the metaverse,” he says. “In the current state, it doesn’t have legs, because the hardware can see your hands and maybe your arms, and track that, but when you look forward, you can’t see your legs.”


That’s difficult because the body-tracking algorithms that work out where you’re pointing within the metaverse require input from body parts the hardware can see—and as anyone standing up straight and looking directly ahead knows, you can’t see your own legs. So the computers trying to render the digital equivalent of your body in the metaverse simply leave the legs out.

That’s less of a problem for crypto-based metaverses like Decentraland and The Sandbox, which mostly rely on browser- or desktop-based interfaces rather than fully immersive VR—for the time being.

“It’s really Facebook/Meta and Microsoft—these immersive platforms,” that don't have avatars with legs, says Weronika Marciniak, a Hong Kong-based metaverse architect at Future Is Meta. “Most worlds, like VRChat, Decentraland, Sandbox and others present avatars with legs, although you don’t necessarily have sensors with legs.” Those platforms get around the problem by “pretending”—before Marciniak corrects herself to “assuming the position of users’ legs.”

Den Butter says that the lack of legs in major mainstream metaverse platforms isn’t because of a lack of processing power. “Legs, like all moving parts, are basically built out of a kinematic model,” he says. “The mathematical models of hands are quite heavy, but for legs, it’s just a few points that need to be processed.”
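To make that concrete, here is a minimal sketch of the kind of lightweight kinematic model Den Butter describes. It is our own illustration rather than any platform's actual code, and it treats a leg as a simple two-segment chain (thigh and shin) whose knee and foot positions follow from just two joint angles:

```python
import math

# Illustrative sketch only: a leg modeled as a two-segment kinematic chain.
# Given a hip angle and a knee angle (measured from vertical), forward
# kinematics yields the knee and foot positions from a handful of numbers,
# which is the sense in which the math for legs is "light."

def leg_forward_kinematics(hip_xy, hip_angle, knee_angle,
                           thigh_len=0.45, shin_len=0.45):
    """Return (knee_xy, foot_xy) for a 2D leg; angles in radians."""
    hx, hy = hip_xy
    # Knee: rotate the thigh segment about the hip.
    kx = hx + thigh_len * math.sin(hip_angle)
    ky = hy - thigh_len * math.cos(hip_angle)
    # Foot: the shin angle is measured relative to the thigh.
    total = hip_angle + knee_angle
    fx = kx + shin_len * math.sin(total)
    fy = ky - shin_len * math.cos(total)
    return (kx, ky), (fx, fy)

# Example: hip one meter up, leg swung slightly forward, knee bent back.
knee, foot = leg_forward_kinematics((0.0, 1.0), math.radians(20), math.radians(-30))
print(knee, foot)
```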

He says low-end existing hardware like an Azure Kinect or a Wii camera could capture the relevant data points—meaning that transmitting and processing that data for rendering in the metaverse, whether locally or through edge computing, isn’t likely to cause much lag.

Instead, he and Marciniak lay the blame for the lack of legs on hardware limitations, and specifically on the inability of existing head-worn devices to see the lower body.

That’s likely to change soon, however. In December 2021, sneaker company Nike bought RTFKT, a move that Marciniak believes could be the first step towards foot-worn controllers that complement headsets. “They may be working on real-life shoes or socks with sensors that would be connected to the VR headsets,” she hypothesizes.


Take it on the Otherside

One metaverse that doesn’t look like all the others is Otherside, from Bored Ape Yacht Club creators Yuga Labs. Built around Improbable's M2 engine, Otherside looks like it belongs in 2022—which is no mean feat, according to those who designed it.

“We don’t just throw a platform in the way of our partners,” Rob Whitehead, co-founder and chief product officer of Improbable, tells Decrypt. Instead, Improbable works with partners on what they want out of the metaverse and designs around that. “There are some amazing projects but they look like you took an app and tried to make a metaverse out of it,” he says. “It looks like it’s sleek, but we come more from taking game-like experiences and making them more game-like and metaversal.”

Improbable devoted hours of research and development to its M2 engine to enable it to render tens of thousands of unique characters using machine learning techniques that push processing onto users’ GPUs, rather than sending the data through the cloud. “The problem is, if you double the amount of people in a dense space, you quadruple the amount of data you have to send,” says Whitehead.
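As a rough back-of-the-envelope illustration of that quadratic growth (our own simplification, not Improbable's actual networking model), assume every avatar's position updates have to reach every other avatar sharing the same dense space:

```python
# Illustrative only: assumes naive fan-out of avatar updates, not
# Improbable's actual networking model. With n avatars in one dense space,
# each update must reach the other n - 1, so traffic grows with
# n * (n - 1): double the crowd and the data roughly quadruples.

def updates_per_tick(n_avatars: int) -> int:
    """Pairwise update messages needed for n avatars sharing one space."""
    return n_avatars * (n_avatars - 1)

for n in (100, 200, 400):
    print(f"{n} avatars -> {updates_per_tick(n):,} messages per tick")
# 100 avatars -> 9,900 messages per tick
# 200 avatars -> 39,800 messages per tick   (roughly 4x)
# 400 avatars -> 159,600 messages per tick
```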

Whether other metaverses will rethink their approach to visuals is another question entirely. But it’s something that’s likely to become an increasingly pressing question, if the metaverse is to achieve the mainstream adoption that its proponents want.
