
Why the Metaverse Is Everything Everywhere All at Once

by Theo Priestley, June 4th, 2023



Around 8 months ago I wrote a post explaining how the metaverse is not a singular destination but multiple layers of digital reality, available all around us to tap into whenever and however we choose. This is largely why so many projects have already failed, including Meta’s own Horizon Worlds, which wanted us to log into a virtual rendering of the same environments we inhabit today.


I mean, why the fuck would you want to hold a meeting in a virtual office filled with approximated avatars of yourselves and be forced to wear headsets in order to converse?


Decentraland, VRChat, NeosVR, The Sandbox, Minecraft, Roblox, and Fortnite…they’re all variations on the same theme and experience. Still, you can’t take them with you wherever you go or access them however you want. When pundits argue over whether augmented reality, mixed reality, virtual reality, or extended reality will win out over the others, what they’re really doing is arguing over which layer of the metaverse will be more dominant than another, and that is entirely the wrong way to look at it.


In the wake of AWE2023 and the talks around the technology and form factors, user privacy, and the future of an augmented world, I’m reminded of the video ‘Hyper Reality’ by Keiichi Matsuda.



Behold the future of the metaverse…or maybe not



To enable the metaverse, we require hardware that can be comfortably worn for extended periods, overlaying digital filters onto our physical reality. We don’t have that technology yet, but XReal glasses represent a promising step toward it. They have the potential to evolve into a device that combines augmented reality and virtual reality, providing immersive experiences wherever you are; by completely blocking out the wearer’s vision, they can also drop you into a fully 3D virtual environment. This concept not only enhances mobility but also leverages the social cues we use in our daily lives. At a recent event, conversations I had sparked the idea of wearing these glasses on public transport and transporting yourself to another virtual space during the journey.


In a rather dystopian picture, buses and trains will be filled with people wearing blackout sunglasses and not talking to each other…but that’s hardly a departure from where we are today, is it?



We’re so fucking anti-social etc etc



Being able to flip between different layers of reality is what will make the metaverse become what it is meant to be. Forget Snow Crash and Ready Player One; they’ve become a distracted and tired Hollywood vision of what the metaverse is supposed to be for everyone, not just a few. Watch Hyper Reality and you’ll see the worst of what augmented reality could represent without user control. This is where Web3 comes into play: within this metaverse context, the ideology that data and identity sovereignty sit once again under the control of the individual reigns supreme.


But it is a choice.


People still living with Web2 may wish for a future where they receive hyper-personalization and are bombarded by brands every minute in every layer, whereas people embracing Web3 will ultimately control just how much of that filters through to them.


In a new video, “Privacy Lost”, we see an augmented world where users have no control and their data is used to coerce and manipulate them into buying decisions. It’s very Black Mirror, and insidious. But it need not be this way; we should choose what we want to engage with.


It’s Push vs Pull, and at heart it is the very argument between Web2 and Web3.


Paradise at the Lost and Found?



In Web2 we’re being pushed towards every decision by the data we’ve relinquished for the sake of perceived convenience. In Web3 we pull the information we want, when we want it, to make informed decisions, and we lose none of the privacy or convenience.


This will also bring headaches to brands and marketers trying to understand how to bring a customer through an experience that transcends realities. How the customer moves through this experience is now up to them — there is no funnel. The customer journey can therefore begin anywhere, move anywhere, and end anywhere.


When you consider the implications here, you begin to understand why, when some tech journo claims that Fortnite or Horizon Worlds or Decentraland is the metaverse, they’re talking a load of bullshit.


Now, Nike is loading NFT sneakers into EA Sports games for players to use on their avatars, and we’re getting close to understanding where Web3 and some blockchain technology can be applied in the right ways. Wearing NFT sneakers on a gamer avatar is one thing, but it’s locked away in an environment the majority will never experience or want to. But imagine wearing them on a universal avatar that anyone can see, almost every day, if they encounter you in the street while wearing glasses. Your own user filter could be set to portray you as an avatar to everyone else wearing glasses and engaged in the metaverse, or set within certain privacy parameters (i.e. people you know may see you as you are, others may see some blend of physical and digital to protect your identity, and brands may see nothing at all).
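
To make that concrete, here is a rough sketch of how such a user filter might look in code. Everything in it is hypothetical: the relationship groups, the appearance types, and the policy itself are my own illustration of the idea, not any real platform’s API.

```typescript
// Hypothetical sketch of a wearer-controlled visibility policy.
// The wearer decides what each class of viewer gets to see.

type Relationship = "known" | "stranger" | "brand";

type Appearance =
  | { kind: "physical" }                          // see the person as they are
  | { kind: "blended"; digitalLayers: string[] }  // a mix of physical and digital
  | { kind: "hidden" };                           // render nothing at all

interface VisibilityPolicy {
  resolve(viewer: Relationship): Appearance;
}

// A default policy matching the example above: people you know see you as
// you are, strangers see a partial blend, brands see nothing.
const defaultPolicy: VisibilityPolicy = {
  resolve(viewer) {
    const rules: Record<Relationship, Appearance> = {
      known: { kind: "physical" },
      stranger: { kind: "blended", digitalLayers: ["avatar-shell", "nft-sneakers"] },
      brand: { kind: "hidden" },
    };
    return rules[viewer];
  },
};

console.log(defaultPolicy.resolve("stranger"));
// -> { kind: "blended", digitalLayers: [ "avatar-shell", "nft-sneakers" ] }
```

The design point that matters is that the viewer’s glasses ask the wearer’s policy what to render; the wearer, not the platform or the brand, holds the switch.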



Interoperability so far has been about moving your NFT receipt from one Web3 video game to another, or claiming that you can take a digital asset you “own” from one world and have it work and behave exactly as it should in another.


Let me paint another picture of interoperability that I haven’t heard anyone really discuss at all — the interoperability of assets between realities.


Interoperability should be about everything between these layers, not just assets, and that includes your identity. You can set your identity to reveal aspects of yourself and the associated personal data seamlessly as you move between one reality and another. In order to handle this, we need both a new type of operating system specifically designed for this kind of digital inception and an AI capable of orchestrating this level of data and interaction.
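
As a thought experiment, the identity side of that could reduce to something as plain as the sketch below: a mapping from each reality layer to the identity attributes it is allowed to read. The layer names, the attributes, and the `disclose` function are all assumptions made for illustration; no such standard exists today.

```typescript
// Hypothetical sketch of layer-aware identity disclosure.
// Moving between layers changes which attributes are revealed.

type RealityLayer = "physical" | "augmented-public" | "vr-social" | "commerce";

interface Identity {
  legalName: string;
  avatarHandle: string;
  walletAddress: string;
  interests: string[];
}

// Which identity attributes each layer may read, set by the user.
const disclosureRules: Record<RealityLayer, (keyof Identity)[]> = {
  physical: ["legalName"],
  "augmented-public": ["avatarHandle"],
  "vr-social": ["avatarHandle", "interests"],
  commerce: [], // brands see nothing unless explicitly opted in
};

// Project the full identity down to what the active layer is allowed to see.
function disclose(identity: Identity, layer: RealityLayer): Partial<Identity> {
  const visible: Partial<Identity> = {};
  for (const key of disclosureRules[layer]) {
    (visible as Record<string, unknown>)[key] = identity[key];
  }
  return visible;
}

const me: Identity = {
  legalName: "A. Person",
  avatarHandle: "@layered",
  walletAddress: "0x0000000000000000000000000000000000000000",
  interests: ["spatial computing", "web3"],
};

console.log(disclose(me, "augmented-public")); // -> { avatarHandle: "@layered" }
console.log(disclose(me, "commerce"));         // -> {}
```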


None of these pieces exists yet, and where they do, they live in their own silos rather than building towards a more cohesive puzzle.


  1. We need a wearable device that can operate between realities and reach mainstream adoption as widely as the mobile phone has, while ultimately removing our reliance on that screen itself. This means that even the newest crop of devices, like the Meta Quest 3 and Apple’s XR device about to be unveiled at the time of writing, just won’t cut it.


  2. We need a completely new operating system, not another layer one blockchain like Lamina1, that powers a spatial universe that permeates our own reality and adds multiple dimensions to it.


  3. We need filters, not applications, that allow us to engage in ways under our control. The metaverse is already filled with the information, experiences, and services that we use on a daily basis, every minute of every day; we just apply the filters in order to access them (see the sketch after this list).


    This is a paradigm shift: thinking about everything, everywhere, all at once rather than subscribing to and downloading apps as we do today. This is why we need new thinking around operating systems.


    Back in 2015, Google hinted at the death of the app by streaming the app from a virtual machine in the cloud rather than having you download it. It never took off, but in a way it pointed at where the metaverse needs to go.


  4. Web3 and Artificial Intelligence are absolutely core, not only to delivering and managing our transitions between realities but also to preserving our control and our privacy. This will be the hardest part to understand and build, primarily because in the early days the Web2 world will dominate the layered reality we forge.
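
To show what “filters, not applications” could mean in practice (the sketch promised in point 3), here is a minimal illustration: filters as composable transforms over one shared reality stream, toggled by the user rather than installed as separate apps. Every name below is hypothetical; no real SDK is implied.

```typescript
// Hypothetical sketch of filters composed over a single shared scene.

interface SceneObject {
  id: string;
  label: string;
  source: "physical" | "digital";
}

// A filter is just a transform over what the wearer currently perceives.
type Filter = (scene: SceneObject[]) => SceneObject[];

// Example filters the user might toggle on or off.
const hideAds: Filter = (scene) =>
  scene.filter((obj) => !obj.label.startsWith("ad:"));

const showTransitInfo: Filter = (scene) => [
  ...scene,
  { id: "transit-overlay", label: "Next train: 4 min", source: "digital" },
];

// Filters compose like functions; "your metaverse" is whatever stack is active.
const compose =
  (...filters: Filter[]): Filter =>
  (scene) =>
    filters.reduce((current, f) => f(current), scene);

const myView = compose(hideAds, showTransitInfo);

const rawScene: SceneObject[] = [
  { id: "street-1", label: "street", source: "physical" },
  { id: "ad-1", label: "ad:soda-billboard", source: "digital" },
];

console.log(myView(rawScene).map((o) => o.label));
// -> [ "street", "Next train: 4 min" ]
```

Under this framing there is nothing to download per service; you subscribe to or dismiss layers of the same world, which is exactly why the operating system, not the app store, is where the rethink has to happen.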



The “influencers” are dumping the metaverse and Web3 for the current AI hype cycle. Good riddance; you’re missing the big picture, not unlike Neal Stephenson on his personal quest to build the metaverse in the image of Snow Crash.


Eventually, we may get the metaverse. But unless these layered realities all contain the same usefulness as the existing web, with the mobility we enjoy today, inside spatial and immersive environments that we control, we’re going to end up with more fragmented visions of Fortnite on top of Hyper Reality.



Also published here.


The lead image for this article was generated by HackerNoon's AI Image Generator via the prompt "Everything everywhere all at once."