The Space Between Fact and Fiction: A Metaverse Exegesis

Interoperability is one of the aspects responsible for a well-functioning Metaverse.

Tue Oct 04 2022


The Metaverse is firmly in the mainstream, and people are embracing innovations that promise to bring them a step closer to it. It therefore becomes essential to work out which parts of the fantasy are hype and which are real. Here we discuss two foundational aspects that have the potential to shape how the technology evolves: interoperability and Metaverse architecture.

Interoperability

Interoperability is one of the aspects responsible for a well-functioning metaverse. Many of the platforms existing today are not interoperable, which restricts the sharing of experiences, information, and other resources across them. A brief look at the history of the internet helps illustrate this factor:

  1. The 1970s – Internet availability was limited to research centres and universities
  2. The 1990s – “Proto-internets” came into existence through AOL, BBS, etc.
  3. The 2000s – The internet matured towards being interoperable

Until we have smooth interoperability, there will be hindrances to sharing information and community, which will ultimately lead to a futile metaverse. However, this factor is still ambiguous because of the tension between vendors and users. To keep users in their environment and recoup their investments, vendors would prefer a non-interoperable metaverse. On the other hand, to expand value, users and brands would pick an interoperable one.
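
To make that tension concrete, here is a minimal sketch (a hypothetical example, not drawn from the article; the schema, identifiers, and URLs are invented) of how a platform-neutral asset description could let an avatar created on one platform be imported by another, while a platform that does not support the shared schema simply locks the asset in.

```python
import json

# A hypothetical, platform-neutral description of a user-owned asset.
# In an interoperable metaverse any platform can import this record;
# in a walled garden the asset stays locked to the vendor that sold it.
portable_avatar = {
    "schema": "example.org/portable-asset/v1",   # assumed common schema
    "asset_type": "avatar",
    "owner_id": "did:example:alice",             # illustrative identifier
    "mesh_url": "https://assets.example.org/alice/avatar.glb",
    "attributes": {"height_m": 1.7, "style": "low-poly"},
}

def export_asset(asset: dict) -> str:
    """Serialise the asset to a neutral format any platform can parse."""
    return json.dumps(asset, indent=2)

def import_asset(payload: str, supported_schemas: set) -> dict:
    """A receiving platform accepts the asset only if it understands the schema."""
    asset = json.loads(payload)
    if asset["schema"] not in supported_schemas:
        raise ValueError("Asset schema not supported: no interoperability")
    return asset

# Platform B can reuse the avatar created on Platform A because both
# platforms agree on the shared schema.
payload = export_asset(portable_avatar)
imported = import_asset(payload, supported_schemas={"example.org/portable-asset/v1"})
print(imported["mesh_url"])
```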

Metaverse Architecture

A six-layer framework has been developed to cover the value chain of the Metaverse: the top-most layer corresponds to new user experiences, while the lower layers cover the hardware, software, and enabling technologies underneath.
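
As a rough mental model (the ordering below simply restates the framework described in this article; the data structure itself is illustrative), the six layers can be treated as an ordered stack from the user-facing top to the enabling technologies at the bottom.

```python
# The six-layer value chain as an ordered stack, top layer first.
# Layer names follow the article; this encoding is only a mental model.
METAVERSE_STACK = [
    (1, "Experience", "new use cases, experiences, and business models"),
    (2, "Human-Machine Interface", "input and output devices connecting people to XR"),
    (3, "Extended Reality", "AR, MR, AV, and VR presentation of the virtual world"),
    (4, "World Engine", "graphics, presence, logic, and physics engines"),
    (5, "Infrastructure", "compute, communications, and cloud"),
    (6, "Key Enablers", "IoT, AI, blockchain, and cybersecurity"),
]

for level, name, summary in METAVERSE_STACK:
    print(f"Layer {level}: {name} – {summary}")
```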

Layer 1: Experience (Experience Continuum)

The uppermost layer in the stack brings together all the new use cases, experiences, and business models, both existing and upcoming. These use cases diminish the boundaries between the real and the virtual. There are three broad categories of use cases:

  1. Consumer – Socializing and entertainment
  2. Enterprise – Collaboration
  3. Industrial – Modeling a production line or collaborating around a digital twin

Layer 2: Human-Machine Interface (HMI)

This layer acts as a gateway for humans to perceive and interact with the extended reality of the third layer. HMIs combine hardware and software that let users send inputs to the machine while the machine returns outputs to the user, forming a continuous interaction loop (a minimal sketch of this loop follows the lists below).

  1. Input
    1. Non-hands-free Interfaces – Keyboard, mouse, touch screen
    2. Computer Vision – Hands-free real-time gesture recognition interaction
    3. Digital Textiles – Hands-free interaction using smart textiles
    4. On-body Interaction – Human body as an interactive surface
    5. Brain-Computer Interfaces – Sending inputs to machines through thinking
  2. Output
    1. VR/AR Headsets – Immersive VR experience through sight and sound
    2. Holography – A 3D image of the subject viewed in the physical world without using special glasses
    3. Haptic Interfaces – Complementing the experience with haptic feedback
    4. Brain-Computer Interfaces – Sensory outputs to sense organs
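
As a toy illustration of that interaction loop (not a real device API; the gesture commands and the haptic response are invented stand-ins for the interfaces listed above), the sketch below polls an input interface, lets the machine process the command, and pushes feedback back to the user.

```python
import random
import time

# Toy HMI loop: read an input, let the machine process it, return an output.
def read_input() -> str:
    """Stand-in for a hands-free interface, e.g. gesture recognition."""
    return random.choice(["swipe_left", "swipe_right", "pinch"])

def process(command: str) -> str:
    """The machine interprets the input and decides what to present."""
    return {"swipe_left": "previous scene",
            "swipe_right": "next scene",
            "pinch": "zoom"}[command]

def render_output(result: str) -> None:
    """Stand-in for an output interface, e.g. a headset plus haptic feedback."""
    print(f"Displaying: {result} (with haptic pulse)")

# Three iterations of the loop, for demonstration.
for _ in range(3):
    render_output(process(read_input()))
    time.sleep(0.1)
```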

Layer 3: Extended Reality (XR)

The XR layer offers an immersive experience by augmenting reality. XR combines physical objects with one or more layers of computer-generated virtual data, information, or presentation. XR includes AR, MR, AV, and VR.

A. Augmented Reality (AR) Virtual data about objects is overlaid on real-world entities. This technology superimposes layers on top of reality without taking the context into account. Leading players here include Niantic, SightCall, Streem, et al.

B. Mixed Reality (MR) MR refers to the intertwining of the real and digital worlds. Here both worlds are merged so that one can interact with the other. Key players here are Skywell Software, the US Air Force Research Laboratory, etc.

C. Augmented Virtuality (AV) When real objects/information are placed in the virtual environment, we call it augmented virtuality. The leading player in the AV space is the Blacksburg Tactical Research Centre.

D. Virtual Reality (VR) VR involves an entirely simulated immersive environment that may or may not be similar to the physical world. The VR world can be experienced through sight, hearing, and other senses.
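
One way to keep the four variants straight (a simplification layered on top of the definitions above, not something from the source) is to order them along the reality–virtuality continuum by how much of the scene is computer-generated, as in the short sketch below.

```python
# XR variants ordered from mostly-real (AR) to fully virtual (VR).
# Descriptions paraphrase the definitions above; only the ordering is added.
XR_CONTINUUM = [
    ("AR", "virtual layers overlaid on the real world, with little context awareness"),
    ("MR", "real and virtual merged so that each can interact with the other"),
    ("AV", "real objects or information placed inside a virtual environment"),
    ("VR", "a fully simulated environment, independent of the physical world"),
]

def describe(mode: str) -> str:
    """Return the short description for an XR mode (AR, MR, AV, or VR)."""
    return dict(XR_CONTINUUM)[mode]

print(describe("AV"))
```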

Layer 4: World Engine

This layer contains all the software that assists in the development of the virtual world, virtual objects, digital twins, and digital humans. World engine technology is still incipient. Companies exploring this area include Dassault Systèmes, Epic Games, Nuke, NVIDIA, and Unity.

A. Graphics Engine It refers to the software modules necessary for creating and rendering visual layers of the virtual world. Players in this domain are CRYENGINE, Max, Unity, Amazon Lumberyard, etc.

B. Presence Engine This includes the software required to provide an accurate simulation of presence in the digital world, taking into account the laws of the physical world. An example could be 4DX cinemas.

C. Logic Engine This comprises the software modules responsible for logic, encompassing both the simulation of non-player characters and process interactions. Leaders here are Unity, Unreal Engine, 3DS MAX, and those mentioned under Graphics Engine.

D. Physics Engine This contains all the software required to provide realistic physical simulations. The same array of players appears here as in the logic and graphics engines.
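
To show how these modules typically fit together (a generic game-loop pattern, not the internals of any engine named above; the entity and its behaviour are invented), the sketch below runs a few simulation ticks in which the logic, physics, and graphics stages each take a turn.

```python
from dataclasses import dataclass

# A generic world-engine tick: logic decides intent, physics integrates motion,
# graphics renders the result. This mirrors the common game-loop pattern.
@dataclass
class Entity:
    name: str
    y: float = 10.0        # height in metres
    vy: float = 0.0        # vertical velocity in m/s

GRAVITY = -9.81            # m/s^2, the kind of rule a physics engine enforces

def logic_step(entity: Entity) -> None:
    """Logic engine: e.g. a non-player character decides to jump when grounded."""
    if entity.y <= 0.0:
        entity.vy = 5.0

def physics_step(entity: Entity, dt: float) -> None:
    """Physics engine: integrate velocity and position under gravity."""
    entity.vy += GRAVITY * dt
    entity.y = max(0.0, entity.y + entity.vy * dt)

def render_step(entity: Entity) -> None:
    """Graphics engine: draw the current state (stubbed as a print)."""
    print(f"{entity.name}: y = {entity.y:.2f} m")

ball = Entity("npc_ball")
for _ in range(5):                  # five ticks at 60 frames per second
    logic_step(ball)
    physics_step(ball, dt=1 / 60)
    render_step(ball)
```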

Layer 5: Infrastructure

Infrastructure corresponds to the physical infrastructure enabling real-time collection and processing of data, communications, representations, and reactions. It is the foundation through which the three vital properties of the Metaverse are achieved, viz., immersion, interaction, and persistence. Early participants in this sector include MTN Group, SK Telecom, Verizon, Vodafone, etc. Areas where infrastructure needs expansion are local computing power, communications, and cloud computing.

Layer 6: Key Enablers

The sixth layer consists of all software and technologies that are necessary for the smooth functioning of the above layers. There are four major key enablers:

A. Internet of Things (IoT) It refers to the physical objects that are embedded with sensors, processing ability, and controlling software to enable interoperability. It enables digital twin (asset management), simulation (configure and manage physical devices), and AR/VR (linking real and virtual objects).
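
A minimal illustration of the digital-twin idea mentioned above (the device name and sensor readings are invented for the example): a physical asset pushes readings, and the twin keeps an up-to-date virtual copy that simulations or AR/VR layers can query instead of the hardware.

```python
import time

class DigitalTwin:
    """A minimal digital twin that mirrors the latest state reported by a
    physical device, so other layers can read the virtual copy."""

    def __init__(self, device_id: str):
        self.device_id = device_id
        self.state = {}
        self.last_update = None

    def ingest(self, reading: dict) -> None:
        """Update the twin with a new sensor reading from the physical asset."""
        self.state.update(reading)
        self.last_update = time.time()

    def query(self, key: str):
        """Let other layers (simulation, AR/VR) read the mirrored state."""
        return self.state.get(key)

twin = DigitalTwin("conveyor-belt-07")
twin.ingest({"temperature_c": 41.5, "speed_m_per_s": 1.2})
print(twin.query("temperature_c"))   # 41.5
```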

B. Artificial Intelligence (AI) AI refers to technologies that allow machines to learn and re-learn from past experiences to achieve complex goals. It enables the simulation of digital twins, realistic avatars, and computer agents (mimicking the behaviour of non-human characters).

C. Blockchain Blockchain refers to a digital ledger made up of blocks that are interconnected through cryptography. It enables asset ownership as well as identity and authentication.
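
As a toy illustration of what "blocks interconnected through cryptography" means (a minimal teaching sketch, not a production ledger; the asset and owner names are invented), each block below stores the hash of its predecessor, so tampering with any earlier record breaks the chain.

```python
import hashlib
import json

# A toy hash-chained ledger: each block commits to the previous block's hash,
# so altering any past entry invalidates every block after it.
def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, data: dict) -> None:
    prev_hash = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev_hash})

def is_valid(chain: list) -> bool:
    """Verify that every block still points at the true hash of its predecessor."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger = []
add_block(ledger, {"asset": "avatar-skin-42", "owner": "alice"})
add_block(ledger, {"asset": "avatar-skin-42", "owner": "bob"})   # ownership transfer
print(is_valid(ledger))                  # True

ledger[0]["data"]["owner"] = "mallory"   # tamper with history
print(is_valid(ledger))                  # False: the chain detects the change
```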

D. Cybersecurity Cybersecurity is a broad term that takes into account the protection of computer systems and networks from information disclosure, theft, or damage. It enables security and interconnectedness.

Conclusion

Interoperability and Metaverse architecture are two aspects that will shape the future of the Metaverse and ultimately determine the gap between fact and fiction.

References: The Metaverse Beyond Fantasy (2022) | Arthur D. Little
