
The Gateways to the Metaverse are Holographic

August 27, 2021

For a new technology to reach maturity, it needs to become invisible and blend seamlessly into our daily lives. We use once-futuristic inventions such as touchscreens in phones or machine learning in chatbots without noticing their inherent technological complexity. The same is true of the internet, which originated in academic research and governmental programmes because few private businesses had the computational talent, resources, and ambition to invest in, build, and utilise it.

What's the next big thing and how will it become "invisible"?

As the way we use technology becomes increasingly interconnected and spatial, we start to realise that the internet today is still very much two-dimensional. It’s one of the reasons why the foundations of what has become broadly known as the “Metaverse” are being laid. Coined in Snow Crash, Neal Stephenson’s 1992 sci-fi novel, the term refers to a convergence of physical, augmented, and virtual reality in a shared online space. The dominant technology companies are not only taking notice of it – they are actively working towards making the Metaverse happen by developing new applications, democratising content creation, and, most importantly, recognising the need for new, truly immersive hardware to act as gateways to the Metaverse. And as is often the case in the digital world, it is gaming that is taking the lead.

Follow the gamers

While to many the Metaverse can sound like the stuff of science-fiction novels, it is a well-established concept in the gaming world. Games like Roblox and Fortnite have been providing their players with complete virtual universes, Roblox for more than a decade. In 2020, Fortnite expanded its environment beyond gaming to host live concerts and events. Earlier this year, Epic Games, creator of Fortnite and the Unreal Engine, announced a $1 billion funding round to further expand its virtual landscapes and build a Metaverse. This world where “all IP can live together and all kinds of experiences can happen” has become Epic’s express goal.

Like in the early days of the internet, when anyone could put together a free HTML website in Notepad, the Metaverse needs to provide equally democratic ways of creating new content. The game-engine creators have, again, led the way, with Unreal rolling out tools that let users with no experience in programming or design contribute to shared virtual environments.

It's the hardware, stupid

Despite all that, we still have not seen the true “convergence of physical, augmented, and virtual” worlds. The reason was laid out by Mark Zuckerberg when he announced that Facebook’s overarching goal would become “to help bring the Metaverse to life”.

We're basically mediating our lives and our communication through small, glowing rectangles; that's not really how people are made to interact.
Mark Zuckerberg, Facebook CEO

Today, our gateways to the Metaverse are limited to two-dimensional laptop and smartphone screens. The current crop of AR devices is not much of an improvement, projecting mere flat overlays that cannot place virtual content accurately in the world. Techniques such as parallax and stereoscopy are used to create an illusion of three-dimensionality and depth, requiring users to stay completely still within a minuscule viewing zone to see the image. This often results in unintended physiological side effects such as headaches, eye strain, and nausea. A fundamental change in display technology is required to merge the digital and physical worlds and realise the vision of the Metaverse.

Set 3D content into the world with holography

Our eyes are used to seeing a continuous, three-dimensional world. Computer-Generated Holography brings this capability to digital displays. Holographic displays mimic the way people perceive the physical world by engineering light itself and, in doing so, retain all of the depth information of the digital content. Objects, characters, and blueprints are no longer limited to the confines of a screen. Information can be projected straight onto the real world, at the correct depth, and integrate seamlessly with the environment. Digital content can remain world-locked, whether it appears in a user’s hand or far on the horizon. And because holographic displays are so in tune with the eye, the physical discomfort usually associated with looking at such content is eliminated.
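To make the idea of “engineering light” a little more concrete, here is a minimal sketch of one classic textbook approach to Computer-Generated Holography: each 3D point in a scene contributes a spherical wavefront to the hologram plane, and summing those contributions preserves the depth of every point in a single 2D pattern. The point-source method, function names, and parameter values below are illustrative assumptions for this sketch, not a description of VividQ’s actual pipeline.

```python
import numpy as np

# Assumed example parameters (not from VividQ):
WAVELENGTH = 532e-9   # green laser, metres
PITCH = 8e-6          # hologram pixel pitch, metres
N = 256               # hologram resolution (N x N pixels)

def point_cloud_hologram(points):
    """Sum spherical wavefronts from 3D points into one phase pattern.

    points: iterable of (x, y, z, amplitude), coordinates in metres,
    with z the distance from the hologram plane.
    """
    k = 2 * np.pi / WAVELENGTH
    coords = (np.arange(N) - N / 2) * PITCH
    u, v = np.meshgrid(coords, coords)          # hologram-plane grid
    field = np.zeros((N, N), dtype=complex)
    for x, y, z, a in points:
        r = np.sqrt((u - x) ** 2 + (v - y) ** 2 + z ** 2)
        field += a * np.exp(1j * k * r)         # spherical wave from the point
    # Phase-only pattern, e.g. for a phase-modulating display element.
    return np.angle(field)

# Two points at different depths (10 cm and 25 cm) encoded in ONE hologram:
phase = point_cloud_hologram([(0, 0, 0.10, 1.0), (1e-4, 0, 0.25, 1.0)])
```

When this phase pattern modulates coherent light, each point is reconstructed at its original depth, which is what lets holographic content sit in a user’s hand or on the horizon rather than on a flat overlay.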

The pursuit of the Metaverse is no longer just about content and applications engaging a large enough number of users. Metaverse-oriented assets need to expand beyond that, to advanced haptic feedback technologies, brain-to-machine computing interfaces and, most importantly, an appropriate display medium. As is usually the case, advanced holographic devices capable of projecting three-dimensional digital images are attracting the high-performance gaming audience first. But they will soon be embraced by leaders in communications, for immersive human interactions, and in manufacturing, for representative digital twins, making the merging of the digital and physical worlds truly “invisible”.

To learn more about the application of Computer-Generated Holography in AR wearables, download our Whitepaper here.