Presenting the Path to the New Frontier in AR at AWE USA
November 8, 2021
Epic Games, Meta, Microsoft, Nvidia… These are just a few of the companies actively developing the ‘Metaverse’ concept - the term refers to a convergence of physical, augmented and virtual reality in a shared online space. Augmented reality (AR) technologies will define how we access and interact with this space. As the CEO of Niantic - the creators of Pokémon Go - recently explained, AR will be at the heart of the Metaverse, creating natural and immersive digital content that enhances human experiences rather than replacing them.
While AR wearables have come a long way since Google Glass first saw the light of day, their adoption is still limited. Most consumer electronics manufacturers have converged on industrial use cases for their AR devices, where users seem more willing to put up with the devices' technological limitations. However, with AR display technologies still stuck in a two-dimensional stereoscopic world, AR simply cannot move beyond the enterprise.
At the AWE USA 2021 conference in Santa Clara, CA, VividQ COO and co-founder Aleksandra Pedraszewska argued that AR needs to break into a new frontier: truly immersive gaming and entertainment applications that accelerate the advent of the Metaverse. To achieve this, the major user-experience challenges of AR must be resolved with Computer-Generated Holography: 3D projections that blend seamlessly with the real world.
AR needs realism. The physical space is 3D, so AR content needs to be 3D too. To enter the new frontier, AR headset manufacturers need to move beyond traditional stereoscopic display techniques and turn to solutions such as Computer-Generated Holography (CGH), which can provide real depth and natural focus and defocus in digital content.
AR needs intuitive interaction. Wearable AR has to provide functionality inaccessible on mobile. By adding genuine 3D to the next generation of AR wearables, we can open up a world of possibilities in how we interact with digital content. With CGH, virtual objects can be placed naturally at any depth plane and “world-locked” in relation to the physical world, making them easy and intuitive to move or touch. Holographic displays project content comfortably at arm’s length (from 20 cm to 1 meter), allowing full use of hand input.
AR shouldn’t make us sick. Current AR devices, which use stereoscopic techniques, project content at a single depth plane, usually at a focal distance far from the user. The absence of true depth in the AR scene causes eye strain and nausea in a large number of users, and confining virtual content to a single depth plane results in cognitive overload, limiting the amount of information that can be presented at once.
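The "real depth" claim above can be made concrete with a toy example. The sketch below is purely illustrative - it is not VividQ's algorithm or SDK - and shows the basic idea behind CGH: for each display pixel, compute the phase of light arriving from a virtual point source. When such a phase pattern is shown on a phase modulator under coherent illumination, the point reconstructs at its true physical distance, which is what gives holographic displays natural focus and defocus. All parameters (pixel pitch, wavelength, depth) are arbitrary example values.

```python
import numpy as np

def point_hologram_phase(width, height, pitch, wavelength, x0, y0, z):
    """Phase pattern (radians, wrapped to [0, 2*pi)) for a single
    virtual point source at (x0, y0, z) metres in front of the display.
    The result is the classic Fresnel zone pattern of that point."""
    # Pixel coordinates in metres, centred on the display.
    xs = (np.arange(width) - width / 2) * pitch
    ys = (np.arange(height) - height / 2) * pitch
    X, Y = np.meshgrid(xs, ys)
    # Optical path length from the point source to each pixel.
    r = np.sqrt((X - x0) ** 2 + (Y - y0) ** 2 + z ** 2)
    # Convert path length to phase and wrap to one wavelength.
    return (2 * np.pi / wavelength) * r % (2 * np.pi)

# Example: a point 30 cm away (within the comfortable arm's-length
# range mentioned above) on a small 8 um-pitch display with a
# 520 nm green source.
phase = point_hologram_phase(256, 256, 8e-6, 520e-9, 0.0, 0.0, 0.30)
print(phase.shape)  # (256, 256)
```

A full CGH pipeline sums contributions like this over millions of scene points (or uses FFT-based propagation to do it efficiently), which is why dedicated software such as VividQ's SDK is needed to generate holograms in real time.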
Creating truly three-dimensional projections that blend seamlessly with the real world is crucial for the future of AR gaming and entertainment, and the advent of the Metaverse. In her presentation, Aleksandra described how VividQ enables that with the upcoming Alpha SDK, bringing holography to next-generation AR devices.
Watch the full talk (20 min) here:
To learn more about the principles behind holography, how the technology is used in AR wearables and its path to consumer adoption, download our whitepaper here.