
Things Have Changed Podcast

February 23, 2021
2 min read

This month, VividQ COO Aleksandra Pedraszewska joined Adrian, Jed and Shikher on ‘Things Have Changed’, a podcast that aims to share insights, challenge ideas, and expand perspectives on the world.

In the episode ‘Are Holographic Displays the Future of Display Technology?’, Aleksandra shared how our origin story began at the University of Cambridge and how technology is at the centre of VividQ’s innovation.

To explore the concept of computer-generated holography, Aleksandra reflected on the question: how do we represent digital data today? We currently consume information by viewing images on flat panel screens, embedded with colourful pixels, in our personal devices. However, this is not how the human visual system learns about the world around us. When we look at any object in a physical environment, we are actually looking at a pattern of light. When we hold an apple in our hand and look at it, waves of light from the sources around us reflect off it and create a pattern, known as a wavefront, which our visual system interprets as a physical object thanks to specific depth cues.

So what needs to be done to create a digital display that represents the real world perfectly? The answer: calculate the pattern of light that, when presented on a display device with very small pixels, the eye will interpret as a three-dimensional object. That is how holographic images, commonly known as ‘holograms’, are created.
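To make the idea above concrete, here is a minimal toy sketch (not VividQ's actual algorithm) of what "calculating the pattern of light" means for the simplest possible scene, a single point of light behind the display. Each display pixel records the phase of the light wave arriving from that point; a phase-modulating display reproducing this pattern would reconstruct the point in 3D. The wavelength, pixel pitch, and grid size are illustrative assumptions.

```python
import numpy as np

# Toy hologram of a single point source, for illustration only.
wavelength = 520e-9   # green light, in metres (assumed value)
pixel_pitch = 8e-6    # spacing of the display's "very small pixels"
n = 512               # pixels per side of the display grid
z = 0.1               # point source 10 cm behind the display

# Coordinates of each display pixel, centred on the optical axis
coords = (np.arange(n) - n / 2) * pixel_pitch
x, y = np.meshgrid(coords, coords)

# Distance from the point source to each pixel, and the phase of the
# wavefront arriving there (2*pi per wavelength of path travelled)
r = np.sqrt(x**2 + y**2 + z**2)
phase = (2 * np.pi / wavelength) * r

# A phase hologram: wrap the phase into [0, 2*pi) for the display
hologram = np.mod(phase, 2 * np.pi)

print(hologram.shape)
```

A real scene is far more complex, since the wavefronts of every visible point must be superposed, and doing that in real time is the computational challenge computer-generated holography addresses.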

Aleksandra also discussed:

  • How VividQ will power holographic in-vehicle HUDs (head-up displays) by 2022. Automotive HUDs are one application where projecting virtual images at the correct depth is vital to long-term success.
  • How future consumer AR devices must be intuitive to achieve mass adoption. Technology companies will not need to actively persuade consumers to use holography: the technology’s inherent benefits - a full depth of field, interaction at arm’s length, no eye fatigue - will make it the obvious choice.
  • How VividQ’s collaboration with Arm achieved a significant milestone last year: demonstrating real-time computer-generated holography on a mobile GPU for the first time. With Arm’s expertise, VividQ can power holographic displays on mobile devices that use an Arm Mali GPU. Previously, this was possible only on devices at least the size of a laptop.

Listen to the full podcast on Apple Podcasts, Spotify or the Things Have Changed Website.

To learn more about holography and the future of AR, download our whitepaper here.