
New Waveguide Tech From VividQ and Dispelix Promises New Era in AR

Waveguide AR displays can still feel “flat.” A new component could change that.


True holograms have largely been deemed impossible. However, “possible” and “impossible” are constantly shifting landscapes in immersive technology. Dispelix and VividQ have reportedly achieved holographic displays through a new waveguide device – and the companies are bringing these displays to consumers.

A Little Background

“Hologram” is a term often used in technology because it’s one that people are familiar with from science fiction. However, science fiction remains almost exclusively the realm in which holograms reside. A hologram is a three-dimensional image – not an image that merely appears three-dimensional, but one that actually has height, width, and depth.

These days, people are increasingly familiar with augmented reality through “passthrough.” In this approach, a VR headset records your surroundings and you view a live feed of that recording augmented with digital effects. The image is still flat. Through techno-wizardry, the virtual elements may appear to occupy different spaces or sit at different depths, but they don’t.

AR glasses typically use a combination of waveguide lenses and a tiny projector called a light engine. The light engine projects digital effects onto the waveguide, which the wearer looks through. This means lighter displays that don’t rely on camera resolution for a good user experience.


Most waveguide AR products still reproduce a flat image. These devices, typically used for virtual screens or screen mirroring from a paired device, often include spatial controls like ray casting but are arguably not “true” augmented reality and are sometimes referred to as “viewers” rather than “AR glasses.”

Some high-end waveguide headsets – almost exclusively used in enterprise and defense – achieve more immersive AR, but the virtual elements are still on a single focal plane. This limits immersion and can contribute to the feelings of sickness felt by some XR users. These devices also have a much larger form factor.

These are the issues addressed by the new technology from Dispelix and VividQ – and their materials specifically mention addressing them for consumer use cases like gaming.

Bringing Variable-Depth 3D Content to AR

Working together, VividQ and Dispelix have developed a “waveguide combiner” that is able to “accurately display simultaneous variable-depth 3D content within a user’s environment” in a usable form factor. This reportedly increases user comfort as well as immersion.

“Variable-depth 3D content” means that users can place virtual objects in their environment and interact with them naturally. That is opposed to needing to work around the virtual object rather than with it because the virtual object is displayed on a fixed focal plane.

VividQ 3D waveguide

“A fundamental issue has always been the complexity of displaying 3D images placed in the real world with a decent field of view and with an eyebox that is large enough to accommodate a wide range of IPDs [interpupillary distances], all encased within a lightweight lens,” VividQ CEO, Darran Milne, said in a release shared with ARPost. “We’ve solved that problem.”

VividQ and Dispelix have not only developed this technology but have also formed a commercial partnership to bring it to market and bring it to mass production. The physical device is designed to work with VividQ’s software, compatible with major game engines including Unity and Unreal Engine.

“Wearable AR devices have huge potential all around the world. For applications such as gaming and professional use, where the user needs to be immersed for long periods of time, it is vital that content is true 3D and placed within the user’s environment,” Dispelix CEO and co-founder, Antti Sunnari, said in the release. “We are thrilled to be working with VividQ.”

When Waveguides Feel Like a Mirage

Both companies have been building toward this breakthrough for a long time. Virtually every time that ARPost has covered Dispelix, the coverage has at least touched on a partnership with another company – typical for a components manufacturer. New product announcements are comparatively rare and are always the result of a lot of hard work.

“The ability to display 3D images through a waveguide is a widely known barrier to [a compelling AR wearable device],” VividQ Head of Research, Alfred Newman, said in an email. “To realize the full capability, we needed to work with a partner capable of developing something that worked with our exact specifications.”

Of course, those who have been following immersive tech for a while will understand that a breakthrough achieved through years of hard work usually takes years more hard work to reach the public. Devices using this groundbreaking technology might not reach shelves for a few more years. Again, Newman explains:

“We license the technology stack to device manufacturers and support them as they develop their products so the timeframe for launching devices is dependent on their product development. …Typically, new products take about two to three years to develop, manufacture, and launch, so we expect a similar time frame until consumers can pick a device off the shelf.”

Don’t Let the Perfect Be the Enemy of the Good

Waiting for the hardware to improve is a classic mass adoption trap, particularly in the consumer space. If your takeaway is that you have to wait two to three years for impactful AR, you’ve missed the message.

There are a lot of quality hardware and experience options in the AR space already – many of them already enabled by Dispelix and VividQ. If you want natural, immersive, true-3D waveguides, wait two or three years. If you want to experience AR today, you have options in already-available waveguide AR glasses or via passthrough on VR headsets.

Jon Jaehnig
Jon Jaehnig is a freelance journalist with special interest in emerging technologies. Jon has a degree in Scientific and Technical Communication from Michigan Technological University and lives in Michigan’s Upper Peninsula. If you have a story suggestion for Jon, you may contact him here.