
Ray Tracing Comes to Snap Lens Studio

Ray tracing, an advanced rendering technique, is now available for mobile AR.


One of the most powerful recent breakthroughs in graphics and rendering is coming to mobile AR thanks to a recent update to Snap’s Lens Studio. We’re talking about ray tracing.

What Is Ray Tracing?

Ray tracing is a rendering technique that helps bring digital assets to life in their surroundings, whether those surroundings are fully digital or viewed in augmented reality. Recent examples in gaming include convincingly reflective surfaces like water, believable dynamic shadows, and improved lighting effects.
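At its core, ray tracing works by casting rays from the camera into the scene and computing where they hit geometry, so lighting, reflections, and shadows can be derived from real intersection points rather than approximations. As a rough illustration only (not Snap's implementation), the fundamental operation is an intersection test like this ray-sphere check:

```python
import math

def ray_sphere_intersect(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # nearer of the two roots
    return t if t >= 0 else None  # ignore hits behind the ray origin

# A ray fired along +z from the origin hits a unit sphere at (0, 0, 5)
# after traveling 4 units (it strikes the near surface at z = 4):
hit = ray_sphere_intersect((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
```

A real renderer runs tests like this for millions of rays per frame, then spawns secondary rays for reflections and shadows, which is why the technique is so demanding on hardware.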

The technique can be computationally heavy, which can be a problem depending on the program and how it is accessed. For example, when existing games are updated to use ray tracing, players running those games on older or less capable computers or consoles may have to turn the feature off to avoid performance problems.

Fortunately, ray tracing is maturing at the same time as new computing and connectivity approaches like cloud and edge computing. These advancements move the heavy lifting of demanding rendering techniques off the device, allowing older or less capable hardware to run high-end experiences smoothly.

See Also:  How Perceive Is Preparing the Future for AR Glasses With Edge Computing

While Snap's releases detailing the update didn't mention Lens Cloud, that feature is likely behind it. Lens Cloud, announced at the 2022 Snap Partner Summit (the same event at which ray tracing was first announced), provides improved off-device storage and compute, among other advancements.

The Road to Lens Studio

If you follow Snap closely, you've known for almost a year that this was coming. Snap also discussed ray tracing at the fifth annual Lens Fest in December. There we learned that the feature had been in the hands of select developers for a while, and that they had been working with Snap partners to create experiences pioneering the software.

The news announced yesterday is that the feature is now in Lens Studio, meaning that any Lens creator can use it. We also have a new demonstration of the technology: a Lens created with Snap partner Tiffany & Co.

Snap ray tracing - Tiffany & Co

Tiffany & Co. has likely been so involved in developing and showcasing Snap's ray tracing at least in part because the jewelry the company is known for is both a great challenge for the technology and an excellent demonstration of it. However, Snap is already looking forward to the feature finding other use cases.

“Now, Lenses that feature AR diamond jewelry, clothing and so much more can reach ultra-realistic quality,” Snap said in the announcement.

The principal use case presented by Snap in the announcement is virtual try-on for retail, like the Tiffany & Co. Lens. However, it is likely only a matter of time before the feature finds its way into other kinds of AR experiences as well.

What’s Next?

Ray tracing is likely to be a topic yet again at the upcoming Snap Partner Summit in April, and ARPost will be there to hear about it. The online event doesn't have the same energy as Lens Fest, but, as this update shows, the Partner Summit is often the first look at Snap's developing software offerings. We always look forward to seeing what they'll roll out next.

Jon Jaehnig
Jon Jaehnig is a freelance journalist with a special interest in emerging technologies. Jon has a degree in Scientific and Technical Communication from Michigan Technological University and lives in Michigan's Upper Peninsula. If you have a story suggestion for Jon, you may contact him here.