Experts have been warning that, as exciting as AI and the metaverse are, these emerging technologies may have negative effects if used improperly. However, the promise of these technologies seems easier to convey than the concerns. A new short film, titled PRIVACY LOST, is a theatrical exploration of some of those concerns.
To learn more, ARPost talked with the writer of PRIVACY LOST – CEO and Chief Scientist of Unanimous AI and a long-time emerging technology engineer and commentator, Dr. Louis Rosenberg.
Parents and their son sit in a restaurant. The parents are wearing slim AR glasses while the child plays on a tablet.
As the parents argue with one another, their glasses display readouts of the other’s emotional state. The husband is made aware when his wife is getting angry and the wife is made aware when her husband is lying.
A waiter appears, and the child puts down the tablet and puts on a pair of AR glasses. The actual waiter never appears on screen; instead, he appears to the husband as a pleasant-looking tropical server, to the wife as a fit surf-bro, and to the child as an animated stuffed bear.
Just as the husband and wife used emotional information about one another to try to navigate their argument, the waiter uses emotional information to sell menu items as effectively as possible, aided by 3D visual samples. The waiter takes drink orders and leaves. The couple resumes arguing.
PRIVACY LOST presents what could be a fairly typical scene in the near future. But, should it be?
“It’s short and clean and simple, which is exactly what we aimed for – a quick way to take the complex concept of AI-powered manipulation and make it easily digestible by anyone,” Rosenberg says of PRIVACY LOST.
Creating the Film
“I’ve been developing VR, AR, and AI for over 30 years because I am convinced they will make computing more natural and human,” said Rosenberg. “I’m also keenly aware that these technologies can be abused in very dangerous ways.”
For as long as Rosenberg has been developing these technologies, he has been warning about their potential societal ramifications. However, for much of that career, people viewed his concerns as largely theoretical. As first the metaverse and then AI developed and attained their moments in the media, Rosenberg's concerns have taken on a new urgency.
“ChatGPT happened and suddenly these risks no longer seemed theoretical,” said Rosenberg. “Almost immediately, I got flooded by interest from policymakers and regulators who wanted to better understand the potential for AI-powered manipulation in the metaverse.”
Rosenberg reached out to the Responsible Metaverse Alliance. With support from them, the XR Guild, and XRSI, Rosenberg wrote a script for PRIVACY LOST, which was produced with help from Minderoo Pictures and HeadQ Production & Post.
“The goal of the video, first and foremost, is to educate and motivate policymakers and regulators about the manipulative dangers that will emerge as AI technologies are unleashed in immersive environments,” said Rosenberg. “At the same time, the video aims to get the public thinking about these issues because it’s the public that motivates policymakers.”
Finding Middle Ground
While Rosenberg is far from the only person calling for regulation in emerging tech, many still see the concept as problematic.
“Some people think regulation is a dirty word that will hurt the industry. I see it the opposite way,” said Rosenberg. “The one thing that would hurt the industry most of all is if the public loses trust. If regulation makes people feel safe in virtual and augmented worlds, the industry will grow.”
The idea behind PRIVACY LOST isn't to prevent the development of any of the technologies shown in the video – most of which already exist, even if they don't yet work together or to the exact ends displayed in the cautionary vignette. These technologies, like any technology, have the capacity to be useful but could also be used and abused for profit, or worse.
For example, sensors that could be used to determine emotion are already used in fitness apps to allow for more expressive avatars. If this data is communicated to other devices, it could enable the kinds of manipulative behavior shown in PRIVACY LOST. If it is stored and studied over time, it could be used at even greater scales and potentially for more dangerous uses.
“We need to allow for real-time emotional tracking, to make the metaverse more human, but ban the storage and profiling of emotional data, to protect against powerful forms of manipulation,” said Rosenberg. “It’s about finding a smart middle ground and it’s totally doable.”
The Pace of Regulation
Governments around the world respond to emerging technologies in different ways and at different paces, according to Rosenberg. However, across the board, policymakers tend to be “receptive but realistic, which generally means slow.” That’s not for lack of interest or effort – after all, the production of PRIVACY LOST was prompted by policymaker interest in these technologies.
“I’ve been impressed with the momentum in the EU and Australia to push regulation forward, and I am seeing genuine efforts in the US as well,” said Rosenberg. “I believe governments are finally taking these issues very seriously.”
The Fear of (Un)Regulated Tech
Depending on how you view the government, regulation can seem scary. In the case of technology, however, it never seems as scary as no regulation at all. PRIVACY LOST isn't an exploration of a world where a controlling government prevents technological progress; it's a view of a world where people are controlled by technology gone bad. And it doesn't have to be that way.