Tuesday, April 7, saw VIVE’s second weekly developer live stream, “Build for Tomorrow – VIVE Hand Tracking SDK.”
The talk, presented by HTC’s senior developer Dario Laverde, focused on how developers can integrate hand tracking into their applications. It also hinted at developments to come.
Hand Tracking in VR Technology – It’s Come a Long Way
Laverde’s talk began with a brief history of hand tracking at VIVE, which, according to Laverde, began in 2015. During demos, people new to VR technology would repeatedly put both controllers into one hand so that they could reach out with a bare hand and try to touch digital artefacts.
“Many users who were completely new to VR were so immersed that they achieved a level of presence that made a few of them place both controllers in one hand and with a bare hand they would try to reach out to interact with the virtual objects,” said Laverde. “There was this natural inclination to use their own hands to participate in this new magical experience. Well now, as of last year, we can do that. We can do hand tracking.”
Affordances and Drawbacks
Developers incorporate hand tracking into their VR applications through a collection of SDKs collectively known as VIVE Sense. The VR technology behind these SDKs allows eye and lip tracking, pass-through cameras, and hand tracking.
VIVE hand tracking has a number of features, though which features a developer uses depends on their target platform and hardware.
For example, most HTC VR technology supports 3D hand tracking. However, this requires stereo cameras, which are lacking in the original VIVE and in smartphones.
Further, as hand tracking specs go up, the framerate goes down. So developers should consider the processing power of the hardware they want their projects to run on.
“CPU, GPU, lighting… those are all more important factors than the actual cameras themselves,” said Laverde.
Laverde also pointed out that while VIVE’s hand tracking SDK works with a number of common platforms including Unity and Unreal, there can be complications on different hardware. Camera access in particular can be an issue, especially on smartphones.
Tips on Incorporating Hand Tracking
How and when to incorporate hand tracking isn’t only a hardware and software question. It’s also a user interface question. One should always ask whether one actually needs hand tracking at all.
“Hand tracking is not really meant to necessarily replace controllers,” said Laverde. “There are still use cases where a controller is better and vice versa.”
Laverde also cautioned developers against going overboard with hand controls and making user interfaces that were too complicated.
“Use natural interactions. […] Make sure you don’t need instructions. You want to reuse controller codes for as much as possible,” said Laverde.
This echoed some of the advice given by VIVE’s Bjorn Book-Larsson in last week’s talk on developing for VIVEPORT.
“The more you add to a user interface where you have to work harder and harder, the more likely it is that that user is going to be less and less likely to return,” Book-Larsson had said. “So there’s a fine tradeoff between trying to make an interesting control system and then not make it so invigorating or so complicated or so difficult to use that basically it’s too strenuous.”
Laverde’s advice on keeping controller codes in hand tracking also had to do with a future goal of his.
“Hopefully, we’ll have [an] SDK that can handle both controllers and hand tracking,” said Laverde. “Consider fallback solutions: If your controller is running out of battery you could switch to hand tracking within play.”
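The fallback idea Laverde describes, and his advice to reuse controller code, can be sketched as a single input abstraction that UI code consumes without caring which device produced the event. A minimal sketch follows; every name in it is hypothetical (the VIVE SDKs expose their own APIs), and it only illustrates the pattern of switching input sources mid-session when a controller becomes unavailable.

```python
from dataclasses import dataclass

# Hypothetical input abstraction — none of these names come from the
# VIVE SDKs. UI code consumes generic InputEvents, so the hand-tracking
# backend reuses the same code paths as the controllers.

@dataclass
class InputEvent:
    kind: str      # e.g. "select", "grab"
    source: str    # "controller" or "hands"

class ControllerSource:
    def __init__(self, battery: float = 1.0):
        self.battery = battery          # 0.0 (empty) to 1.0 (full)

    def available(self) -> bool:
        # Treat a nearly dead battery as unavailable.
        return self.battery > 0.05

    def poll(self) -> list[InputEvent]:
        return [InputEvent("select", "controller")]

class HandTrackingSource:
    def available(self) -> bool:
        return True                     # the cameras are always on in this sketch

    def poll(self) -> list[InputEvent]:
        return [InputEvent("select", "hands")]

class InputManager:
    """Prefers the controller, falls back to hand tracking mid-session."""
    def __init__(self, controller: ControllerSource, hands: HandTrackingSource):
        self.controller, self.hands = controller, hands

    def poll(self) -> list[InputEvent]:
        active = self.controller if self.controller.available() else self.hands
        return active.poll()

mgr = InputManager(ControllerSource(battery=0.5), HandTrackingSource())
print(mgr.poll()[0].source)    # → controller (still has charge)

mgr.controller.battery = 0.01  # battery runs out during play
print(mgr.poll()[0].source)    # → hands (seamless fallback)
```

Because the rest of the application only ever sees `InputEvent`s, the switch is invisible to the UI layer, which is exactly the kind of code reuse Laverde recommends.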
While Laverde brought this dream up at least twice in passing over the course of his talk, it didn’t come up in the roadmap section of his talk. So, as cool as it would be, it probably isn’t coming any time soon.
VIVE’s Hand Tracking Roadmap
While VIVE’s Hand Tracking SDK is exciting, it’s actually fairly limited. However, Laverde spoke of a number of improvements coming down the road.
Several of these are user interface improvements, including pinch support and custom gestures. Right now the VR technology behind the SDK allows only a few gestures plus raycasting. However, Laverde used this as another opportunity to warn against over-complicated interfaces.
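Pinch detection of the kind the roadmap promises is conceptually simple once an SDK exposes per-joint hand positions. Here is a minimal sketch assuming hypothetical thumb-tip and index-tip coordinates in meters; the real SDK’s data structures and threshold values will differ.

```python
import math

def distance(a, b):
    """Euclidean distance between two 3-D points (meters)."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

# 2 cm — a tunable assumption for this sketch, not an SDK constant.
PINCH_THRESHOLD = 0.02

def is_pinching(thumb_tip, index_tip) -> bool:
    """True when the fingertips are close enough to count as a pinch."""
    return distance(thumb_tip, index_tip) < PINCH_THRESHOLD

# Fingertips 1 cm apart: a pinch.
print(is_pinching((0.10, 0.00, 0.30), (0.10, 0.01, 0.30)))  # → True
# Fingertips 8 cm apart: an open hand.
print(is_pinching((0.10, 0.00, 0.30), (0.10, 0.08, 0.30)))  # → False
```

In practice a gesture recognizer would also add hysteresis (a larger release threshold than the pinch threshold) so the gesture doesn’t flicker on and off when the fingertips hover near the boundary.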
Laverde also said that he is looking forward to seeing how the SDK works with faceplates for the VIVE Cosmos. The faceplates, announced late last month and not yet available, will allow more advanced external tracking.
Further, the VIVE Hand Tracking SDK will soon see support on Valve Index. Support for other third-party VR technology came up in a Q&A segment when users asked about hand tracking gloves.
“We don’t have a program or a plan for third-party hardware but that’s something we can certainly consider,” said Laverde.
Keep Coming Back
VIVE is one of the biggest names in virtual reality and is usually a major player at the annual Game Developers Conference. However, with GDC canceled due to coronavirus, a number of organizations have opted to post their GDC sessions online.
Last week, VIVE’s first session focused on developing content specifically for their subscription service, VIVEPORT. Next week the talk will focus on “Working Remotely in VR using Vive Sync.”
To register for that session and see the future sessions in the series, visit the VIVE blog.
“I’m really excited about this webinar series,” said Laverde. “We have quite a number of really interesting ones coming up.”
We’ll also continue to offer ongoing coverage and commentary here at ARPost.