
How TrueDepth could make Apple's AR headset actually work


Apple is working on an augmented reality headset, and I would be very surprised if the TrueDepth camera wasn’t inside. Teams inside the Cupertino company are reportedly busy building a new rOS “reality operating system,” the custom system-on-a-package chipset to run it, and the hardware to go around them, all under the codename “T288.” Just as tricky as actually making an AR wearable that isn’t utterly dorky, though, is figuring out how wearers will actually interact with it.
According to Bloomberg, Apple hasn’t settled on exactly how users will interact with their AR headset. Touch panels, head gestures, and chatting with Siri are all apparently being considered. Still, I’m hoping they opt for a different solution: hand gesture recognition.
So far, we’ve seen what TrueDepth can do when it’s facing an iPhone X user. The camera array is essential for Face ID, the smartphone’s biometric security system, which Apple claims is significantly more resilient to being fooled than Touch ID ever was. Its face-tracking abilities also power Animoji, the animated emoji characters.
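Developers already get a taste of that tracking through ARKit. As a rough sketch only (the class and property names below are my own, not Apple's sample code, though the ARKit calls are the real API), face tracking on the iPhone X boils down to running an ARFaceTrackingConfiguration and reading the blend-shape values reported for each frame:

```swift
import ARKit
import SceneKit
import UIKit

// Minimal face-tracking sketch; names are illustrative, not Apple's sample code.
class FaceTrackingViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Face tracking needs the TrueDepth camera, so check support first.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Called every time ARKit updates its model of the user's face.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let face = anchor as? ARFaceAnchor else { return }
        // Blend shapes are normalized 0.0 to 1.0 values for dozens of facial movements.
        let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
        let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
        print("jawOpen: \(jawOpen), smileLeft: \(smile)")
    }
}
```

Those continuously updated blend-shape coefficients (jaw open, smile, brow raise and so on) are precisely the kind of fine-grained signal that makes Animoji possible.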
However, I’m more intrigued by what Apple could do if it flipped the TrueDepth camera around and focused it on what was going on in the opposite direction. In the case of an Apple wearable, as has been further rumored today, that would mean pointing TrueDepth out at the world in front of the wearer.
Gesture control for wearable computers isn’t new. All the way back in 2010, for instance, I tried Kopin’s Golden-i wearable PC, which could track basic hand movements for navigating through its menus. Of course, Kopin never intended the technology to be offered to consumers, focusing instead on the enterprise market, which would be less averse to its bulky design.
More recently, Meta has been doing gesture recognition too. Its wearable also promises to replace the PC, projecting interfaces into an augmented reality view of the world ahead. Microsoft’s HoloLens has a similar system. They’re clever, but it’s questionable whether they provide sufficient resolution down to the individual finger level; they’re also much larger than any consumer would probably want to wear.
Apple’s TrueDepth is, comparatively, tiny. It has to be, to fit into the top bezel of the iPhone X. However, it packs a big technological punch: the tracking system is capable of mapping tens of thousands of points on the human face, after all. Why not direct those points at someone’s fingers instead?
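The raw depth data is already exposed to developers, too. As a simplified sketch (setup only, with error handling and the delegate that actually consumes the frames omitted), streaming per-pixel depth from the TrueDepth camera with AVFoundation looks roughly like this:

```swift
import AVFoundation

// Sketch of configuring a capture session that streams depth frames
// from the TrueDepth camera. Setup only; frame handling is omitted.
func makeTrueDepthSession() -> AVCaptureSession? {
    let session = AVCaptureSession()
    session.beginConfiguration()

    // The TrueDepth array is exposed as a single front-facing capture device.
    guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                               for: .video,
                                               position: .front),
          let input = try? AVCaptureDeviceInput(device: device),
          session.canAddInput(input) else { return nil }
    session.addInput(input)

    // Depth frames arrive as AVDepthData: a per-pixel depth map of the scene.
    let depthOutput = AVCaptureDepthDataOutput()
    guard session.canAddOutput(depthOutput) else { return nil }
    session.addOutput(depthOutput)
    depthOutput.isFilteringEnabled = true   // temporally smooth the depth map

    session.commitConfiguration()
    return session
}

// A delegate conforming to AVCaptureDepthDataOutputDelegate would then receive
// those depth maps frame by frame once the session starts running.
```

Today that depth map describes a face a few tens of centimeters away; pointed outward from a headset, the same sort of data could just as easily describe a pair of hands.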
Hand gestures have some big advantages over the other systems Apple is said to be considering. Touch panels require either a separate interface device – mounted on the wrist, perhaps, or built into an Apple Watch – or reaching up to touch the side of the glasses themselves. Neither approach is especially elegant or discreet.
Head gestures, similarly, have the potential to embarrass. Tilting your head around in public is a shortcut to becoming the center of attention; maybe, as with Bluetooth headsets and the sight of people apparently walking around talking to themselves, we’d get used to it over time. Even so, there’s a limit to how granular your control could be with head gestures.
Finally, there’s talking to Siri. That’s possibly the most intrusive interface of all, unless you could train people to subvocalize. If not, people walking around chatting to their AR headsets could be even more distracting than having them hold conversations on the phone nearby.
In contrast, finger tracking is much more subtle. It’s quieter, offers far more variations in input than head gestures, and could be done surreptitiously, since you’d just look like you were flexing your fingers. With an AR display, a virtual keyboard could be overlaid onto the real world, on which you could virtually type. You could twiddle virtual knobs and switches and see real-time feedback, or simply recreate the sort of gestures the iPhone X uses with iOS 11, only on a huge display that’s invisible to those around you.
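Apple hasn’t published any hand-tracking API for a headset, so this is purely hypothetical, but once a depth sensor can report fingertip positions in 3D, turning them into input events is fairly simple. A minimal sketch, assuming a made-up HandPose type fed by such a tracker, might treat a thumb-and-index pinch as the AR equivalent of a tap:

```swift
import simd

// Purely hypothetical: HandPose is a made-up type standing in for whatever a
// depth-camera hand tracker might report; positions are in metres from the sensor.
struct HandPose {
    var thumbTip: SIMD3<Float>
    var indexTip: SIMD3<Float>
}

/// Treat thumb and index fingertips closing to within ~1.5 cm as a "pinch",
/// the AR equivalent of a tap or click on a virtual control.
func isPinching(_ pose: HandPose, threshold: Float = 0.015) -> Bool {
    return simd_distance(pose.thumbTip, pose.indexTip) < threshold
}

// Example: fingertips roughly 5 mm apart, about 30 cm in front of the sensor.
let pose = HandPose(thumbTip: SIMD3<Float>(0.000, 0.00, 0.30),
                    indexTip: SIMD3<Float>(0.005, 0.00, 0.30))
print(isPinching(pose))   // true
```

Richer gestures, such as dial turns, swipes, or typing on that virtual keyboard, would just be more elaborate versions of the same idea: geometry on tracked fingertips, updated every frame.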
Apple isn’t alone in considering virtual controls like this. One of the Google ATAP projects that proved most interesting over the past few years has been Soli, which uses a tiny radar sensor to translate finger movements, like turning a dial or sliding a control, into ways of interacting with software. Google said at the time that it envisaged the tech ending up in smartwatches or other small devices, where the available hardware space was insufficient for physical controls.
Where TrueDepth might have the advantage, though, is that unlike Google’s Soli it could sit further away from the hands, mounted up on the bridge of the AR glasses perhaps. It’s also a camera in its own right, which would be instrumental in running ARKit apps. And its place in the iPhone X will give Apple’s engineers plenty of experience trimming power consumption, a vital factor if any future wearable device is to deliver sufficient battery life without being unduly bulky.
My prediction, therefore, would be some combination of interaction options for Apple’s upcoming AR glasses. For the basics, giving Siri instructions might be enough. For anything more complex, or at those times when you don’t want to draw attention to yourself, a few finger-flicks would suffice. If the rumors are true, we won’t have long to wait to find out: Apple is said to be readying its glasses for a potential 2020 launch.
