
Silicon Valley is getting emotional


Technology like the iPhone X’s new camera system and Face ID will increasingly figure out how you feel, almost all the time.
Apple’s shiny new iPhone X smartphone became available for pre-order on Friday.
Packed with bells and whistles and dominating the field in speeds and feeds, Apple’s hotly anticipated iPhone X will be considered by some to be the world’s greatest phone.
The technology in the iPhone X includes some unusual electronics. The front-facing camera is part of a complex bundle of hardware components unprecedented in a smartphone. (Apple calls the bundle its TrueDepth camera.)
The top-front imaging bundle on the iPhone X has some weird electronics, including an infrared projector and an infrared camera.
The iPhone X has a built-in, front-facing projector. It projects 30,000 dots of light in the invisible infrared spectrum. The component has a second camera, too, which takes pictures of the infrared dots to see where they land in 3D space. (This is basically how Microsoft’s Kinect for Xbox works. Apple bought one company behind Kinect tech years ago. Microsoft discontinued Kinect this week.)
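For third-party developers, the depth data that dot pattern produces is not off-limits: AVFoundation exposes TrueDepth frames to apps. Below is a minimal Swift sketch of streaming them; the AVFoundation types are standard, but the class name is illustrative, permissions and error handling are omitted, and this is the depth map Apple makes available to apps, not Apple’s own Face ID pipeline.

```swift
import AVFoundation

// Sketch: streaming depth maps from the front-facing TrueDepth camera via AVFoundation.
// Illustrative only; permission prompts, error handling and UI are omitted.
final class DepthStreamer: NSObject, AVCaptureDepthDataOutputDelegate {
    private let session = AVCaptureSession()
    private let depthOutput = AVCaptureDepthDataOutput()

    func start() {
        guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera, for: .video, position: .front),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input), session.canAddOutput(depthOutput) else { return }
        session.beginConfiguration()
        session.addInput(input)
        session.addOutput(depthOutput)
        session.commitConfiguration()
        depthOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth"))
        session.startRunning()
    }

    // Called for each depth frame derived from the infrared dot pattern.
    func depthDataOutput(_ output: AVCaptureDepthDataOutput, didOutput depthData: AVDepthData,
                         timestamp: CMTime, connection: AVCaptureConnection) {
        let map = depthData.depthDataMap // CVPixelBuffer of per-pixel distances
        print("Got depth frame:", CVPixelBufferGetWidth(map), "x", CVPixelBufferGetHeight(map))
    }
}
```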
Out of the box, this Kinect-like component powers Apple’s Face ID security system, which replaces the fingerprint-centric Touch ID of recent iPhones, including the iPhone 8.
A second use is Apple’s Animoji feature, which enables avatars that mimic the user’s facial expressions in real time.
Some iPhone fans believe these features are revolutionary. But the real revolution is emotion detection, which will eventually affect all user-facing technologies in business enterprises, as well as in medicine, government, the military and other fields.
Think of Animoji as a kind of proof-of-concept app for what’s possible when developers combine Apple’s infrared face tracking and 3D sensing with Apple’s augmented reality developer kit, ARKit.
Animoji’s cuddly cartoon avatars smile, frown and purse their lips every time the user does.
Apple is not granting developers access to security-related Face ID data, which is stored beyond reach in the iPhone X’s Secure Enclave. But it is allowing all comers to capture millisecond-by-millisecond changes in users’ facial expressions.
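Animoji is driven by the same face-tracking data ARKit hands to any app as per-frame “blend shape” coefficients. Here is a minimal Swift sketch of reading them; ARKit’s types and keys are real, but the class name and what is done with the values are illustrative.

```swift
import ARKit

// Minimal sketch: reading per-frame facial-expression coefficients from ARKit.
// Requires an iPhone X-class device with the TrueDepth camera.
final class FaceExpressionReader: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return } // TrueDepth hardware only
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called many times per second as the face is tracked.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // blendShapes maps named facial movements to values from 0.0 (neutral) to 1.0 (fully expressed).
        let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
        let browFurrow = face.blendShapes[.browDownLeft]?.floatValue ?? 0
        print("smile: \(smile)  brow furrow: \(browFurrow)")
    }
}
```

Those coefficients update many times per second, which is exactly the kind of fine-grained signal emotion-detection software can mine.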
Facial expressions, of course, convey user mood, reaction, state of mind and emotion.
It’s worth pointing out that Apple last year acquired a company called Emotient, which developed artificial intelligence technology for tracking emotions using facial expressions.
My colleague Jonny Evans points out that Emotient technology plus the iPhone X’s face tracking could make Siri a much better assistant, and enable richer social experiences inside augmented reality apps.
As with other technologies, Apple may prove instrumental in mainstreaming emotion detection. But the movement toward this kind of technology is irresistible and industrywide.
Think about how much effort is expended on trying to figure out how people feel about things. Facebook and Twitter analyze “Like” and “Heart” buttons. Facebook even rolled out other emotion choices, called “reactions”: “Love,” “Haha,” “Wow,” “Sad” and “Angry.”
Google tracks everything users do on Google Search in an effort to divine results relevance — which is to say which link results users like, love, want or have no use for.
Amazon looks at purchase activity, repeat purchases and wish lists and, like Google with Google Search, tracks user activity on Amazon.com to find out how customers feel about various suggested products.
Companies and research firms and other organizations conduct surveys. Ad agencies do eye-tracking studies. Publishers and other content creators conduct focus groups. Nielsen uses statistical sampling to figure out how TV viewers feel about TV shows.
All this activity underlies decision-making in business, government and academia.
But existing methods for gauging the public’s affinity are about to be blown away by the availability of high-fidelity emotion detection now being built into devices of all kinds — from smartphones and laptops to cars and industrial equipment.
Instead of focusing on how people in general feel about something, smartphone-based emotion detection will focus on how each individual user feels, and in turn will react with equivalent personalization.
Researchers have been working to crack the emotion-detection nut for decades. The biggest change now is the application of A.I., which will bring high-quality sentiment analysis to the written word, along with processing of speech that weighs both vocal intonation and word choice to gauge how the speaker is feeling at every moment.
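As a toy illustration of what text-side sentiment analysis does (and only an illustration; real systems use trained models rather than hand-built word lists, and this is not any vendor’s method), a naive lexicon-based scorer fits in a few lines of Swift:

```swift
import Foundation

// Toy lexicon-based sentiment scorer: illustrative only, not a production technique.
let lexicon: [String: Double] = ["love": 1.0, "great": 0.8, "fine": 0.2,
                                 "meh": -0.3, "awful": -0.9, "hate": -1.0]

func sentimentScore(of text: String) -> Double {
    let words = text.lowercased()
        .components(separatedBy: CharacterSet.alphanumerics.inverted)
        .filter { !$0.isEmpty }
    let hits = words.compactMap { lexicon[$0] }        // keep only opinionated words
    guard !hits.isEmpty else { return 0 }              // nothing to judge
    return hits.reduce(0, +) / Double(hits.count)      // average polarity in [-1, 1]
}

print(sentimentScore(of: "I love this phone, but the price is awful")) // ≈ 0.05
```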
Most importantly, A.I. will detect not only broad, bold facial expressions like dazzling smiles and pouty frowns, but also “subliminal facial expressions” that humans can’t detect, according to a startup called Human. Your poker face is no match for A.I.
A huge number of smaller companies, including Nviso, Kairos, SkyBiometry, Affectiva, Sighthound, EmoVu, Noldus, Beyond Verbal and Sightcorp, are creating APIs that let developers build emotion detection and tracking into their own apps.
Research projects are making breakthroughs. MIT even built an A.I. emotion-detection system that runs on a smartwatch.
Numerous patents by Facebook, as well as acquisitions by Facebook of companies such as FacioMetrics last year, portend a post-“Like” world, in which Facebook is constantly measuring how billions of Facebook users feel about every word they read and type, every picture they scan and every video that autoplays on their feeds.
Automatic mood detection will no doubt replace the current “Like” and “reactions” system, and do the job better.
Right now, Facebook’s “Like” system has two major flaws. First, most people don’t “engage” with posts most of the time. Second, because sentiment is both conscious and public, it’s a kind of “performance” rather than a true reflection of how users feel. Some “Likes” happen not because the user actually likes something, but because she wants others to believe she likes it. That doesn’t help Facebook’s algorithms nearly as much as face-based emotion detection that tells the company how every user really feels about every post, every time.
Today, Facebook is the gold standard in ad targeting. Advertisers can specify the exact audience for their ads. But it’s all based on stated preferences and actions on Facebook. Imagine how targeted things will become when advertisers have access to a history of users’ facial reactions to huge quantities of posts and content. They’ll know what you like better than you do. It will be an enormous benefit to advertisers. (And, of course, advertisers will get fast feedback on the emotional reactions to their ads.)
Silicon Valley has a problem.
