Apple next week will debut tools to let two iPhone users share augmented reality while limiting the personal data sent to its servers, two people familiar with the matter said this week.
Augmented reality (AR) allows viewers to see virtual structures superimposed on their surroundings via their smartphones or other devices. It is the technology used in mobile game Pokemon Go, and by industry, such as factories seeking to map new assembly lines. Apple and rival Google are racing to release AR tools to attract software developers to their platforms.
Both are seeking to allow two people to share data so they can see the same virtual object in the same space via their individual devices. But that has sparked privacy concerns — if AR apps become commonplace, people will be scanning their homes and other personal spaces routinely, developers say.
Apple designed its two-player system to work phone-to-phone in part because of those privacy concerns, one of the people familiar with the matter said. The approach, which has not been previously reported, differs from Google’s, which requires scans of a player’s environment to be sent to, and stored in, the cloud.
Apple declined to comment. Bloomberg previously reported that Apple would announce multiplayer AR at its developer conference, which begins on Monday.
AR has become a major focus at both companies. Apple CEO Tim Cook has called it "big and profound," and the company released its first tools to let software developers make AR apps last year.
With that release, Apple made AR possible on many phones without any modifications. The move spurred Google to abandon an AR effort that required phones to have special sensors and instead build tools for AR on conventional phones.
The race between the two has heated up since then. At its own developer conference in May, Google rolled out tools for making multiplayer AR games. The system, called Cloud Anchors, requires the first player to scan his or her environment and then upload the raw mapping data to Google’s servers, where it is translated into a rough representation of the area.
Subsequent players perform a scan that sends more limited information to the same server, which matches the phones up and lets them each see the same virtual object in the same physical space.
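Google has documented this host-and-resolve pattern in its ARCore SDK for Android. The following is a minimal Kotlin sketch of the flow described above, using the publicly documented Cloud Anchors calls; session setup, per-frame state polling, and the channel the app uses to pass the anchor ID between players are left out and would need to be supplied by the developer.

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Config
import com.google.ar.core.Session

// Enable cloud anchor support on an existing ARCore session.
fun enableCloudAnchors(session: Session) {
    val config = Config(session)
    config.cloudAnchorMode = Config.CloudAnchorMode.ENABLED
    session.configure(config)
}

// Host: the first player places a local anchor; ARCore uploads the surrounding
// mapping data to Google's servers for processing.
fun hostAnchor(session: Session, localAnchor: Anchor): Anchor {
    // Poll the returned anchor's cloudAnchorState each frame; once it reaches
    // CloudAnchorState.SUCCESS, its cloudAnchorId can be shared with other players
    // over whatever networking layer the app provides.
    return session.hostCloudAnchor(localAnchor)
}

// Join: subsequent players scan their surroundings and resolve the shared ID;
// the server matches their scan against the hosted map so both phones agree
// on the same pose for the virtual object.
fun resolveAnchor(session: Session, cloudAnchorId: String): Anchor {
    return session.resolveCloudAnchor(cloudAnchorId)
}
```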
However, Apple’s system avoids storing any raw mapping scans of a user’s surroundings in the cloud, said the two sources. Google says it will discard raw mapping data after seven days.
The precise details of how Apple's new system will work, or whether it will support three or more players, were not available. But a phone-to-phone approach could eventually run into technical limitations. It could end up being harder to handle three or more players at a time if a player who started the game drops out, a person involved in Google's AR efforts said.
"For artistic purposes, it's phenomenal" to be able to precisely map out a user's surroundings and overlay digital objects, said Joel Ogden, chief executive of Construct Studio, which makes augmented and virtual reality games.
"But we're definitely going into some uncharted territory. There are a lot of really severe privacy implications we haven't really explored yet," he added.