
Apple insists developers ramp up their privacy commitments


Developers must now explain what data they collect and what they do with it, and take responsibility for how it is treated if it is sold on.
Apple recently told the U.S. Congress that it sees customer privacy as a “human right”, though that explanation didn’t at the time extend to how third-party developers treat the data they get from iOS apps. Now it does.
Starting October 3, Apple will insist that all third-party apps (including new apps and app updates) submitted to the App Store include a link to the app developer’s own privacy policy.
This is a big change: until now, only subscription-based apps needed to supply this information. It also extends to the privacy policy itself, which Apple insists must be clear and explicit in explaining what data the app collects, how it collects it, and how that data is used.
The policy must also promise that any third party with which that data is shared will provide the same or equal protection of user data as described in the developer’s privacy policy.
Apple’s guidelines state that developers must:
“Confirm that any third party with whom an app shares user data (in compliance with these Guidelines) — such as analytics tools, advertising networks and third party SDKs, as well as any parent, subsidiary or other related entities that will have access to user data — will provide the same or equal protection of user data as stated in the app’s privacy policy and required by these Guidelines.”
This means Apple is insisting developers make verifiable and actionable promises about the data they gather, what they do with that information, and who they sell that data to.
This is a massively important step, and I predict that some apps – potentially including some relatively popular ones – may find themselves unable to make these promises once they take legal advice.
(I’m not pointing any fingers, but any app developer whose primary business is selling people’s data to those vast unregulated data warehousing firms may find these steps make things a little trickier.)
In technology, the weakest link in security is always the user, which is why criminals target users with sophisticated and personalised scams.
The same is true of marketing – do you think billions would be spent on advertising if it didn’t make a difference? In the online age, marketing has become increasingly sophisticated and personalised, to the point where (as the scandalous behaviour of Cambridge Analytica and others of that ilk shows) those marketing techniques may have crossed the line into the criminal.
“We’ve never believed that these detailed profiles of people, that have incredibly deep personal information that is patched together from several sources, should exist,” Apple CEO Tim Cook recently said.
They can be “abused against our democracy,” he observed.
Apple has always insisted that it values user privacy, but critics (often the same critics who have never concerned themselves with privacy when other firms have exploited it for profit) have pointed out that just because Apple protects privacy doesn’t mean third-party developers do too.
Apps that harvest your private data may lack Apple’s commitment to user privacy, they argue, before insisting Apple should do something about it.
Apple already collects much less data about customers than any other big tech firm, and that which it does collect tends to be heavily anonymized.
“The truth is, we could make a ton of money if we monetized our customer…. We’ve elected not to do that. We’re not going to traffic in your personal life. Privacy to us is a human right, a civil liberty,” Tim Cook has said.
Apple is now moving to extend those privacy protections a little, and in recent months has taken several steps to prevent the abuse of personal data for marketing and other forms of social engineering.
Apple’s new insistence that its developers also work to protect end-user privacy may have profound consequences, as it implies the company will now monitor and pursue developers who do not meet the commitments they make.
This will pose some inconvenience for companies whose business is built on gathering and exploiting user data, of course, but it also creates an even bigger divide between the platforms that care about personal privacy and those that don’t.
Who knows how this could impact future platform choice?
