
Apple reveals more details about its child safety photo scanning technologies


Apple has provided more details about its child safety photo scanning technologies, which have been drawing fire from critics. It has also described the end-to-end flow of its review process.
Apple has been the target of criticism since it revealed that it will be introducing child safety features into its ecosystem that allow scanning for Child Sexual Abuse Material (CSAM). An open letter demanding that Apple halt the deployment of this technology already has thousands of signatories. The firm had internally acknowledged that some people are worried about the new features, but said that this is due to misunderstandings that it will be addressing in due course. Today, it has made good on that promise.

In a six-page FAQ document that you can view here, Apple emphasizes that its photo scanning technology is split into two distinct use cases. The first has to do with detecting sexually explicit photos sent or received via the Messages app by children 12 years of age or younger. This capability uses on-device machine learning to automatically blur problematic images, inform children that they do not have to view the content, offer them guidance, and notify their parents if they still opt to view such images.
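Apple's FAQ describes a sequence of steps rather than an API, but the flow is concrete enough to sketch. Below is a minimal, hypothetical Swift illustration of that sequence: classify on device, blur if flagged, and notify parents only if a child aged 12 or under views the image anyway. Every type and function name here is invented for illustration, since Apple has not published a public interface for this feature.

```swift
import Foundation

// Hypothetical sketch of the Messages flow described above. All names are
// invented for illustration; Apple has not published an API for this feature.

enum ImageVerdict {
    case safe
    case sexuallyExplicit
}

protocol OnDeviceClassifier {
    // Runs entirely on the device; image data never leaves it.
    func classify(_ imageData: Data) -> ImageVerdict
}

struct ChildAccount {
    let age: Int
    // Per the FAQ, parents are notified only for children aged 12 or under.
    var parentNotificationsEnabled: Bool { age <= 12 }
}

struct MessagesSafetyFlow {
    let classifier: OnDeviceClassifier
    let account: ChildAccount

    // Step 1: decide whether an incoming or outgoing image should be blurred.
    func shouldBlur(_ imageData: Data) -> Bool {
        classifier.classify(imageData) == .sexuallyExplicit
    }

    // Step 2: called only after the child taps through the warning screen.
    func childChoseToView() {
        if account.parentNotificationsEnabled {
            // Placeholder: the real feature routes this through Family Sharing.
            print("Parents notified that a flagged image was viewed.")
        }
    }
}

// Stub classifier and usage, for illustration only.
struct AlwaysFlagClassifier: OnDeviceClassifier {
    func classify(_ imageData: Data) -> ImageVerdict { .sexuallyExplicit }
}

let flow = MessagesSafetyFlow(classifier: AlwaysFlagClassifier(),
                              account: ChildAccount(age: 11))
if flow.shouldBlur(Data()) {
    // The UI would blur the image and show guidance here.
    flow.childChoseToView()
}
```

Note that in this sketch the classification verdict never leaves the device; only the child's explicit choice to view a flagged image triggers the parental notification, which matches the opt-in order of operations Apple describes.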
