Apple has further detailed that its child safety mechanism will require at least 30 photos matching known Child Sexual Abuse Material (CSAM) identified by organisations in at least two countries before an account is flagged for human review.
After a week of criticism over its planned new system for detecting images of child sex abuse, Apple said on Friday that it will hunt only for pictures that have been flagged by clearinghouses in multiple countries.

That shift, and others intended to reassure privacy advocates, was detailed to reporters in an unprecedented fourth background briefing since the initial announcement, eight days earlier, of a plan to monitor customer devices.

Having previously declined to say how many matched images on a phone or computer it would take before the operating system notifies Apple for a human review and possible reporting to authorities, executives said on Friday that the threshold would start at 30, though it could be lowered over time as the system improves.

Apple also said it would be easy for researchers to verify that the list of image identifiers being sought on one iPhone is the same as the list on every other phone, seeking to blunt concerns that the new mechanism could be used to target individuals.
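As a rough illustration of the two safeguards described in the briefing, the sketch below shows the intersection rule (a hash enters the on-device list only if clearinghouses in at least two countries supply it) and the 30-match threshold for escalation. All names and types here are hypothetical; this is not Apple's actual NeuralHash or private set intersection implementation, only the selection and threshold logic as reported.

```swift
import Foundation

typealias ImageHash = String

/// Hypothetical: only hashes contributed independently by clearinghouses in
/// at least two countries are included in the on-device identifier list.
func buildOnDeviceList(clearinghouseLists: [String: Set<ImageHash>]) -> Set<ImageHash> {
    var countriesPerHash: [ImageHash: Int] = [:]
    for (_, hashes) in clearinghouseLists {        // keyed by country code
        for hash in hashes {
            countriesPerHash[hash, default: 0] += 1
        }
    }
    return Set(countriesPerHash.filter { $0.value >= 2 }.keys)
}

/// Hypothetical: an account is surfaced for human review only once the
/// number of matched photos reaches the threshold (initially 30, per the briefing).
func shouldEscalateForReview(matchedPhotoCount: Int, threshold: Int = 30) -> Bool {
    return matchedPhotoCount >= threshold
}
```

The same-list-on-every-phone claim would amount to each device shipping an identical, auditable version of the output of something like `buildOnDeviceList`, so researchers could confirm no device receives a customised set of identifiers.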