
Apple says its new child safety feature will look for images flagged in multiple countries


Apple has addressed privacy concerns about its child sexual abuse material (CSAM) scanning by clarifying that the new feature will only flag accounts with at least 30 iCloud photos matching known CSAM.
Apple recently introduced a child safety feature that scans users' iCloud Photos for images of child sexual abuse on behalf of governments. The decision drew severe criticism from privacy advocates. Addressing that criticism, Reuters reports that the company has clarified it will only act on images that have "been flagged by clearinghouses in multiple countries".
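The two safeguards Apple describes, a 30-match threshold and the requirement that a hash appear in clearinghouse lists from multiple countries, can be sketched as follows. This is a hypothetical illustration, not Apple's actual implementation; the function names and the use of plain string hashes are assumptions for clarity.

```python
# Hypothetical sketch of Apple's stated safeguards (not the real system):
# a hash only counts if it appears in lists from at least two countries,
# and an account is flagged only after THRESHOLD or more such matches.

THRESHOLD = 30  # per Apple's stated threshold

def flaggable_hashes(clearinghouse_lists):
    """Return hashes present in lists from two or more clearinghouses."""
    counts = {}
    for hashes in clearinghouse_lists:
        for h in set(hashes):  # de-duplicate within one country's list
            counts[h] = counts.get(h, 0) + 1
    return {h for h, c in counts.items() if c >= 2}

def account_flagged(photo_hashes, clearinghouse_lists, threshold=THRESHOLD):
    """True only when enough photos match multi-country hashes."""
    matchable = flaggable_hashes(clearinghouse_lists)
    matches = sum(1 for h in photo_hashes if h in matchable)
    return matches >= threshold
```

Under this scheme, a hash submitted by only one country's clearinghouse can never trigger a flag, however many times it matches.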
