
Apple confirms existing iCloud Photos will be scanned for child abuse

Apple may only begin scanning iCloud Photos libraries for potential child abuse images later in 2021, the company has confirmed, though the controversial system won’t be limited to new uploads. Announced last week, the upcoming feature will rely on AI to automatically flag possible child sexual abuse material (CSAM), a move that has left some privacy advocates concerned.

Part of the controversy stemmed from Apple’s decision to announce two child-protection-focused launches at the same time. In addition to the iCloud Photos scanning system, Apple will also offer parents the ability to have potentially offensive images blurred automatically in their children’s Messages conversations.

The scanning and recognition will take place on the phone itself, in a process that seems to have been misunderstood in some quarters. For the iCloud Photos CSAM scanning, Apple will use unreadable hashes – strings of numbers representing known CSAM images – and compare them against images that a user chooses to upload to the cloud gallery service.
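The basic idea of hash matching can be illustrated with a minimal sketch. This is not Apple's implementation: the real system reportedly uses a perceptual "NeuralHash" and blinded, server-assisted matching rather than a plain cryptographic digest, and the hash set, the `digest` and `shouldFlagForReview` functions, and the example data below are all hypothetical placeholders used only to show how comparing hashes avoids inspecting image contents directly.

```swift
import Foundation
import CryptoKit

// Illustrative placeholder database of hashes for known images (made-up value).
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Hash the image bytes so the comparison works on digests, not the image itself.
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Flag an upload only when its hash matches an entry in the known set.
func shouldFlagForReview(_ imageData: Data) -> Bool {
    knownHashes.contains(digest(of: imageData))
}

// Example: arbitrary bytes standing in for an image a user chose to upload.
let upload = Data("example image bytes".utf8)
print(shouldFlagForReview(upload))  // false unless the digest is in the known set
```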
