Apple confirms existing iCloud Photos will be scanned for child abuse

Apple may only begin scanning iCloud Photos libraries for potential child abuse imagery later in 2021, the company has confirmed, but the controversial system won’t be limited to new uploads. Announced last week, the upcoming feature will rely on AI to automatically flag possible child sexual abuse material (CSAM), a move that has left some privacy advocates concerned.

Part of the controversy came from Apple’s decision to announce two child-protection launches at the same time. In addition to the iCloud Photos scanning system, Apple will also offer parents the ability to have potentially offensive images blurred automatically in their children’s Messages conversations. The scanning and recognition take place on the phone itself, in a process that seems to have been misunderstood in some quarters.

For the iCloud Photos CSAM scanning, Apple will use unreadable hashes – strings of numbers representing known CSAM images – to compare against the images a user chooses to upload to the cloud gallery service. “This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations,” Apple explained in a new FAQ about the system. “Using new applications of cryptography, Apple is able to use these hashes to learn only about iCloud Photos accounts that are storing collections of photos that match to these known CSAM images, and is then only able to learn about photos that are known CSAM, without learning about or seeing any other photos.”

According to Apple, the system won’t go into operation until later this year, when iOS 15, iPadOS 15, watchOS 8, and macOS Monterey are released. That doesn’t mean, however, that images uploaded to iCloud Photos between now and then, or uploaded to the service before the new system was announced, will escape scanning. Images that have already been uploaded to iCloud Photos will also be processed, an Apple representative told CNBC today, though that will still rely on local, on-iPhone scanning. Photo libraries not marked for upload to iCloud Photos will not be examined for CSAM content by the new tool, and “the system does not work for users who have iCloud Photos disabled,” the company adds.

As for concerns that the same approach could be used to target someone with fraudulent claims, Apple seems confident that’s impossible. The company does not add to the existing CSAM image hashes, it points out, with that database created and validated externally by experts.
“The same set of hashes is stored in the operating system of every iPhone and iPad user,” Apple adds, “so targeted attacks against only specific individuals are not possible under our design.”

While the system is designed to spot CSAM content automatically, it won’t make reports directly to law enforcement. Although “Apple is obligated to report any instances we learn of to the appropriate authorities,” the company highlights, any flagged occurrence will first be checked by a human moderator. Only after that review process confirms the match will a report be made.
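To make the hash comparison described above concrete, here is a minimal, purely illustrative sketch in Swift. Apple’s actual system uses a perceptual image hash combined with additional cryptography rather than anything shown here; in this sketch a plain SHA-256 digest stands in for the image hash, an in-memory set stands in for the hash database shipped with the operating system, and every name is hypothetical.

```swift
import Foundation
import CryptoKit

// Purely illustrative: a plain SHA-256 digest stands in for Apple's
// perceptual image hash, and an in-memory Set stands in for the hash
// database shipped inside the operating system. All names are hypothetical.
struct CSAMHashChecker {
    /// Hypothetical on-device database of known CSAM image hashes (hex strings).
    let knownHashes: Set<String>

    /// Hash the photo bytes and report whether the result matches a known entry.
    func matchesKnownImage(_ photoData: Data) -> Bool {
        let digest = SHA256.hash(data: photoData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }
}

// Usage sketch: check a photo only when it is queued for upload to iCloud Photos.
let checker = CSAMHashChecker(knownHashes: []) // empty placeholder database
let photo = Data()                             // stand-in for real image bytes
if checker.matchesKnownImage(photo) {
    print("Hash matches a known image")
}
```

In Apple’s description, matching happens only for photos queued for upload to iCloud Photos, and a single match on its own reveals nothing to Apple; the sketch ignores those details and shows only the basic membership test against a fixed set of known hashes.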
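The reporting flow described above, in which a human reviewer must confirm a flagged match before anything is sent to the authorities, can be sketched the same way. The types and functions below are hypothetical and simply restate the process as the article describes it.

```swift
import Foundation

// Illustrative only: hypothetical types restating the review gate described
// in the article; none of this corresponds to real Apple API.
struct FlaggedMatch {
    let accountID: String
    let matchedHashes: [String]
}

enum ReviewDecision {
    case confirmedCSAM
    case falsePositive
}

/// A flagged match is never reported automatically; a human reviewer decides first.
func handle(_ match: FlaggedMatch,
            reviewer: (FlaggedMatch) -> ReviewDecision) {
    switch reviewer(match) {
    case .confirmedCSAM:
        reportToAuthorities(match) // only after human confirmation
    case .falsePositive:
        dismiss(match)             // no report is made
    }
}

func reportToAuthorities(_ match: FlaggedMatch) { /* hypothetical */ }
func dismiss(_ match: FlaggedMatch) { /* hypothetical */ }
```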