
Apple Child Safety photo scanning: what you need to know


Apple’s Child Safety initiative will automatically scan phones for CSAM content, but how will it affect you? Read on for details.
Apple announced that it would be enacting a new protocol: automatically scanning iPhones and iPads to check user photos for child sexual abuse material (CSAM). The company says it is doing this to limit the spread of CSAM, but it is also adding other features ‘to protect children from predators who use communication tools to recruit and exploit them,’ Apple explained in a blog post. For now, the features will only be available in the US.

Apple will introduce a new feature in iOS 15 and iPadOS 15 (both expected to launch in the next couple of months) that automatically scans images on a user’s device to check whether they match previously identified CSAM content. Matches are identified by unique hashes, i.e. a set of numbers that is consistent between duplicate images, like a digital fingerprint. Checking hashes is a common method of detecting CSAM: website security company Cloudflare introduced it in 2019, and it is also used by the anti-child-sex-trafficking nonprofit Thorn, the organization co-founded by Ashton Kutcher and Demi Moore. (A simplified sketch of this kind of hash lookup appears at the end of this article.)

In addition, Apple has added two systems that parents can optionally enable for children in their family network: on-device analysis in the Messages app that scans incoming and outgoing photos for material that might be sexually explicit, which will be blurred by default, and an optional setting that can inform account-linked parents if that content is viewed. Apple is also enabling Siri and Search to surface helpful resources if a user asks about reporting CSAM; both will also intervene when users make search queries related to CSAM, informing the searcher of the material’s harmful potential and pointing toward resources for getting help.

That’s an overview of how, by Apple’s own description, it will integrate software to track CSAM and help protect children from predation by intervening when they receive (or send) potentially inappropriate photos. But the prospect of Apple automatically scanning your material has already raised concerns from tech experts and privacy advocates; we’ll dive into that below.

If you do not have photos containing CSAM on your iPhone or iPad, nothing will change for you. If you do not make a Siri inquiry or an online search related to CSAM, nothing will change for you. If your iPhone or iPad is set up with a family account in iCloud and your device is designated as a child’s device in that network, you will see warnings and blurred photos should you receive sexually explicit images. If your device isn’t linked to a family network as belonging to a child, nothing will change for you. Lastly, your device won’t get any of these features if you don’t upgrade to iOS 15, iPadOS 15, or macOS Monterey.
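To make the ‘digital fingerprint’ idea concrete, here is a minimal sketch of hash matching in Swift. It is an illustration only, not Apple’s actual implementation: it uses an ordinary SHA-256 cryptographic hash and a hard-coded placeholder hash, whereas Apple’s system relies on a perceptual hashing scheme and a database of hashes supplied by child-safety organizations. The function name and example hash below are hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical placeholder: in a real system, known hashes would come from
// a child-safety organization's database, not be hard-coded in the app.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

/// Returns true if the image data's SHA-256 digest matches a previously
/// identified hash. This is a simplified stand-in for perceptual hashing.
func matchesKnownHash(imageData: Data, against knownHashes: Set<String>) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}
```

Note that a cryptographic hash like SHA-256 only matches byte-for-byte duplicates; perceptual hashes are designed so that re-encoded or slightly altered copies of an image still produce matching fingerprints, which is why they are the tool of choice for this kind of detection.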
