Tech titan Apple is set to roll out a new technology that will allow the company to scan users' iCloud photo libraries and detect and report known child sexual abuse material (CSAM) to the relevant authorities in a way that, the company says, will preserve user privacy.
The new technology will come to both iOS and iPadOS and will scan images stored in users' iCloud Photos. If a known CSAM image is detected, a report is sent to the US National Center for Missing and Exploited Children (NCMEC).
Apple says that user privacy remains a central concern, and that the new technology was designed with it in mind. That may sound contradictory, but the California-based company explains, “Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.”
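The matching step Apple describes can be sketched in greatly simplified form. The snippet below is an illustration only, not Apple's actual protocol: the real system uses a perceptual hash (NeuralHash) and cryptographic blinding so that the device cannot read the database itself, whereas here a plain SHA-256 set lookup stands in for both, and all hash values are placeholders.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Return a hex digest for an image's raw bytes.

    Apple's system uses a perceptual hash so that visually identical
    images match even after re-encoding; a cryptographic hash like
    SHA-256, used here for simplicity, matches only identical bytes.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical on-device database of known-CSAM hashes
# (placeholder values, not real data).
known_hashes = {
    image_hash(b"example-known-image-1"),
    image_hash(b"example-known-image-2"),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """On-device check: is this image's hash in the stored database?"""
    return image_hash(image_bytes) in known_hashes

# A photo absent from the database produces no match and no report.
print(matches_known_database(b"family-vacation-photo"))  # False
```

The key design point the quote hints at is that the comparison happens on the device against an unreadable, pre-transformed database, rather than by uploading photos for server-side scanning.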
According to TechCrunch, CSAM detection is one of several new features Apple is implementing to better protect children who use its services from online harm. Another feature launching soon is a filter that will block potentially sexually explicit photos sent and received through a child’s iMessage account.
Another feature will seek to intervene when an iPhone user tries to search for CSAM-related terms through Siri and the device’s Search function.
“Siri and Search are also being updated to intervene when users perform searches for queries related to CSAM. These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue,” Apple says.
The updates to Siri and Search are coming later this year in an update to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.
For now, these updates will be rolled out only in the United States, but the company may expand them to other regions if the technology proves successful.