Apple delays rollout of CSAM detection system and child safety features

Apple last month announced a handful of new child safety features that have proven controversial, including CSAM detection for iCloud Photos. Now Apple has said it will “take longer” to fine-tune the features before launching them to the public.

In a statement to 9to5Mac, Apple said:

“Last month, we announced plans for features to help protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of child sexual abuse material. Based on feedback from customers, advocacy groups, researchers, and others, we’ve decided to take more time over the next few months to gather feedback and make improvements before releasing these critically important child safety features.”

Apple’s new child safety features were slated to launch as part of updates to iOS 15, iPadOS 15, and macOS Monterey later this year. It is not yet clear when the company plans to roll out the features. Apple’s statement today doesn’t provide any details of what changes the company might make to improve the system.

As a reminder, here are the basics of how the CSAM detection system works as it is currently designed:

Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs matching on the device using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is stored securely on users’ devices.
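As a rough illustration of that last step, here is a minimal Swift sketch of turning a list of known hashes into an unreadable on-device set. It is purely hypothetical: Apple has not published this code, and its real system uses a different, server-side blinding scheme rather than the keyed re-hash shown here.

```swift
import CryptoKit
import Foundation

// Hypothetical sketch: re-hash each known CSAM image hash with a secret key
// so the copy stored on the device cannot be read back as the original
// digests. Apple's actual blinding scheme is different and not public.
func blindDatabase(knownHashes: [Data], blindingKey: SymmetricKey) -> Set<Data> {
    var blinded = Set<Data>()
    for hash in knownHashes {
        let code = HMAC<SHA256>.authenticationCode(for: hash, using: blindingKey)
        blinded.insert(Data(code))
    }
    return blinded
}
```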

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against known CSAM hashes. This matching process is powered by a cryptographic technology called Private Set Intersection, which determines if there is a match without revealing the result. The device creates a cryptographic security voucher that encodes the result of the match as well as additional encrypted data on the image. This voucher is uploaded to iCloud Photos along with the image.
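The sketch below continues the illustration, pairing a stand-in perceptual hash with a simplified “safety voucher.” Everything here is an assumption made for clarity: Apple’s real design uses NeuralHash and Private Set Intersection, so the device never learns the match result, whereas this toy version checks set membership directly.

```swift
import CryptoKit
import Foundation

// Stand-in for the on-device perceptual hash. Apple's system uses NeuralHash,
// which is not a public API; SHA-256 of the raw bytes is used here only to
// keep the example self-contained.
func imageFingerprint(_ imageData: Data, blindingKey: SymmetricKey) -> Data {
    let digest = SHA256.hash(data: imageData)
    let code = HMAC<SHA256>.authenticationCode(for: Data(digest), using: blindingKey)
    return Data(code)
}

// Simplified "safety voucher". In the real design, Private Set Intersection
// encodes the match result so the device itself cannot read it; this toy
// version records a plain Boolean alongside an encrypted payload.
struct SafetyVoucher {
    let matched: Bool
    let encryptedPayload: Data
}

func makeVoucher(for imageData: Data,
                 blindedDatabase: Set<Data>,
                 blindingKey: SymmetricKey,
                 payloadKey: SymmetricKey) throws -> SafetyVoucher {
    let fingerprint = imageFingerprint(imageData, blindingKey: blindingKey)
    let matched = blindedDatabase.contains(fingerprint)
    // Encrypt some image-derived data; Apple's vouchers only become readable
    // server-side after a threshold number of matches is crossed.
    let sealed = try AES.GCM.seal(imageData, using: payloadKey)
    return SafetyVoucher(matched: matched,
                         encryptedPayload: sealed.combined ?? Data())
}
```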

Upon the announcement, the new CSAM detection technology received quite a bit of backlash and criticism from privacy advocates. Apple, however, has doubled down on the feature on several occasions, saying its implementation would actually be more privacy-friendly than the technology used by other companies like Google and Facebook.

It was also revealed through this process that Apple already scans iCloud Mail for CSAM; the new plans would have extended that detection to iCloud Photos.

Other child safety features that Apple announced last month, and which are also now delayed, include communication safety in Messages and expanded guidance in Siri and Search.

What do you think of Apple’s decision to delay the rollout of its new child safety features? Is this the right decision, or should the company have stuck with its original plan?

