As part of its efforts to improve child safety features, Apple revealed plans last month to scan iCloud Photos for potential Child Sexual Abuse Material (CSAM). Following backlash from security experts and digital rights groups like the Electronic Frontier Foundation, Apple has now delayed the rollout of CSAM detection.
Apple Delays Rollout of CSAM Detection
Apple was initially set to roll out CSAM detection later this year for accounts set up as families in iCloud on iOS 15, iPadOS 15, and macOS Monterey. The Cupertino giant has not yet revealed a new date for the feature's rollout. Apple has also not detailed which aspects of CSAM detection it plans to improve, or how it will approach the feature to strike a healthy balance between privacy and safety.
“Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” said Apple in an official statement.