Apple Has Been Scanning iCloud E-Mails for Child Abuse Since 2019

Aug. 28, 2021



As Apple prepares to scan users’ iPhones and iCloud backups for photos and videos relating to Child Sexual Abuse Material (CSAM), recent reports suggest that the Cupertino giant has already been scanning users’ iCloud emails for such material for the past two years.

The company confirmed this in a statement to 9to5Mac, which began investigating the topic after a discreet iMessage comment by Apple’s anti-fraud chief Eric Friedman surfaced during the Epic Games lawsuit. He reportedly said that Apple’s iOS is the “greatest platform for distributing child porn.” Following this, 9to5Mac reached out to the Cupertino giant for further comment.

Apple, in its statement to the publication, confirmed that it has been scanning users’ iCloud Mail for images and attachments relating to child sexual abuse content. According to the statement, the company scans both incoming and outgoing emails sent via iCloud. Since iCloud Mail is not end-to-end encrypted, scanning messages in transit is straightforward.

At the time, a cyber-security expert at the University of Surrey said, “I think the balance that Apple has drawn is a good one. It allows for the search for known extreme imagery but also has safeguards to prevent abuse of the ability to search emails.”

So, scanning iCloud emails does not run against the company’s policies, and Apple maintains a reasonable balance between CSAM detection and user privacy. However, whether the company will use the same process for scanning images on a user’s iPhone and in iCloud Photos is not yet known. We will learn more about Apple’s plans to fight CSAM on its platform once the feature ships with the upcoming iOS 15. If you don’t want Apple to scan your email, check out other mail service providers on iPhone.