In August 2021, Apple announced a plan to scan photos users stored in iCloud for child sexual abuse material (CSAM). The system was designed to detect child sexual abuse images on iPhones and iPads.

However, the plan was widely criticized, and Apple now says that, in response to the feedback and guidance it received, the CSAM detection tool for iCloud Photos is dead, Wired reports.

“After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021,” the company told WIRED in a statement. “We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.”

On August 6, 2021, Apple previewed new child safety features for its devices that were due to arrive later that year.

Here’s the gist of Apple’s announcement at the time: Apple is introducing new child safety features in three areas, developed in collaboration with child safety experts. First, new communication tools will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable by Apple.

Next, iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.

Finally, updates to Siri and Search will provide parents and children with expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.

However, following widespread criticism of the plan, in a September 3 statement to 9to5Mac, Apple said it was delaying the CSAM detection system and child safety features. 

Here’s Apple’s statement from September 2021: Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

Article provided with permission from AppleWorld.Today