MacTech.com

West Virginia attorney general sues Apple, claims the company hasn’t adequately blocked child sexual abuse material 

Another day, another lawsuit. West Virginia’s attorney general has filed a consumer protection lawsuit against Apple, claiming the tech giant has failed to prevent child sexual abuse material (CSAM) from being stored and shared via iOS devices and iCloud services, reports CNBC.

Republican John “JB” McCuskey accuses the company of prioritizing privacy branding and its own business interests over child safety. He claims other big tech companies, including Google, Microsoft, and Dropbox, have been more proactive, using systems like PhotoDNA to combat such material.

In 2021, Apple tested its own CSAM-detection features, then chose to delay them. Here’s what the company said in a statement to 9to5Mac at the time regarding the delay: “Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

As CNBC notes, there’s been a lot of blowback to that decision. In 2024, UK-based watchdog the National Society for the Prevention of Cruelty to Children said Apple failed to adequately monitor, tabulate and report CSAM in its products to authorities.

And in a 2024 lawsuit filed in California’s Northern District, thousands of child sexual abuse survivors sued Apple, alleging that the company never should have abandoned its earlier plans for CSAM detection features and that, by allowing such material to proliferate online, it had caused survivors to relive their trauma.

CNBC says that if West Virginia’s suit is successful, it could force Apple to make design or data security changes. The state is seeking statutory and punitive damages, as well as injunctive relief requiring Apple to implement effective CSAM detection.

I hope you’ll help support Apple World Today by becoming a patron. Almost all our income is from Patreon support and sponsored posts. Patreon pricing ranges from $2 to $10 a month. Thanks in advance for your support. 

Article provided with permission from AppleWorld.Today