UK Home Office officials want Apple and Google to build nudity-detection algorithms into their software by default, as part of the government’s strategy to tackle violence against women and girls, according to the Financial Times (a subscription is required to read the entire article).
The article says government officials want the operating systems on the companies’ devices to prevent any nudity from being displayed unless users can verify that they’re adults through biometric checks or official ID. (The Home Office is the UK government’s interior ministry, responsible for national security, immigration, passports, policing, crime prevention, counter-terrorism, and drugs policy.)
The proposal calls for Apple and Google to apply this to mobile devices first, though it could eventually extend to desktops. The government reportedly explored making the controls mandatory for devices sold in the UK, but it has apparently decided against that approach for now.
Apple already offers Communication Safety tools that parents can turn on. Once enabled, if a child receives or attempts to send photos or videos that might contain nudity, Communication Safety warns them, offers ways to stay safe, and points them to helpful resources.
According to Apple, Communication Safety uses on-device machine learning to analyze photo and video attachments and determine if a photo or video appears to contain nudity. Because the photos and videos are analyzed on your child’s device, Apple doesn’t receive an indication that nudity was detected and doesn’t get access to the photos or videos.
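For a sense of what this on-device check looks like in practice, here is a minimal Swift sketch using Apple’s SensitiveContentAnalysis framework (iOS 17/macOS 14 and later), the developer-facing API that exposes the same on-device detection. The function name and fail-open behavior are illustrative assumptions, not Apple’s implementation; real apps also need the SensitiveContentAnalysis entitlement, and analysis only runs if the user has enabled Communication Safety or Sensitive Content Warnings.

```swift
import Foundation
import SensitiveContentAnalysis

// Illustrative sketch: ask the on-device analyzer whether an image
// appears to contain nudity. The image never leaves the device.
func shouldBlurImage(at url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If the user hasn't opted in, the policy is .disabled and the
    // analyzer won't flag anything.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // Runs Apple's on-device machine learning model over the image.
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        // Assumed design choice for this sketch: fail open (don't blur)
        // if analysis can't be performed.
        return false
    }
}
```

Because the result is just a boolean computed locally, neither Apple nor the app developer receives the photo or any signal that nudity was detected, which matches Apple’s description of how Communication Safety preserves privacy.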

