Apple removes mentions of controversial child abuse scanning from its site
Posted 16th December 2021, 13:54 by Stefan Mileschin ([M] Reviewer)

Apple has hinted it might not revive its controversial effort to scan for CSAM (child sexual abuse material) photos any time soon. MacRumors notes that Apple has removed all mentions of the scanning feature from its Child Safety website. Visit now and you'll only see iOS 15.2's optional nude photo detection in Messages and intervention when people search for child exploitation terms.

It's not certain why Apple has pulled the references. We've asked Apple for comment. This doesn't necessarily represent a full retreat from CSAM scanning, but it at least suggests a rollout isn't imminent.

While Apple was already scanning iCloud Photos uploads for hashes of known CSAM, the change would have moved those scans to the devices themselves to ostensibly improve privacy. If iCloud Photos was enabled and enough hashes appeared in a local photo library, Apple would decrypt the relevant "safety vouchers" (included with every image) and manually review the pictures for a potential report to the National Center for Missing and Exploited Children. That, in turn, could get police involved.
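The threshold-based matching described above can be illustrated with a heavily simplified sketch. This is a hypothetical illustration only: Apple's actual system used a perceptual "NeuralHash" combined with a private set intersection protocol so neither side learns match details below the threshold, whereas the sketch below just compares plain cryptographic hashes against a known set.

```python
import hashlib

# Hypothetical placeholder values; the real hash set and threshold
# were never published by Apple.
KNOWN_CSAM_HASHES = {
    "d1e8a70b5ccab1dc2f56bbf7e99f064a660c08e361a35751b9c483c88943d082",
}
MATCH_THRESHOLD = 30  # reviews trigger only past a high match count

def hash_image(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; here just SHA-256 of the raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(library: list[bytes]) -> int:
    """Count photos in a local library whose hash appears in the known set."""
    return sum(1 for img in library if hash_image(img) in KNOWN_CSAM_HASHES)

def review_needed(library: list[bytes]) -> bool:
    """Only past the threshold would the 'safety vouchers' be decrypted
    and the flagged images manually reviewed."""
    return count_matches(library) > MATCH_THRESHOLD
```

The key design point the article describes is that a single match reveals nothing: decryption of the vouchers, and any human review, is gated behind the aggregate count.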

The CSAM detection feature drew flak from privacy advocates. Apple stressed the existence of multiple safeguards, such as a high threshold for reviews and its reliance on hashes from multiple child safety organizations rather than governments. However, there were concerns the company might still produce false positives or expand scanning under pressure from authoritarian regimes. Moreover, the only way to prevent on-device scans was to avoid using iCloud Photos altogether: you had to accept Apple's new approach or lose a valuable cloud service.

https://www.engadget.com/apple-remov...0.html?src=rss