
UK watchdog accuses Apple of failing to effectively monitor for underage sexual images


The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) says Apple has been undercounting how often child sexual abuse material (CSAM) appears on its platforms. According to a report by The Guardian, Cupertino isn’t reporting the actual number of CSAM detections on its products.

Tech companies are obligated to report suspected child sexual abuse material found on their platforms, and Apple made 267 reports of suspected CSAM to the National Center for Missing & Exploited Children (NCMEC) between April 2022 and March 2023. However, the NSPCC found that during the same period, Apple was implicated in 337 recorded offenses involving child abuse images in England and Wales.

In other words, Cupertino made fewer reports worldwide than the number of cases recorded in England and Wales alone. Other big tech companies report far higher figures: Meta flagged more than 30.6 million pieces of suspected CSAM across its platforms, including 1.4 million on WhatsApp, while Google reported more than 1.47 million.

It’s unclear why Apple is undercounting possible CSAM cases. However, in late 2022, the company abandoned its plans to roll out an iCloud photo-scanning tool after backlash over concerns that it would amount to surveilling users rather than only detecting possible child sexual abuse material.

The company said at the time that it prioritized users’ security and privacy and would instead roll out other features to protect children. Still, the UK watchdog points out that even though WhatsApp uses end-to-end encryption like iMessage, that platform has reported far more cases than Apple.

Speaking to The Guardian, Sarah Gardner, chief executive officer of the Heat Initiative, a Los Angeles non-profit focused on child protection, said: “Apple does not detect CSAM in the majority of its environments at scale, at all. They are clearly underreporting and have not invested in trust and safety teams to be able to handle this.”

Cupertino didn’t comment on the story, and it’s unclear whether Apple will have to respond to the UK watchdog or whether it plans to change its approach to CSAM.

BGR will keep following this story’s development.
