UK watchdog accuses Apple of downplaying child-abuse material cases
The National Society for the Prevention of Cruelty to Children (NSPCC), a UK child-protection charity, has accused tech giant Apple of significantly underreporting incidents involving Child Sexual Abuse Material (CSAM) on its platforms. The NSPCC claims to have found more instances of abuse imagery linked to Apple's services within the UK alone than Apple reported globally. In 2022, amid surveillance concerns, Apple scrapped its plans for CSAM detection and instead introduced a feature called Communication Safety, which blurs nude photos sent to children.