
Apple says it won’t allow any government to conduct surveillance via child abuse detection tool: Report

Shortpedia Content Team

Facing criticism from several quarters over its iCloud Photos and Messages child safety initiatives, Apple has stressed that it will not allow any government to conduct surveillance via the tool aimed at detecting and curbing child sexual abuse material (CSAM) in iCloud Photos. Last week, Apple confirmed plans to deploy new technology within iOS, macOS, watchOS, and iMessage that will detect potential child abuse imagery.