Friday, September 30, 2022 | 08:41 am

Apple’s Child Sexual Abuse Detection Tech ‘Would Make Us All Less Safe and Secure’, Security Experts Warn

The technology proposed by Apple to sweep iPhones for child sexual abuse images would neither prevent such crimes nor guard against intrusive surveillance, security experts have warned.

While client-side scanning (CSS) has been touted as a solution that preserves user privacy while giving law enforcement more comprehensive investigative tools, it is a “dangerous technology”, academics from the University of Cambridge, the Harvard Kennedy School, and the Massachusetts Institute of Technology (MIT) cautioned in a new report.

CSS technology analyses data directly on users’ devices, and is at the heart of the system Apple announced in August to scan US iPhone users’ handsets for child sexual abuse material.

Photos would then be matched against known abuse material in a database maintained by the National Center for Missing & Exploited Children, before being flagged to a human reviewer and potentially reported to law enforcement.
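The matching step described above can be sketched in miniature. The snippet below is purely illustrative: it uses a cryptographic SHA-256 digest as a stand-in for Apple’s proprietary perceptual hash (NeuralHash matches visually similar images, which a cryptographic hash cannot), and the database contents and function names are hypothetical.

```python
import hashlib

# Hypothetical stand-in for a database of known-image digests.
# A real CSS system would hold perceptual hashes, so that resized or
# re-encoded copies of a known image still match.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def flag_for_review(image_bytes: bytes) -> bool:
    """Return True if the image's digest matches a known database entry,
    i.e. the photo would be escalated to a human reviewer."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(flag_for_review(b"example-known-image-bytes"))  # True: exact match
print(flag_for_review(b"unrelated-photo-bytes"))      # False: no match
```

Note the limitation this sketch makes visible: an exact-hash match misses any altered copy, which is precisely why real deployments rely on fuzzier perceptual hashing, and why the report’s authors worry about what else such fuzzy matching could be repurposed to find.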

Apple confirmed a month later, following a backlash from privacy advocates, that it was delaying the rollout of the ‘NeuralHash’ technology, which would have scanned pictures before they were uploaded to iCloud Photos.

“Apple did its best, using some of the top talents in security and cryptography, and yet did not achieve a design for a secure, trustworthy, and efficacious system,” the experts wrote.

“Introducing this powerful scanning technology on all user devices without fully understanding its vulnerabilities and thinking through the technical and policy consequences would be an extremely dangerous societal experiment”.

Even if CSS was introduced initially to check devices for child sexual abuse material, it was inevitable that there would be “enormous pressure” to expand its remit and comb devices for other types of material in ways that could undermine basic freedoms, they said.

These pressures would be incredibly hard to resist, they argued, making abuse of the system difficult to control; the result would amount to mass surveillance, with a chilling effect on freedom of speech and on democracy itself.

Any company that claims to have created a viable solution to the issue must be subjected to “rigorous public review and testing before a government even considers mandating its use,” the authors added.

“The introduction of scanning on our personal devices—devices that keep information from to-do notes to texts and photos from loved ones—tears at the heart of privacy of individual citizens.

“CSS makes law-abiding citizens more vulnerable with their personal devices searchable on an industrial scale. Plainly put, it is a dangerous technology”.