Apple plans to scan iPhones in the US for imagery showing child abuse, the Financial Times reported on Thursday.
The system represents a powerful use of technology to detect crimes against children, but it also raises startling questions about privacy and corporate surveillance of millions of people's phones.
The company outlined its proposed tool, known as neuralMatch, to US academics this week, the newspaper reported, citing two unidentified security researchers.
The system would alert human reviewers to potentially illegal images, and that team would notify law enforcement, according to the report.
Apple didn’t immediately respond to a request for comment. — Reported by Monica Greig, (c) 2021 Bloomberg LP