Apple said on Friday it would take more time to gather input and make improvements before releasing its child safety features, a month after announcing a system to check iPhones for images of child sexual abuse.
More than 90 policy and rights groups around the world told Apple last month that it should abandon its plans to scan children’s messages for nudity and adults’ phones for images of child sexual abuse.
Critics of the plan have said the feature could be exploited by repressive governments seeking to find other material for censorship or arrests.
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company said in a statement on Friday. — Reported by Nivedita Balu in Bengaluru and Stephen Nellis, (c) 2021 Reuters