Apple on Thursday announced changes to iPhones designed to detect nude photos of children, whether uploaded to its cloud service or sent to children in messages. The move will likely please parents and the police, but it was already worrying privacy watchdogs.
Apple said iPhones will begin using sophisticated technology later this year to detect images of child sexual abuse, commonly known as child pornography, that users upload to Apple’s cloud storage service, iCloud. Apple also said it will soon let parents turn on a feature that can alert them when their children send or receive nude photos in a text message.
Apple said it designed the new features to protect users’ privacy, including by ensuring that Apple itself never sees or learns of nude images shared in a child’s text messages. The scanning is done on the child’s device, and notifications are sent only to the parents’ devices. Apple cited several cybersecurity experts and child safety groups that praised its approach.
But other cybersecurity experts were still worried. Matthew D. Green, professor of cryptography at Johns Hopkins University, said Apple’s new features set a dangerous precedent by creating surveillance technology that could be used by law enforcement or governments.
“They sell privacy to the world and get people to trust their devices,” Mr. Green said. “But now they’re basically capitulating to the worst possible demands of every government. I don’t see how they’re going to say no from here on out.”
The mixed reviews of Apple’s new features show the thin line tech companies must walk between aiding public safety and ensuring customer privacy. Law enforcement officials have complained for years that technologies like smartphone encryption hinder criminal investigations, while tech executives and cybersecurity experts have argued that such encryption is crucial to protecting people’s data and privacy.
This story will be updated.
Michael H. Keller and Gabriel J.X. Dance contributed reporting.