Apple Delays Delivery of Child Safety Features


Apple said Friday it will delay the rollout of child safety measures that would scan users’ iPhones to detect images of child sexual abuse, following criticism from privacy groups.

The company announced in early August that iPhones would begin using sophisticated technology to detect images of child sexual abuse, commonly known as child pornography, that users upload to its iCloud storage service. Apple also said it would let parents turn on a feature that could alert them when their children send or receive nude photos in text messages.

The measures met strong resistance from computer scientists, privacy groups, and civil liberties lawyers because the features represented the first technology that would allow a company to look at an individual’s private data and report it to law enforcement.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time in the coming months to gather input and make improvements before releasing these critical child safety features,” Apple said in a statement posted on its website.

The features would also allow Apple’s virtual assistant, Siri, to direct people who ask questions about child sexual abuse to appropriate resources, and would let parents turn on technology that scans images in their children’s text messages for nudity.

But the tool that sparked the most backlash was a software program that would scan users’ iPhone photos and compare them with a database of known images of child sexual abuse.
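Reports at the time described the tool as computing a fingerprint, or hash, of each photo and checking it against fingerprints of known abuse imagery, rather than inspecting photo content directly. The sketch below shows only the general shape of that kind of hash-set matching; it is not Apple’s system, which used a proprietary perceptual-hashing scheme (NeuralHash) and cryptographic protocols not modeled here. The fingerprint value and names below are hypothetical placeholders.

```python
import hashlib

# Hypothetical placeholder: a real system would load fingerprints of
# known images from a clearinghouse database, not hard-code them.
KNOWN_FINGERPRINTS = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def fingerprint(image_bytes: bytes) -> str:
    # A deployed system would use a perceptual hash that survives
    # resizing and re-compression; SHA-256 matches only byte-identical
    # files, so this is illustrative, not realistic.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_image(image_bytes: bytes) -> bool:
    # Membership test: flag the photo only if its fingerprint appears
    # in the set of known fingerprints.
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

# The placeholder entry above is the SHA-256 of empty input,
# so this prints True; any other input prints False.
print(matches_known_image(b""))
```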

The tech giant announced the changes after reporting in The New York Times showed that images of child sexual abuse were spreading online.

Matthew Green, a computer science professor at Johns Hopkins University, said that once the capability to review users’ private photos exists, it would be ripe for abuse. Governments, for example, could pressure Apple to use the technology to help track down political opponents.

Green questioned whether Apple would be able to resist pressure from governments around the world, including China. “It didn’t seem like a very secure system,” he said.

Apple did not expect such a reaction. When the company announced the changes, it sent reporters technical explainers along with statements from child safety groups and computer scientists applauding the effort.

But Mr. Green said the company’s rollout did not take into account the views of the privacy and child safety communities. “If I could design a presentation that was meant to fail, it would look like this,” he said.

The crucial question now, experts said, is what Apple will do after pressing pause. Will it cancel the project entirely, offer largely the same features after a delay, or find a middle ground?

“We look forward to hearing more about how Apple plans to replace or improve its planned capabilities to tackle these issues without undermining end-to-end encryption, privacy, and free expression,” Samir Jain, policy director at the Center for Democracy and Technology, an advocacy group, said in a statement.

Joe Mullin, a policy analyst at the digital rights group Electronic Frontier Foundation, said the foundation has a petition with more than 25,000 signatures asking Apple not to move forward with the feature. He said it was “great that they took some time to think things through,” but that he and other privacy advocates would continue to urge Apple to abandon the plan altogether.
