Apple plans to scan messages and photos to keep kids safe

Apple plans to scan its users’ messages and photos to protect children from predators, but privacy advocates say the new system will open the door to other abuses.

Starting with software updates later this year, Apple said it would use “on-device machine learning” to warn about sensitive content in messages and to examine photos stored in iCloud, Apple’s cloud storage service used by iPhone, iPad and Mac owners. “Machine learning” is largely synonymous with artificial intelligence and generally refers to computers performing and learning from tasks without explicit human intervention or programming.

Among the new tools is a feature in Apple’s Messages app that alerts children and their parents if a child receives or sends sexually explicit photos. The new function will blur such detected photos and alert the child’s parent if the child chooses to view one.
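Apple has not published the mechanics of that flow, so the Swift sketch below is only an illustration of the behavior described: a hypothetical on-device classifier flags an incoming photo, the photo is blurred, and the parent is notified only if the child views it anyway. The names `isLikelySensitive`, `IncomingPhoto` and `handleIncomingPhoto` are assumptions for illustration, not Apple’s API.

```swift
import Foundation

// Hypothetical stand-in for Apple's on-device classifier; the real model and
// its API are not public. Returns true when an image is judged sexually explicit.
func isLikelySensitive(_ imageData: Data) -> Bool {
    // A shipping implementation would run an on-device ML model here.
    return false // placeholder
}

struct IncomingPhoto {
    let data: Data
    var isBlurred = false
}

// Sketch of the flow the article describes: detect, blur, and notify the
// parent only if the child chooses to view the flagged photo.
func handleIncomingPhoto(_ photo: inout IncomingPhoto,
                         childChoseToView: Bool,
                         notifyParent: () -> Void) {
    guard isLikelySensitive(photo.data) else { return }
    photo.isBlurred = true
    if childChoseToView {
        notifyParent()
    }
}
```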

Apple’s new software will also allow it to search for child sexual abuse material (CSAM) and match it against a database of such material maintained by the National Center for Missing and Exploited Children (NCMEC).

“Before an image is stored in iCloud Photos, matching is performed on a device against known CSAM hashes for that image,” Apple said in a statement explaining the plan on its website. “This matching process is powered by a cryptographic technology called ‘private set intersection’, which determines if there is a match without revealing the result.”

The device then creates a cryptographic safety voucher that encodes the match result, along with additional encrypted data about the image, and uploads the voucher to iCloud with the image.
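Apple has not released code for this pipeline; the Swift sketch below only illustrates the two steps just described, with SHA-256 standing in for Apple’s perceptual NeuralHash and a plain in-memory set standing in for the private set intersection protocol (which, unlike this sketch, would not reveal the match result on either side). `loadKnownHashDatabase`, `SafetyVoucher` and `makeVoucher` are hypothetical names.

```swift
import Foundation
import CryptoKit

// Hypothetical loader for the NCMEC-derived hash list shipped with the OS.
func loadKnownHashDatabase() -> Set<String> { [] }

let knownCSAMHashes = loadKnownHashDatabase()

// SHA-256 is a stand-in; the real system uses a perceptual image hash
// ("NeuralHash") so near-duplicates of a known image still match.
func hashForImage(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// A safety voucher accompanies each upload and encodes the match result plus
// additional data about the image, sealed so that nothing is learned until a
// threshold of matches is reached (per Apple's description of its design).
struct SafetyVoucher {
    let matched: Bool
    let sealedMetadata: Data
}

func makeVoucher(for imageData: Data) -> SafetyVoucher {
    // The real protocol performs this comparison via private set intersection,
    // without exposing the result the way this plain set lookup does.
    let matched = knownCSAMHashes.contains(hashForImage(imageData))
    return SafetyVoucher(matched: matched, sealedMetadata: Data()) // placeholder
}
```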

In a frequently asked questions document published on Sunday, Apple said it will not scan all photos on every user’s iPhone.

“By design, this feature only applies to photos the user has chosen to upload to iCloud Photos, and even then Apple learns only about accounts that store collections of known CSAM images, and only about images matching known CSAM,” Apple’s document says. “The system does not work for users with iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device.”

The moves by Apple received praise from those concerned about how predators use modern technology to target children. Sen. Richard Blumenthal, a Connecticut Democrat, tweeted that Apple’s move was “a welcome, innovative and bold step.”

“This shows we can protect both children and our fundamental rights to privacy,” Mr Blumenthal tweeted.

But privacy advocates like the Electronic Frontier Foundation disagree. EFF’s India McKinney and Erica Portnoy wrote that while the computer giant’s intentions may be good, it has not kept its promises to users regarding privacy and encryption.

“All it would take to widen the narrow back door Apple is building is an expansion of the machine learning parameters to search for additional types of content, or a tweak of the configuration flags to scan everyone’s accounts, not just children’s,” wrote Ms. McKinney and Ms. Portnoy. “This is not a slippery slope; it’s a fully built system just waiting for outside pressure to make the slightest change.”

The EFF warned that authoritarian governments could abuse the feature, citing the Indian government’s rules on pre-screening content and new laws in Ethiopia governing the removal of online misinformation.

Apple is not alone in developing digital tools to screen for child sexual abuse material. In February, Google said YouTube engineers had created software that detects re-uploads of child sexual abuse material, along with new “machine learning classifiers” to identify never-before-seen images.

According to the MIT Technology Review, Apple is likely to keep developing new tools to protect children, with a few more changes in store in the coming months.
