Apple announced plans to scan US iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments seeking to surveil their citizens.
A tool called “neuralMatch,” designed to detect known images of child sexual abuse, will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children will be notified.
Separately, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure, a move that has also alarmed privacy advocates.
The detection system will only flag images that are already in the center’s database of known child pornography; parents snapping innocent photos of a child in the bath, for example, need not worry. But researchers say the matching tool, which does not “see” such images but only the mathematical “fingerprints” that represent them, could be put to more malicious purposes.
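The matching flow described above can be illustrated with a minimal sketch. Real systems such as Apple’s tool or Microsoft’s PhotoDNA compute perceptual fingerprints that survive resizing and recompression; the cryptographic hash and the sample byte strings below are stand-ins used only to show how an image can be checked against a database without the checker ever “seeing” the image itself.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-size fingerprint (illustrative only:
    a real perceptual hash tolerates minor edits; SHA-256 does not)."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints of known abuse images.
# The database stores only fingerprints, never the images themselves.
known_fingerprints = {fingerprint(b"bytes-of-a-known-abuse-image")}

def should_flag(image_bytes: bytes) -> bool:
    """Flag an image for human review only if its fingerprint
    matches an entry in the known-image database."""
    return fingerprint(image_bytes) in known_fingerprints

print(should_flag(b"bytes-of-a-known-abuse-image"))  # True: matches database
print(should_flag(b"an-innocent-family-photo"))      # False: no match
```

Because only fingerprints are compared, an ordinary photo that is not in the database can never match; the abuse scenarios researchers describe involve crafting inputs that collide with database entries.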
Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly harmless images designed to trigger matches for child pornography, fooling Apple’s algorithm and alerting law enforcement. “Researchers have been able to do this pretty easily,” he said of the ability to trick such systems.
Other abuses could include government surveillance of dissidents or protesters. “What happens when the Chinese government says, ‘Here’s a list of files that we want you to scan for,’” Green asked. “Does Apple say no? I hope they say no, but their technology won’t say no.”
Tech companies including Microsoft, Google, Facebook and others have for years been sharing digital fingerprints of known child sexual abuse images. Apple has used those to scan for child pornography in user files stored in its iCloud service, which is not as securely encrypted as the data on its devices.
Apple has been under government pressure for years to allow greater snooping on encrypted data. With the new security measures, Apple must strike a delicate balance between cracking down on the exploitation of children and maintaining its high-profile commitment to protecting the privacy of its users.
But the Electronic Frontier Foundation, a pioneer of online civil liberties, called Apple’s compromise on privacy protections “a shocking about-face for users who have relied on the company’s leadership in privacy and security.”
Meanwhile, the computer scientist who more than a decade ago invented PhotoDNA, the technology used by law enforcement to identify child pornography online, acknowledged the potential for abuse of Apple’s system but said it was far outweighed by the imperative of battling child sexual abuse.
“Is it possible? Of course. But is it something that I’m concerned about? No,” said Hany Farid, a researcher at the University of California, Berkeley, who argues that plenty of other programs designed to secure devices from various threats have not seen “this type of mission creep.” For example, WhatsApp provides users with end-to-end encryption to protect their privacy but also employs a system for detecting malware and warning users not to click on harmful links.
Apple was one of the first major companies to embrace “end-to-end” encryption, in which messages are scrambled so that only their senders and recipients can read them. Law enforcement, however, has long pressured the company for access to that information in order to investigate crimes such as terrorism or child sexual exploitation.
Apple said the latest changes will roll out this year as part of updates to its operating software for iPhones, Macs and Apple Watches.
“Apple’s expanded protection for children is a game changer,” said John Clark, president and CEO of the National Center for Missing and Exploited Children. “With so many people using Apple products, these new safety measures have lifesaving potential for children.”
Julia Cordua, the CEO of Thorn, said Apple’s technology balances “the need for privacy with digital safety for children.” Thorn, a nonprofit founded by Demi Moore and Ashton Kutcher, uses technology to help protect children from sexual abuse by identifying victims and working with tech platforms.
But in a blistering critique, the Washington-based nonprofit Center for Democracy and Technology called on Apple to abandon the changes, which it said effectively destroy the company’s guarantee of “end-to-end encryption.” Scanning messages for sexually explicit content on phones or computers effectively breaks the security, it said.
The organization also questioned Apple’s technology for differentiating between dangerous content and something as tame as art or a meme. Such technologies are notoriously error-prone, CDT said in an emailed statement. Apple denies that the changes amount to a backdoor that degrades its encryption. It says they are carefully considered innovations that do not disturb user privacy but rather strongly protect it.
Separately, Apple said its messaging app will use on-device machine learning to identify and blur sexually explicit photos on children’s phones, and can also warn the parents of younger children via text message. It also said its software would “intervene” when users try to search for topics related to child sexual abuse.
To receive the warnings about sexually explicit images on their children’s devices, parents will need to enroll their child’s phone. Kids over 13 can unenroll, meaning parents of teenagers won’t get notifications.
Apple said neither feature would compromise the security of private communications or notify police.
___
AP technology writer Mike Liedtke contributed to this article.