Apple defends new child abuse prevention tech against privacy concerns


Following this week’s announcement, some experts think Apple will soon announce that iCloud will be encrypted. If iCloud is encrypted but the company can still detect child abuse material, pass evidence to law enforcement, and suspend offending accounts, that could ease some of the political pressure on Apple executives.

It would not relieve all of the pressure, though: many of the same governments that want Apple to do more on child abuse are also asking for more action on content related to terrorism and other crimes. But child abuse is a real and enormous problem that big tech companies have mostly failed to address so far.

“Apple’s approach preserves privacy better than any other approach I know of,” says David Forsyth, head of computer science at the University of Illinois at Urbana-Champaign, who reviewed Apple’s system. “In my opinion, this system will likely significantly increase the likelihood that people who own or traffic in [CSAM] are found; this should help protect children. Innocent users should experience minimal or no loss of privacy, because visual derivatives are revealed only if there are enough matches to CSAM images, and only for the images that match known CSAM images. The accuracy of the matching system, combined with the threshold, makes it very unlikely that images that are not known CSAM images will be revealed.”
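The threshold Forsyth describes can be illustrated with a deliberately simplified sketch. Apple’s actual system uses NeuralHash perceptual hashing combined with private set intersection and threshold secret sharing, none of which appears here; this toy version, with made-up hash values and a hypothetical threshold, only shows the basic idea that no account is flagged until the number of matches against a known-hash set crosses a threshold.

```python
# Hypothetical, simplified sketch -- NOT Apple's implementation.
# Hash values and the threshold below are placeholders for illustration.

KNOWN_CSAM_HASHES = {"a1b2", "c3d4", "e5f6"}  # stand-in for the known-hash database
THRESHOLD = 2  # account is considered only after this many matches

def count_matches(image_hashes):
    """Count how many of a user's image hashes appear in the known set."""
    return sum(1 for h in image_hashes if h in KNOWN_CSAM_HASHES)

def should_flag(image_hashes, threshold=THRESHOLD):
    """Flag an account only when matches meet or exceed the threshold."""
    return count_matches(image_hashes) >= threshold

# A single coincidental match stays below the threshold and is never surfaced.
print(should_flag(["a1b2", "zzzz"]))          # one match -> False
print(should_flag(["a1b2", "c3d4", "zzzz"]))  # two matches -> True
```

In the real system the matching itself is cryptographically blinded, so the device learns nothing about which images matched and the server learns nothing until the threshold is crossed; the sketch above captures only the thresholding behavior.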

What about WhatsApp?

Every major tech company faces the horrific reality of child abuse material on its platform. None has approached it quite like Apple.

Like iMessage, WhatsApp is an end-to-end encrypted messaging platform with billions of users. Like any platform of that size, it faces an enormous abuse problem.

“I have read the information Apple released yesterday and am concerned,” Will Cathcart, head of WhatsApp, tweeted on Friday. “I think this is the wrong approach and a setback for the privacy of people all over the world. People have asked if we would adopt this system for WhatsApp. The answer is no.”

WhatsApp includes reporting features that let any user flag abusive content to the company. While those capabilities are far from perfect, WhatsApp reported more than 400,000 cases to NCMEC last year.

“This is an Apple-built and operated surveillance system that could very easily be used to scan private content for anything it or a government decides it wants to control,” Cathcart said in his tweets. “Countries where iPhones are sold will have different definitions of what is acceptable. Will this system be used in China? What content will they consider illegal there, and how will we ever know? How will they manage requests from governments around the world to add other types of content to the list for scanning?”

In its briefing to reporters, Apple emphasized that the new scanning technology is launching only in the United States for now. But the company went on to argue that it has a history of fighting for privacy and expects to continue doing so. In that way, much of this comes down to trusting Apple.

The company argued that the new system cannot easily be abused by government action, and it repeatedly stressed that opting out is as simple as turning off iCloud Photos.

Despite being one of the most popular messaging platforms in the world, iMessage has long been criticized for lacking the reporting capabilities now common across the social web. As a result, Apple has historically reported only a small fraction of the cases to NCMEC that companies like Facebook do.

Rather than adopting that solution, Apple built something entirely different, and the end result is an open and worrying question for privacy hawks, and a welcome, radical change for others.

“Apple’s expanded protection for children is game-changing,” NCMEC president John Clark said in a statement. “The truth is that privacy and child protection can coexist.”

High stakes

An optimist would say that enabling full encryption of iCloud accounts while still detecting child exploitation material is both an anti-abuse win and a privacy win, and perhaps even a deft political move that blunts anti-encryption rhetoric from American, European, Indian, and Chinese officials.

A realist worries about what comes next from the world’s most powerful governments. As officials begin to imagine the surveillance possibilities of this scanning technology, it is a virtual guarantee that Apple will receive, and probably already has received, calls from capital cities. Political pressure is one thing; regulation and authoritarian control are another. But this threat is neither new nor unique to this system. As a company with a track record of quiet but profitable accommodation with China, Apple has a lot of work to do to convince users of its ability to resist repressive governments.

All of the above may be true. What happens next will ultimately define Apple’s new technology. If governments use this feature as a weapon to expand surveillance, the company will clearly have failed to deliver on its privacy promises.
