Apple says researchers can examine its child safety features. But it's suing a company that does just that.


Apple could submit its code for outside review, but that's not something it has said it will do. Researchers can also try to reverse engineer the feature statically, that is, without executing the actual code in a live environment.

But realistically, all of these options share at least one major problem: they don't let you look at code running live on an up-to-date iPhone to see how it actually works in the wild. Instead, these methods rely not only on Apple being open and honest, but also on the company having written the code without major mistakes or oversights.

Another option would be to give members of Apple's Security Research Device program access to the system so they could verify the company's claims. But that program is small and tightly restricted, with so many rules about what participating researchers can say or do that it doesn't necessarily solve the trust problem.

That really leaves only two options for researchers who want to look inside iPhones for this sort of thing. First, hackers can jailbreak old iPhones using a zero-day vulnerability. This is difficult and expensive, and the hole can be closed by a security patch.

“Apple has spent a lot of money preventing people from jailbreaking phones,” explains Thiel. “They specifically recruited people from the jailbreaking community to make jailbreaking more difficult.”

Or a researcher could use a virtual iPhone on which Apple's security features can be turned off. In practice, that means Corellium.

There are still limits on what any security researcher could observe, but a researcher could detect whether the scanning extends beyond photos being shared to iCloud.

However, if non-CSAM material were to make it into the hash databases the system matches against, that would be invisible to researchers. To address this concern, Apple says it will require that two separate child protection organizations in different jurisdictions both have the same CSAM image in their databases before its hash is included. But it has offered few details on how this would work, who would run the databases, which jurisdictions would be involved, and what the ultimate sources of the databases would be.
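To make the idea concrete, the safeguard amounts to only including entries found in the intersection of two independent databases, so no single organization can unilaterally add one. Here is a minimal, purely illustrative sketch of that intersection check in Swift; the organization names and hash strings are placeholders, not Apple's actual data structures or implementation:

```swift
import Foundation

// Hypothetical hash databases from two independent child-safety
// organizations in different jurisdictions. In the scheme Apple
// describes, these would be perceptual-hash digests of known CSAM;
// here they are just placeholder strings.
let organizationA: Set<String> = ["hash-a", "hash-b", "hash-c"]
let organizationB: Set<String> = ["hash-b", "hash-c", "hash-d"]

// Only hashes present in *both* databases would be eligible for
// inclusion, so one organization acting alone cannot slip in an entry.
let eligibleHashes = organizationA.intersection(organizationB)

print(eligibleHashes) // contains "hash-b" and "hash-c" only
```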

Thiel points out that the problem of child abuse material that Apple is trying to address is real.

“This is not a theoretical concern,” Thiel says. “This is not something people come up with as an excuse to enforce surveillance. This is a common and real problem that needs to be addressed. The solution is not to get rid of such mechanisms; it’s to make them as resistant as possible to future abuse.”

But Corellium’s Tait says Apple is trying to be locked down and transparent at the same time.

“Apple is trying to have their cake and eat it too,” says Tait, a former information security specialist with the British intelligence agency GCHQ.

“With their left hand, they make jailbreaking harder and sue companies like Corellium to prevent them from existing. Now, with their right hand, they’re saying, ‘Oh, we built this really complicated system, and it turns out some people don’t trust Apple to do it honestly, but it’s okay because any security researcher can go ahead and prove it to themselves.’”

“And I’m sitting here thinking, what do you mean they can just do this? You designed your system so that they can’t. The only reason people can do these things is in spite of you, not because of you.”

Apple did not respond to a request for comment.
