Aside from the “bad press,” Lloyd Richardson, technology director for the Canadian Center for Child Protection, says there isn’t much punishment for platforms that fail to remove CSAM quickly. “I think you’re going to have a hard time finding a country that fines an electronic service provider for slow or not removing CSAM,” he says.
The volume of CSAM has increased significantly worldwide during the pandemic, as both children and predators spend more time online than ever before. Child protection professionals, including the anti-trafficking organization Thorn and INHOPE, a global network of 50 CSAM hotlines, predict that the problem will only continue to grow.
So what can be done to deal with this? The Netherlands can provide some clues. Partly due to its national infrastructure, geographic location, and status as an internet hub for global traffic, the country still has a significant CSAM problem. However, it has made notable progress: according to the IWF, its share of hosted global CSAM fell from 41% at the end of 2021 to 13% by the end of March 2022.
Much of this can be attributed to the fact that when a new government came to power in the Netherlands in 2017, it made tackling CSAM a priority. In 2020, it published a report naming and shaming internet hosting providers that failed to remove CSAM within 24 hours of being alerted to its existence.
It seemed to work, at least in the short term. The Dutch CSAM hotline EOKM found that after the list was published, providers became more willing to adopt proactive CSAM detection measures and to commit to removing material within 24 hours of its discovery.
However, Arda Gerkens, CEO of EOKM, believes that the Netherlands has pushed the problem elsewhere instead of eliminating it. “It looks like a successful model, because the Netherlands has been cleaned up. But it hasn’t disappeared; it has moved. And that worries me,” she says.
According to child protection experts, the solution will come in the form of legislation. Congress is currently considering the EARN IT (Eliminating Abusive and Rampant Neglect of Interactive Technologies) Act; if passed, it would open services up to lawsuits for hosting CSAM on their networks and compel providers to screen user data for CSAM.
Privacy and human rights advocates strongly oppose the law, arguing that it threatens freedom of expression and could lead to bans on end-to-end encryption and other privacy protections. But the other side of that argument, Shehan says, is that tech companies currently prioritize the privacy of those distributing CSAM on their platforms over the privacy of the victims.
Even if US lawmakers fail to pass the EARN IT Act, upcoming legislation abroad promises to hold tech platforms accountable for illegal content, including CSAM. The UK’s Online Safety Act and Europe’s Digital Services Act could result in tech giants being fined billions of dollars if they fail to adequately tackle illegal content once the laws come into effect.
The new laws will apply to social media networks, search engines, and video platforms operating in the UK or Europe, meaning US-based companies like Facebook, Apple, and Google will have to comply with them to continue operating in those markets. “There’s a lot of global movement around this,” Shehan says. “There will be a ripple effect all over the world.”
“I would rather not have to legislate,” Farid says. “But we’ve been waiting 20 years for them to find a moral compass, and this is the last resort.”
