‘The Big Wipe’: Inside Facebook’s crackdown in Germany


Days before the federal election in Germany, Facebook took an unprecedented step: removing a number of accounts that were working together to spread COVID-19 misinformation and encourage violent responses to COVID restrictions.

The crackdown, announced on September 16, was the first use of Facebook’s new “coordinated social harm” policy, which is aimed at stopping not state-sponsored disinformation campaigns but otherwise ordinary users mounting an increasingly sophisticated effort to sidestep rules on hate speech or misinformation.

In the case of the German network, around 150 accounts, pages, and groups were linked to the so-called Querdenken movement, a loose coalition that protests quarantine measures in Germany and includes vaccine and mask opponents, conspiracy theorists, and some far-right extremists.

Facebook touted the move as an innovative response to potentially harmful content; far-right commentators denounced it as censorship. But a review of the removed content and the many more Querdenken posts still available reveals that Facebook’s action was modest at best. At worst, critics say it could be a gimmick to counter complaints that it didn’t do enough to stop harmful content.

“This action appears to have been motivated more by Facebook’s desire to show action to policymakers in the days before the election, rather than an overarching effort to serve the public,” said researchers at Reset, a UK-based nonprofit critical of social media’s role in democratic discourse.

Facebook has regularly briefed journalists on accounts it removes under its 2018 policy prohibiting “coordinated inauthentic behavior,” a term it coined to describe groups or individuals working together to mislead others. It has since removed thousands of accounts, most of them what it says were bad actors attempting to interfere in elections and politics in countries around the world.

That policy had limits, however: not all harmful behavior on Facebook is “inauthentic.” Plenty of perfectly authentic groups use social media to incite violence and spread misinformation and hatred. As a result, the company was constrained by its own policy in what it could act against.

But even with the new rule, there’s a problem with takedowns: they don’t make it clear what harmful material is left on Facebook, making it difficult to pinpoint exactly what the social network has accomplished.

Case in point: the Querdenken network. Reset had already been monitoring the accounts removed by Facebook and published a report concluding that only a small portion of Querdenken-related content had been taken down, while many similar posts were allowed to remain.

The dangers of COVID-19 extremism were underscored days after Facebook’s announcement, when a young German gas station worker was shot dead by a man who had refused to wear a mask. The suspect had followed several far-right users on Twitter and had expressed negative views about immigrants and the government.

Facebook initially declined to provide examples of the Querdenken content it removed, but eventually released four posts to the Associated Press that were not markedly different from content still available on Facebook. They included a post falsely claiming that vaccines create new viral variants and another wishing death on police officers who broke up violent protests against COVID restrictions.

Reset’s analysis of the comments that Facebook removed found that many were written by people trying to rebut Querdenken arguments and did not contain misinformation.

Facebook defended its action, saying the account removals were never meant as a blanket ban on Querdenken, but rather a carefully measured response to users working together to violate its rules and spread harmful content.

According to David Agranovich, Facebook’s director of global threat disruption, Facebook plans to improve and expand the use of the new policy going forward.

“It’s a start,” he told the AP on Monday. “This is how we expand our network disruption model to address new and emerging threats.”

Agranovich said the approach tries to strike a balance between allowing divergent views and preventing the spread of harmful content.

According to Cliff Lampe, a professor of information at the University of Michigan who studies social media, the new policy could represent a significant shift in the platform’s ability to confront harmful conversations.

“They’ve tried to squash cockroaches in the past, but there are always more,” he said. “You can spend all day stomping your feet and get nowhere. Going after the networks is a smart attempt.”

Simon Hegelich, a political scientist at the Technical University of Munich, said that while the removal of the Querdenken network may be justified, it should raise questions about Facebook’s role in democratic debates.

Hegelich said Facebook is using Germany as a “trial case” for the new policy.

“Facebook is really interfering with German politics,” Hegelich said. “The COVID situation is one of the biggest problems in the election. They’re probably right that there is a lot of misinformation on these sites, but it’s still a very political issue and Facebook is interfering with it.”

Members of the Querdenken movement reacted angrily to Facebook’s decision, but many also said they were not surprised.

“Massive deletion in progress,” one supporter wrote in a still-active Querdenken Facebook group. “See you on the street.”

___

Klepper reported from Providence, Rhode Island. Associated Press writer Barbara Ortutay contributed to this report from Oakland, California.




