Whistleblower to Accuse Facebook of Contributing to Jan. 6 Riot, Memo Says

“We will continue to face scrutiny – some of it fair and some of it unfair,” the memo said. “But we must also continue to hold our heads high.”

Here is Mr. Clegg’s memo in full:

POLARIZATION AND OUR ROLE IN ELECTIONS

You will have seen the series of articles about us published in the Wall Street Journal in recent days, and the public interest they have generated. This Sunday night, the former employee who leaked internal company material to the Journal will appear in a segment on 60 Minutes on CBS. We understand the segment is likely to assert that we contribute to polarization in the United States, and that the extraordinary steps we took for the 2020 election were relaxed too soon and contributed to the horrific events of January 6 in the Capitol.

I know some of you – especially those of you in the US – are going to get questions from friends and family about these things, so I wanted to take a moment as we head into the weekend to provide what I hope is some useful context on our work in these important areas.

Facebook and Polarization

People are understandably anxious about the divisions in society and looking for answers, and for ways to fix the problems. Social media has had a big impact on society in recent years, and Facebook is often a place where much of this debate plays out. So it’s natural for people to ask whether it is part of the problem. But the idea that Facebook is the chief cause of polarization is not supported by the facts – as Chris and Pratiti set out in their note on the issue earlier this year.

The rise of polarization has been the subject of serious academic research in recent years. The reality is that there is not a great deal of consensus. But what evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarization.

The rise in political polarization in the US predates social media by several decades. If it were true that Facebook is the chief cause of polarization, we would expect to see polarization rise wherever Facebook is popular. It isn’t. In fact, polarization has declined in a number of countries with high social media use at the same time that it has risen in the US.

Specifically, we expect the segment to suggest that a change to Facebook’s News Feed ranking algorithm was responsible for elevating polarizing content on the platform. In January 2018, we made ranking changes to promote Meaningful Social Interactions (MSI), so that you would see more content in your News Feed from friends, family, and groups you are part of. This change was heavily driven by internal and external research showing that meaningful engagement with friends and family on our platform was better for people’s wellbeing, and, as we do with any ranking metric, we have refined and improved it over time. Of course, everyone has a rogue uncle or an old classmate who holds strong or extreme views we disagree with – that’s life – and the change meant you were more likely to come across their posts too. Even so, we have developed industry-leading tools to remove hateful content and reduce the distribution of problematic content. As a result, the prevalence of hate speech on our platform is now down to about 0.05%.

But the simple fact is that changes to algorithmic ranking systems on a social media platform cannot explain broader social polarization. Indeed, polarizing content and misinformation also exist on platforms that lack any algorithmic ranking, including private messaging apps like iMessage and WhatsApp.

Elections and Democracy

Perhaps there is no issue on which we as a company have been more vocal than our work to dramatically change our approach to elections. Beginning in 2017, we started building new defenses, bringing in new expertise, and strengthening our policies to prevent interference. Today, we have more than 40,000 people across the company working on safety and security.

Since 2017, we have disrupted and removed more than 150 covert influence operations, including ahead of major democratic elections. In 2020 alone, we removed more than 5 billion fake accounts, identifying almost all of them before anyone flagged them to us. And from March through Election Day, we removed more than 265,000 pieces of content on Facebook and Instagram in the US for violating our voter interference policies.

Given the extraordinary circumstances of holding a contentious election in a pandemic, we implemented – and spoke publicly about – so-called “break glass” measures before and after Election Day to respond to the specific and unusual signals we were seeing on our platform, and to keep potentially violating content from spreading before our content reviewers could assess it against our policies.

These measures were not without trade-offs – they are blunt instruments designed to deal with specific crisis scenarios. It’s like shutting down an entire town’s roads and highways in response to a temporary threat that may be lurking somewhere in a particular neighborhood. In implementing them, we know we affected significant amounts of content that did not violate our rules, in order to prioritize people’s safety during a period of extreme uncertainty. For example, we limited the distribution of live videos that our systems predicted might relate to the election. That was an extreme step that helped prevent potentially violating content from going viral, but it also affected a lot of entirely normal and reasonable content, including some that had nothing to do with the election. We would not take this kind of crude, catch-all measure in normal circumstances, but these were not normal circumstances.

We rolled back these emergency measures – based on careful data-driven analysis – when we saw a return to more normal conditions. We left some of them in place for a longer period, through February of this year, and have decided to keep others permanently, such as not recommending civic, political, or new Groups.

Combating Hate Groups and Other Dangerous Organizations

I want to be absolutely clear: we work to limit hate speech, not amplify it, and we have clear policies prohibiting content that incites violence. We do not profit from polarization; in fact, just the opposite. We do not allow dangerous organizations, including militarized social movements or violence-inducing conspiracy networks, to organize on our platforms. And we remove content that praises or supports hate groups, terrorist organizations, and criminal groups.

We’ve been more aggressive than any other internet company in combating harmful content, including content that sought to delegitimize the outcome of the election. But our work to take down these hate groups goes back years. We removed tens of thousands of QAnon pages, groups, and accounts from our apps, removed the original #StopTheSteal Group, and removed references to Stop the Steal in the run-up to the inauguration. In 2020 alone, we removed more than 30 million pieces of content violating our policies on terrorism and more than 19 million pieces of content violating our policies on organized hate. We designated the Proud Boys as a hate organization in 2018, and we continue to remove praise, support, and representation of them. Between August of last year and January 12 of this year, we identified nearly 900 militia organizations under our Dangerous Organizations and Individuals policy and removed thousands of Pages, Groups, events, Facebook profiles, and Instagram accounts associated with these groups.

This work will never be finished. There will always be new threats and new problems to address, in the US and around the world. That is why we remain vigilant and alert – and always will have to be.

That is also why the suggestion that the violent insurrection on January 6 would not have happened but for social media is so misleading. To be clear, responsibility for those events lies squarely with the perpetrators of the violence and with those, in politics and elsewhere, who actively encouraged them. Mature democracies in which social media is widely used hold elections all the time – for instance, Germany’s election last week – without the disfiguring presence of violence. We actively share with law enforcement whatever material related to these traumatic events we can find on our services. But reducing the complex causes of polarization in America – or of the insurrection specifically – to a technological explanation is woefully simplistic.

We will continue to face scrutiny – some of it fair and some of it unfair. We will continue to be asked difficult questions. And many people will continue to be skeptical of our motives. That is what comes with being part of a company that has a significant impact in the world. We need to be humble enough to accept criticism when it is fair, and to make changes where they are justified. We aren’t perfect and we don’t have all the answers. That is why we do the sort of research that has been the subject of these stories in the first place. And we will keep looking for ways to respond to the feedback we hear from our users, including testing ways to make sure political content doesn’t take over their News Feeds.

But we must also continue to hold our heads high. You and your teams do incredible work. Our tools and products have a hugely positive impact on the world and in people’s lives. And you have every reason to be proud of that work.
