How Facebook Couldn’t Prevent Racist Abuse of England’s Football Players


In May 2019, Facebook invited the organizers of English football to its London office in Regent’s Park. On the agenda: what to do about the growing racial harassment on the social network against Black football players.

Two people familiar with the conversation said that at the meeting, Facebook clashed with representatives of England’s four main football organisations: the Football Association, the Premier League, the English Football League and the Professional Footballers’ Association. Company executives told the group they had many other issues to deal with, including content related to terrorism and child sexual abuse.

A few months later, Facebook sent the football representatives an athlete safety guide, with instructions on how players could use the company’s tools to protect themselves from bigotry. The message was clear: it was up to players and clubs to protect themselves online.

The interactions marked the beginning of a more than two-year-long campaign by English football to pressure Facebook and other social media companies to rein in online hate speech against their players. Football officials have met with the platforms multiple times since then, sending an open letter calling for change and organizing social media boycotts. Facebook employees joined in, demanding that it do more to stop the harassment.

The pressure intensified after the European Championship final last month, when England’s three Black players were subjected to a flood of racist slurs on social media for missing penalty kicks in the decisive shootout. Prince William denounced the hatred, and Boris Johnson, the UK prime minister, threatened regulation and fines for companies that continue to permit racist abuse. Inside Facebook, the incident was escalated to a “Site Event 1,” the equivalent of a company-wide five-alarm fire.

Yet as England’s top league, the Premier League, opens its season on Friday, football officials said social media companies – especially Facebook, the biggest of them – were still not taking the issue seriously enough, and players were steeling themselves for more online hate.

“Football is a growing global market that includes clubs, brands, sponsors and fans who are fed up with the tech giants’ apparent reluctance to develop in-platform solutions to the problems we deal with every day,” said Simone Pound, head of equality, diversity and inclusion at the Professional Footballers’ Association, the players’ union.

English football’s impasse with Facebook is another example of the company’s inability to solve speech problems on its platform, even after being made aware of the extent of the abuse. While Facebook has introduced some measures to reduce harassment, football officials said they were insufficient.

Sanjay Bhandari, president of Kick It Out, an organization that promotes equality in football, said social media companies weren’t doing enough “because the pain wasn’t enough for them”.

This season, Facebook is trying again. Instagram, its photo-sharing app, is expected to roll out new features on Wednesday to make racist material harder to see, according to an internal document obtained by The New York Times. One will let users hide potentially abusive comments and messages from accounts that don’t follow them or that only recently started following them.

“The unfortunate truth is that tackling racism on social media is just as complex as tackling racism in society,” said Karina Newton, head of global public policy at Instagram. “We have made significant strides, many of which have been driven by discussions with abused groups such as the UK football community.”

However, Facebook executives privately admit that racist rhetoric against English footballers will continue. “No one can solve this challenge overnight,” Steve Hatch, Facebook’s director for Britain and Ireland, wrote last month in an internal note reviewed by The Times.

Some players appear resigned to the abuse. Four days after the European Championship final, Bukayo Saka, 19, one of the Black players who missed a penalty for England, wrote on Twitter and Instagram that “powerful platforms are not doing enough to stop these messages,” calling it “a sad reality.”

At the same time, Facebook employees kept flagging the hateful comments under Mr. Saka’s posts to their employer for removal. One reported comment on Instagram – “Sister stay in Africa” – apparently didn’t break the platform’s rules, according to the automated moderation system. It stayed up.

Much of the racist abuse in English football has been directed at stars like Raheem Sterling and Marcus Rashford. About 30 percent of Premier League players are Black, Mr Bhandari said.

Over the years, these players have been harassed in football stadiums and on Facebook, which asks users for their real names, as well as on Instagram and Twitter, which allow users to be anonymous. In April 2019, some players and two former national team captains, David Beckham and Wayne Rooney, fed up with the abuse, joined a 24-hour social media boycott, posting red badges on Instagram, Twitter and Facebook with the hashtag #Enough.

A month later, English football officials had their first meeting with Facebook and came away disappointed. Facebook said that “feedback from the meeting was taken into account and impacted further policy, product and app efforts.”

Tensions rose last year after the murder of George Floyd by the police in Minneapolis. When the Premier League resumed in June 2020 after a 100-day coronavirus hiatus, players from all 20 clubs began each match by taking a knee. The players continued the gesture last season and have said they will kneel this season as well.

The gesture seemed to intensify the online abuse. In January, Mr. Rashford tweeted that the bigoted messages he receives showed “humanity and social media at its worst.” Two of his Manchester United teammates, who are also Black, were targeted on Instagram with dehumanizing monkey emojis after a loss.

Facebook employees took note of the rise in racist speech. In an internal forum meant to flag negative press to the communications department, one employee began cataloging articles about English football players being abused on Facebook’s platforms. By February, the list had grown to about 20 news clips in a single month, according to a company document seen by The Times.

English football organizations continued to engage with Facebook. This year, organizers included Twitter in the conversations, creating what is known as the Online Hate Working Group.

But football officials said they were frustrated by the lack of progress. Edleen John, who heads international relations and corporate affairs at the Football Association, English football’s governing body, said there was no indication that the top leaders of Facebook and Twitter were even aware of the abuse. She and others began discussing an open letter to Mark Zuckerberg and Jack Dorsey, the chief executives of Facebook and Twitter.

“Why don’t we try to connect and hold meetings with people at the very top of the organization and see if that will make a difference?” Ms. John said in an interview, explaining her thoughts.

In February, senior executives from the Premier League, the Football Association and other groups published a 580-word open letter to Mr. Zuckerberg and Mr. Dorsey accusing them of “inaction” against racial abuse. They asked the companies to block racist and discriminatory content before it could be posted or published. They also pushed for identity verification so that offenders could be rooted out.

But Ms. John said they “did not receive a response” from Mr. Zuckerberg or Mr. Dorsey. In April, English football organizations, players and brands joined a four-day boycott of social media.

Twitter, which declined to be interviewed, said in a blog post about racism on Tuesday that it was “appalled by those who targeted players from the England football team with racial abuse following the Euro 2020 Final.”

At Facebook, members of the policy team, which sets the rules on what content can stay up, pushed back against the demands of football officials, three people familiar with the conversations said.

They argued that terms or symbols used in racist abuse, such as the monkey emoji, could mean different things depending on context and should not be banned outright. They also argued that identity verification could undermine anonymity on Instagram and create new problems for users.

In April, Instagram announced a privacy setting called Hidden Words to automatically filter out messages and comments containing offensive words and emojis. Filtered comments are hidden so that the account’s owner and followers do not easily see them. A month later, Instagram began a test that allowed some of its users in the United States, South Africa, Brazil, Australia and the UK to flag “racist language or activity,” according to documents reviewed by The Times.

The test produced hundreds of reports. An internal spreadsheet summarizing the results included a tab titled “Dehumanization_Monkey/Primate.” It listed more than 30 instances of comments using bigoted terms and emojis of monkeys, gorillas and bananas in connection with Black people.

In the hours after England lost the European Championship final to Italy on 11 July, racist comments against the players who missed penalties – Mr Saka, Mr Rashford and Jadon Sancho – escalated. That set off a “Site Event 1” at Facebook, the kind of emergency the company usually reserves for a major system outage.

Facebook employees rushed to internal forums to say they were reporting monkey emojis and other derogatory stereotypes. Some asked whether they could volunteer to help moderate comments on high-profile accounts.

“We get this absolute flow of bile every game, and it gets worse when a black person misses,” one employee wrote on an internal forum.

But employees’ reports of racist speech were frequently met with automated messages saying the posts didn’t violate the company’s guidelines. Managers also circulated talking points instructing employees to say that Facebook was working “to quickly remove comments and accounts that drive harassment against England’s football players.”

In an internal comment, Jerry Newman, Facebook’s director of sports partnerships for Europe, the Middle East and Africa, reminded employees that the company had introduced the Hidden Words feature so users could filter out offensive words or symbols. Using the feature, he wrote, was the players’ responsibility.

“Ultimately, it’s their responsibility to go into Instagram and enter which emojis/words they don’t want to see,” Mr. Newman wrote.

Other Facebook executives noted that monkey emojis are not typically used negatively. If the company filtered certain terms for everyone, they added, people could miss important messages.

Adam Mosseri, the head of Instagram, later tweeted that the platform could do better. Replying to a BBC reporter, he said the app had “accidentally” marked some racist comments as “benign.”

But Facebook also defended itself in a blog post, saying it had removed 25 million pieces of hate content in the first three months of the year, and that Instagram had taken action on 6.3 million pieces, 93 percent of them before a user reported it.

Kelly Hogarth, who helps manage Mr. Rashford’s off-field activities, said the player had no plans to quit social media, which remains an important channel to his fans. Still, she questioned how much of the burden of monitoring abuse should fall on athletes.

“At what point does the responsibility come off the player?” she asked. “I would have no illusions that next season we would be in exactly the same place, having exactly the same conversation,” she added.

