For weeks, Facebook has been rocked by revelations that sparked a storm of criticism from lawmakers, regulators and the public.
Reports by The Wall Street Journal, based on internal research documents provided by a whistleblower, have put Facebook under the spotlight. Among other findings, these reports showed that Facebook knew Instagram was exacerbating body image issues among young people.
The whistleblower goes public.
The whistleblower, Frances Haugen, revealed her identity during an interview on “60 Minutes” in early October. On October 5, Ms. Haugen testified before a Senate subcommittee for more than three hours. She said Facebook knowingly concealed disturbing research showing that teenagers feel worse about themselves after using its products, and that the company was willing to tolerate hateful content on its site to keep users coming back. In her statement, she urged lawmakers to demand more documents and internal research, suggesting that the materials she had provided were just the tip of the iceberg.
After Ms. Haugen testified, Facebook executives publicly questioned her credibility and dismissed her accusations. Internally, however, they worked to frame their position in a way that would preserve the goodwill of the company’s more than 63,000 employees and address their concerns.
Leaked documents reveal internal struggles.
Reporters have since obtained more internal documents from the company, which owns Instagram and WhatsApp in addition to the core Facebook social network. Some of those documents, for example, detail Instagram’s efforts around retention, engagement and attracting young users.
Other documents raise questions about Facebook’s role in election misinformation and the pro-Trump attack on the Capitol on January 6. The documents show how aware Facebook was of extremist movements and groups trying to polarize American voters ahead of the election. According to the documents, employees believed Facebook could have done more.
The challenges are global.
The problems are even bigger in India, Facebook’s largest market, where internal documents show the company struggling to combat misinformation, hate speech and celebrations of violence. Dozens of studies and memos written by Facebook employees provide clear evidence of one of the most serious criticisms leveled at the company worldwide by human rights activists and politicians: that it enters a country without fully understanding its potential impact on local culture and politics, and fails to deploy adequate resources until after problems arise.
The latest revelations, published Monday morning, show internal research undermining the features at the heart of the social network Facebook built: “likes” and sharing. According to the documents, researchers repeatedly determined that, among other effects, people abuse those key features and that the features amplify toxic content. In an August 2019 internal memo, several researchers said it was Facebook’s “core product mechanics” – that is, the fundamentals of how the product works – that allowed misinformation and hate speech to flourish on the site.
Without government-mandated transparency, Facebook can present a false picture of its efforts to tackle hate speech and other extremist content, Ms. Haugen told the British Parliament. The company says its AI software catches more than 90 percent of hate speech, but Ms. Haugen said the figure is less than 5 percent.
“They’re very good at dancing with data,” she said.