SAN FRANCISCO — In 2019, Facebook researchers began a new study on the Like button, one of the main features of the social network.
They examined what people would do if Facebook removed the prominent thumbs-up icon and other emoji reactions from posts on its photo-sharing app, Instagram, according to company documents. The researchers found that the buttons sometimes caused “stress and anxiety” among Instagram’s youngest users, especially if posts didn’t get enough Likes from friends.
But the researchers also discovered that when the Like button was hidden, users interacted less with posts and ads. At the same time, hiding it did not alleviate teenagers’ social anxiety, and younger users did not share more photos as the company had thought they might, leading to mixed results.
Mark Zuckerberg, Facebook’s chief executive, and other executives discussed hiding the Like button for more Instagram users, according to the documents. In the end, a larger test was rolled out in only a limited capacity, in part to “create a positive press narrative” around Instagram.
The research on the Like button was an example of how Facebook has questioned the core features of its social network. As the company has confronted crisis after crisis over misinformation, privacy and hate speech, a central issue has been whether the basic way the platform works has been at fault: essentially, the features that make Facebook Facebook.
Alongside the Like button, Facebook also scrutinized its share button, which lets users instantly disseminate content posted by others; its groups feature, which is used to form digital communities; and other tools that define how more than 3.5 billion people behave and interact online. The research, laid out in thousands of pages of internal documents, underlines how the company has repeatedly grappled with what it created.
What the researchers found was often far from positive. Again and again, they determined that people were misusing key features or that those features were amplifying toxic content, among other effects. In an August 2019 internal memo, several researchers said it was Facebook’s “core product mechanics,” meaning the fundamentals of how the product works, that had allowed misinformation and hate speech to flourish on the site.
“The mechanics of our platform are not neutral,” they concluded.
The documents, which include slide decks, internal discussion threads, charts, memos and presentations, do not show what actions Facebook took after receiving the findings. In recent years, the company has changed some features, making it easier for people to hide posts they do not want to see and turning off political group recommendations to reduce the spread of misinformation.
However, the basic operation of Facebook, a network where information can spread rapidly and people can accumulate friends, followers and Likes, ultimately remains largely unchanged.
Some current and former executives said that many significant changes to the social network had been blocked in the service of growth and of keeping users engaged. Facebook is worth more than $900 billion.
“There’s a gap between being able to have pretty frank conversations inside Facebook as an employee,” said Brian Boland, a Facebook vice president who left last year. “Actually getting change made can be much more difficult.”
The company documents are part of the Facebook Papers, a cache provided to the Securities and Exchange Commission and to Congress by a lawyer representing Frances Haugen, a former Facebook employee who became a whistle-blower. Ms. Haugen earlier gave the documents to The Wall Street Journal. This month, a congressional staff member supplied the redacted disclosures to more than a dozen other news outlets, including The New York Times.
A Facebook spokesperson, Andy Stone, criticized the articles based on the documents and said they were built on a “false premise”.
“Yes, we are a business and we make a profit, but the idea that we do this at the expense of people’s safety or well-being misunderstands where our own business interests lie,” he said. He said Facebook has invested $13 billion in keeping people safe and has hired more than 40,000 people, adding that the company is calling for “updated regulations where democratic governments set industry standards to which we can all adhere.”
In an internal post this month, Mr. Zuckerberg said it was “highly unreasonable” to suggest that the company would give priority to harmful content, because Facebook’s advertisers do not want to buy ads on a platform that spreads hate and misinformation.
“At the most basic level, I think most of us just don’t recognize the false picture of the company that is being painted,” he wrote.
Fundamentals of Success
When Mr. Zuckerberg founded Facebook in his Harvard University dorm room 17 years ago, the site’s mission was to connect people on college campuses and bring them into digital groups with shared interests and locations.
Growth exploded in 2006 when Facebook introduced News Feed, a central stream of photos, videos, and status updates posted by people’s friends. Over time, the company added more features to keep people interested in spending time on the platform.
In 2009, Facebook introduced the Like button. The tiny thumbs-up symbol, a simple indicator of people’s preferences, became one of the social network’s most important features. The company allowed other websites to adopt the Like button so users could share their interests back to their Facebook profiles.
That gave Facebook insight into people’s activities and sentiments outside of its site, so it could better target them with ads. Likes also signaled what users wanted to see more of in their News Feeds, so that they would spend more time on Facebook.
Facebook also added the groups feature, where people join private communication channels to talk about specific interests, and pages, which allow businesses and celebrities to amass large fan bases and broadcast messages to those followers.
Another innovation was the share button that people used to quickly share photos, videos, and messages posted by others to their News Feed or elsewhere. An automatically generated recommendation system also suggested new groups, friends, or pages for people to follow based on their previous online behavior.
But the features had side effects, according to the documents. Some people began using Likes to compare themselves with others. Others used the share button to spread information quickly, so false or misleading content went viral in seconds.
Facebook has said its internal research is conducted partly to identify issues that can be fixed to make its products safer. Adam Mosseri, the head of Instagram, said research into users’ well-being had led to investments in anti-bullying measures on Instagram.
Yet Facebook cannot simply tweak itself into a healthier social network when so many of its problems trace back to core features, said Jane Lytvynenko, a senior fellow at the Harvard Kennedy School’s Shorenstein Center who studies social networks and misinformation.
“When we talk about the Like button, the share button, the News Feed, and their power, we’re essentially talking about the infrastructure on which the network is built,” she said. “The crux of the problem here is the infrastructure itself.”
Self Examination
As Facebook’s researchers dug into how its products worked, alarming results began to pile up.
In a July 2019 study of groups, researchers tracked how members of those communities could be targeted with misinformation. The starting point, the researchers said, was people known as “invite whales,” who sent invitations out to others to join a private group.
These people were effective at getting thousands of others to join new groups, so that communities ballooned almost overnight, the study said. Then, according to the study, the invite whales could spam the groups with posts promoting ethnic violence or other harmful content.
Another 2019 report looked at how some people amassed large followings on their Facebook pages, often using posts about cute animals and other harmless topics. But once a page reached tens of thousands of followers, its founders sold it. The buyers then used the pages to show followers misinformation or politically divisive content, according to the research.
As researchers studied the Like button, executives also considered hiding the feature on Facebook, according to the documents. In September 2019, the company removed Likes from users’ Facebook posts in a small experiment in Australia.
The company wanted to see whether the change would reduce pressure and social comparison among users. That, in turn, might encourage people to post more frequently to the network.
But people did not share more posts after the Like button was removed. Facebook chose not to expand the test, noting that “likes are extremely low on a long list of issues we need to resolve.”
Last year, company researchers also evaluated the share button. In a September 2020 study, a researcher wrote that the button, as well as so-called reshare aggregation units in the News Feed, which automatically create clusters of posts that have already been shared by people’s friends, were “designed to attract attention and encourage engagement.”
But if left unchecked, the researcher said, the features “could serve to amplify bad content and sources,” such as bullying and borderline nudity posts.
That is because the features made people less hesitant to share posts, videos and messages with one another. In fact, users were three times more likely to share any kind of content from the reshare aggregation units, the researcher said.
One post that spread widely this way was an undated message from an account called “Angry Patriot.” The post notified users that people protesting police brutality were “targeting a police station” in Portland, Ore. After it was shared through reshare aggregation units, hundreds of hate-filled comments poured in. It was an example of “hate bait,” the researcher said.
A common thread in the documents was how Facebook employees argued for changes in the way the social network worked, and often blamed executives for standing in the way.
In an August 2020 internal post, a Facebook researcher criticized the recommendations system that suggests pages and groups for people to follow, saying it “can very quickly lead users down the path toward conspiracy theories and groups.”
“Out of fears over potential public and policy stakeholder responses, we are knowingly exposing users to risks of integrity harms,” the researcher wrote. “During the time that we’ve hesitated, I’ve seen folks from my hometown go further and further down the rabbit hole” of conspiracy theory movements such as QAnon and anti-vaccine and Covid-19 conspiracies.
“It was painful to observe,” the researcher added.
Davey Alba, Sheera Frenkel, Cecilia Kang and Ryan Mac contributed to the reporting.