Whistleblower Haugen says Facebook is making online hate worse


LONDON (AP) – Facebook whistleblower Frances Haugen told British lawmakers on Monday that the social media giant is fueling online hatred and extremism, failing to protect children from harmful content and lacking any incentive to fix the problems, giving a boost to efforts by European governments working on tighter regulation of tech companies.

Echoing the testimony she gave to the U.S. Senate this month, her in-person appearance drew intense attention from a British parliamentary committee that is much further along in drafting legislation to curb the power of social media companies.

It came the same day Facebook was due to announce its latest earnings and as The Associated Press and other news outlets began publishing stories based on thousands of pages of internal documents she obtained.

Haugen told the committee of UK lawmakers that Facebook Groups are fueling online hate and that algorithms that prioritize engagement take people with mainstream interests and push them to the extremes. The former Facebook data scientist said the company could add moderators to prevent groups above a certain size from being used to spread extremist views.

“Unquestionably, it’s making hate worse,” she said.

Haugen said she was “shocked to hear recently that Facebook wants to double down on the metaverse and that they are going to hire 10,000 engineers in Europe to work on the metaverse,” referring to the company’s plans for an immersive online world it believes will be the next big internet trend.

“Do you know what we could do about safety if we had another 10,000 engineers?” she said.

Facebook said it wanted regulation for tech companies and was happy with the UK taking the lead.

“While we have rules against harmful content and publish regular transparency reports, we agree that we need regulation for the entire industry so businesses like ours don’t make these decisions on their own,” Facebook said in a statement Monday.

It pointed to its $13 billion (£9.4 billion) investment in safety and security since 2016 and said it had “nearly halved” the amount of hate speech over the last three quarters.

Haugen accused Facebook-owned Instagram of failing to keep children under the minimum user age of 13 from signing up for accounts, saying it did not do enough to protect them from content that, for example, made them feel bad about their bodies.

“Facebook’s own research describes it as an addict’s narrative. The kids say, ‘This makes me miserable, I feel like I have no ability to control my use, and I feel like I’ll be ostracized if I leave,’” she said.

Last month, the company delayed plans for a kids’ version of Instagram, aimed at those under 13, to address concerns about the vulnerability of younger users. Haugen said she worried it may not be possible to make Instagram safe for a 14-year-old, and that “I sincerely doubt it can be made safe for a 10-year-old.”

She also said that Facebook’s moderation systems are worse at catching content in languages other than English, which is a problem even in the UK because it is a diverse country.

“These people also live in the UK and are being fed dangerous, radicalizing misinformation,” Haugen said. “And therefore, language-based coverage is a national security issue, not just a good thing for individuals.”

Pressed on whether she believes Facebook is fundamentally bad, Haugen demurred, saying, “I can’t see into people’s hearts.” Facebook is not bad, but negligent, she suggested.

“It believes in flatness, and it won’t accept the consequences of its actions,” she said, pointing to the company’s huge one-story, open-plan corporate office as the epitome of that philosophy.

She also claimed there is a culture at Facebook that discourages rank-and-file employees from bringing their concerns to top executives. Adding to that cultural problem, for many at the company, including CEO Mark Zuckerberg, it is the only place they have ever worked, she said.

It was the second time Haugen has appeared before lawmakers. In the U.S., she testified about the danger she says the company poses, from harming children to inciting political violence to fueling misinformation. Haugen cited internal research documents she secretly copied before leaving her job in Facebook’s civic integrity unit.

The documents, which Haugen provided to the U.S. Securities and Exchange Commission, allege that Facebook prioritizes profit over safety and hides its own research from investors and the public. Some stories based on the files have already been published, revealing how Facebook dithered over curbing divisive content in India and was caught off guard by the January 6 U.S. Capitol riot, among other findings.

Representatives from Facebook and other social media companies plan to speak with the British committee on Thursday.

UK lawmakers are drafting an online safety bill calling for a regulator to be set up to hold companies accountable for removing harmful or illegal content, such as terrorist material or images of child sexual abuse, from their platforms.

“This is a moment like Cambridge Analytica, but possibly bigger, because I think it opens a real window into the psyche of these companies,” said Damian Collins, the MP who chairs the committee, ahead of the hearing.

He was referring to the 2018 scandal involving data-mining firm Cambridge Analytica, which gathered information on 87 million Facebook users without their consent.

Haugen is scheduled to meet next month with European Union officials in Brussels, where the bloc’s executive commission is updating its digital rulebook to better protect internet users by holding online companies more accountable for illegal or dangerous content.

Under the UK rules, which are expected to take effect next year, Silicon Valley giants face a maximum penalty of up to 10% of their global revenue for any violations. The EU is proposing a similar penalty.


Copyright © 2021 Washington Times, LLC.


