Frances Haugen says Facebook’s algorithms are dangerous. Here’s why.


In her testimony, Haugen repeatedly emphasized how these phenomena are worse in regions that don't speak English, because Facebook's coverage of different languages is uneven.

“In the case of Ethiopia, there are 100 million people and six languages, and Facebook only supports two of those languages for its integrity systems,” she said, describing countries that have no integrity systems in their own vernacular at all. “This strategy of focusing on language-specific, content-specific systems for AI to save us is doomed to fail.”

“Investing in non-content-based ways to slow the platform down not only protects our freedom of speech, it protects people’s lives.”

I explored these limitations further in a different article from earlier this year on large language models, or LLMs:

Despite LLMs having these linguistic deficiencies, Facebook relies heavily on them to automate its content moderation globally. When the war in Tigray[, Ethiopia] first broke out in November, [AI ethics researcher Timnit] Gebru saw the platform flounder to get a handle on the flurry of misinformation. This is emblematic of a persistent pattern that researchers have observed in content moderation: communities that speak languages not prioritized by Silicon Valley suffer the most hostile digital environments.

The damage doesn’t end there, either, Gebru noted. When fake news, hate speech, and even death threats aren’t moderated out, they are then scraped as training data to build the next generation of LLMs. And those models, parroting what they were trained on, end up regurgitating these toxic linguistic patterns back onto the internet.

How does Facebook’s content ranking relate to youth mental health?

One of the more shocking revelations from the Journal’s Facebook Files was Instagram’s internal research finding that its platform was worsening the mental health of teenage girls. “Thirty-two percent of teenage girls said that when they feel bad about their bodies, Instagram makes them feel worse,” the researchers wrote in a slide presentation from March 2020.

Haugen also attributes this phenomenon to engagement-based ranking systems, which she told the Senate are “resulting in greater exposure of youth to anorexia content.”

“If Instagram is such a positive force, have we seen a golden age of teen mental health in the last 10 years? No, we’ve seen increased rates of suicide and depression among young people,” she continued. “There is a large body of research supporting the idea that social media use increases the risk of these mental health harms.”

In my own reporting, I heard from a former AI researcher who saw this effect extend to Facebook as well.

The researcher’s team found that… users who tended to post or engage with melancholy content (a possible sign of depression) could easily spiral into consuming increasingly negative material that risked further worsening their mental health.

But as with Haugen, the researcher found that leadership was not interested in making fundamental algorithmic changes.

The team proposed tweaking the content-ranking model for these users so that it would stop simply maximizing engagement, and less of the depressing material would be shown to them. “The question for leadership was: If we notice that someone is in a vulnerable state of mind, should we still be optimizing for engagement?” he remembers.
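Facebook’s ranking code isn’t public and the researcher doesn’t describe the exact change, but the kind of tweak the team proposed can be pictured roughly as changing the objective the ranker sorts by. A minimal sketch, with entirely hypothetical fields (`predicted_engagement`, `negativity`) and a made-up vulnerability flag, not Facebook’s actual system:

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_engagement: float  # model's estimate of likes/comments/reshares
    negativity: float            # 0.0 (neutral) to 1.0 (very negative or melancholy)

def rank_feed(posts: list[Post], user_is_vulnerable: bool) -> list[Post]:
    """Order candidate posts for a user's feed.

    For most users this is pure engagement maximization; for users flagged as
    vulnerable, negative content is down-weighted so the objective is no longer
    engagement alone -- the kind of tweak the team proposed.
    """
    def score(post: Post) -> float:
        if user_is_vulnerable:
            # Hypothetical penalty: trade some predicted engagement
            # for showing less negative material.
            return post.predicted_engagement * (1.0 - 0.8 * post.negativity)
        return post.predicted_engagement

    return sorted(posts, key=score, reverse=True)
```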

But anything that reduced engagement, even for reasons such as not exacerbating someone’s depression, led to a lot of hemming and hawing among leadership. With performance reviews and salaries tied to the successful completion of projects, employees quickly learned to drop the proposals that received pushback and keep working on whatever was dictated from the top down.

Meanwhile, the former employee no longer lets her daughter use Facebook.

How can we fix this?

Haugen opposes breaking up Facebook or repealing Section 230 of the US Communications Decency Act, which shields technology platforms from liability for the content they distribute.

Instead, she recommends carving out a more targeted exemption from Section 230 for algorithmic ranking, which she argues would “get rid of engagement-based ranking.” She also advocates a return to Facebook’s chronological news feed.
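In ranking terms, the distinction Haugen is drawing comes down to the sort key: a chronological feed orders posts by timestamp, while engagement-based ranking orders them by a model’s predicted-engagement score. A minimal sketch with made-up post fields, not Facebook’s actual system:

```python
from datetime import datetime, timezone

# Toy candidate posts with hypothetical fields: a timestamp and a model score.
posts = [
    {"id": "a", "created_at": datetime(2021, 10, 5, 9, 0, tzinfo=timezone.utc),
     "predicted_engagement": 0.2},
    {"id": "b", "created_at": datetime(2021, 10, 5, 8, 0, tzinfo=timezone.utc),
     "predicted_engagement": 0.9},
]

# Chronological feed: newest first, no prediction model in the loop.
chronological = sorted(posts, key=lambda p: p["created_at"], reverse=True)

# Engagement-based ranking: whatever the model scores highest comes first,
# regardless of when it was posted.
engagement_ranked = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

print([p["id"] for p in chronological])      # ['a', 'b']
print([p["id"] for p in engagement_ranked])  # ['b', 'a']
```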

