Nick Clegg, Facebook’s vice president for global affairs, told CNN’s State of the Union program on Sunday that the company must introduce new measures in its apps to keep teenagers away from “harmful content.”
The executive also expressed openness to the idea of allowing regulators access to social media algorithms used to amplify content. The statements come as US lawmakers analyze how the social network and its subsidiaries, such as WhatsApp and Instagram, affect the mental health of young people.
The scandal came to light through accusations by a former employee, who provided documents to The Wall Street Journal showing that Facebook shielded celebrities from its content rules and that the company knew Instagram was “toxic” for teenagers. The episode became known as “The Facebook Files” (read more below).
In the CNN interview, Clegg said the algorithms “have to be enforced, if necessary, by regulation, so that people can compare what our systems say they should do with what actually happens.”
“We’re going to introduce something that I think will make a considerable difference, which would be our systems realizing that a teenager is seeing the same content over and over again, and it’s content that might not be conducive to their well-being, and we’re going to encourage them to take a look at other content,” said Clegg.
Last week, even Facebook founder Mark Zuckerberg used the social network to defend against accusations that the site prioritizes profits over its users’ sensitive information and content, harms children and undermines democracy.
The founder said that Facebook is deeply concerned about issues such as “safety, well-being and mental health” and complained about a supposed “false image being painted of the company.”
“At the heart of these accusations [is the idea that we prioritize profit over safety and well-being]. This simply is not true.”
The executive pledged to do more research on the subject and to share it with the public once it is completed.
“Instead of ignoring this, technology companies need to create experiences that meet the needs [of children] while keeping them safe.”
Frances Haugen, former Facebook employee, in an interview with American broadcaster CBS News — Photo: CBS News/60MINUTES via REUTERS
Former Facebook employee Frances Haugen, 37, worked as a product manager at the company and was responsible for election-related projects. She revealed her identity last Sunday (3) in an interview on the American broadcaster CBS News’s program “60 Minutes.”
It was based on the documents she obtained that The Wall Street Journal published reports in mid-September indicating that Facebook shielded celebrities from its content rules, that the company knew Instagram was “toxic” for teenagers, and that its response to employee concerns about human trafficking was often “weak.”
During the CBS News interview, Haugen accused Facebook of “putting profits above safety” and said she had acted to encourage change at the social media giant, not to stir up anger.
“Facebook makes more money when you consume more content. People like to get involved with things that provoke an emotional reaction. And the more you feel angry, the more you interact, the more you consume,” said Haugen.
A computer engineer by training, Haugen has worked for other technology companies such as Google and Pinterest and specializes in algorithms that decide what people see in their feeds. According to her, Facebook is “substantially worse” than anything she had seen before.
Since September, when the practices denounced by Haugen were exposed by the WSJ, Facebook’s shares have fallen about 10%.
Facebook denies accusations
Facebook reacted to the reports in The Wall Street Journal. Nick Clegg, the company’s vice president for global affairs, posted a series of tweets on September 18 pointing to what he called “mischaracterizations” in the stories.
According to him, claims that Facebook deliberately and systematically ignores inconvenient research are “false.” The company also said that the leaked documents were released to the public “without enough context” and decided to publish the materials with “annotations.”
To g1, Facebook said: “Every day, our teams work to protect the ability of billions of people to express themselves openly while keeping our platform a safe and positive place. We continue to make significant improvements to combat misinformation and harmful content on our services. To suggest that we encourage harmful content and do nothing about it is simply not true.”