Facebook knew that algorithm change increased misinformation, says newspaper – 15/09/2021 – Mercado

Facebook’s algorithm change announced in January 2018 increased misinformation, violence and aggressive content on the social network, according to a report published on Wednesday (15) in the Wall Street Journal. The result was unintended, but the company reportedly resisted fixing the problem.

The news is part of a series of reports that the American newspaper is publishing based on internal Big Tech documents to which it says it has gained access.

According to the documents, employees had already warned the company that the change was having the opposite of the intended effect. Internal research reportedly found that toxic, violent and disinformation content was widespread among shared posts.

In 2018, already immersed in controversy over potential harm to users’ mental health and the spread of false news, the company said the purpose of the algorithm change was to increase interactions between family and friends and to reduce passive consumption, which included journalistic content.

The report states that this was not the only reason, however: at the time, Facebook was reportedly facing a period of declining interactions, which would explain the company’s inaction on the research findings.

The employees responsible reportedly proposed a series of measures to reverse the negative effects of the change.

In 2020, one of Facebook’s employees, Anna Stepanov, reportedly wrote to colleagues after a meeting with Mark Zuckerberg, founder and chief executive of the social network, that he was unwilling to approve the proposals if they would reverse the effects of the algorithm change.

A month ago, the company announced that it would give less weight to shares of and comments on political content.

In an interview, Facebook’s vice president of engineering, Lars Backstrom, said that any algorithm can end up promoting content that is harmful or objectionable to users.

“Like any optimization, people will find ways to exploit it and take advantage of it,” he said. “That’s why we have an integrity team that is trying to track them down and figure out how to mitigate them as efficiently as possible.”