‘Facebook prioritized growth over security,’ says former employee

Frances Haugen worked as a product manager on Facebook’s civic disinformation team

The day after the outage that took down Facebook’s apps, including Instagram and WhatsApp, was no less tense for Mark Zuckerberg’s company. This Tuesday, the 5th, Frances Haugen, a former Facebook employee, testified at a hearing in the United States Senate, where she explained the company’s logic of prioritizing growth over security and pointed to regulatory paths for social networks. The testimony comes after a series of reports revealed the company’s negligence in content moderation, in what is shaping up to be Facebook’s biggest crisis since the Cambridge Analytica scandal in 2018.

Frances, who worked as a product manager on Facebook’s Civic Integrity team, was responsible for bringing to light the internal Facebook research showing that the company has neglected content moderation on its platforms. The documents were revealed in recent weeks by the American newspaper The Wall Street Journal and were also sent to the Securities and Exchange Commission (SEC), the body that regulates listed companies in the United States.

Part of the complaints involves Facebook’s relationship with children and teenagers. One of the leaked surveys found that 1 in 3 girls who felt bad about their bodies felt even worse after accessing Instagram. This Tuesday’s hearing, titled “Protecting Kids Online”, focused precisely on the company’s findings about Instagram’s effect on young users.

Last week, the same US Senate subcommittee heard Antigone Davis, the social network’s head of safety, who said the survey results had been misinterpreted. Marsha Blackburn, a Republican senator, questioned how the Facebook executive had downplayed the importance of the surveys in her testimony. “I would like to remind you that the surveys were not by third parties, they were internal. They knew what they were doing,” she said at Tuesday’s hearing.

Senators Marsha Blackburn and Richard Blumenthal during hearing on Tuesday

In addition to asking about how Facebook handles data, manages algorithms, and addresses security issues, senators asked practical questions about Frances’ view of US legislation such as Section 230, which sets rules on free speech and moderation of internet content, the Children’s Online Privacy Protection Act and data protection regulations. Section 230, specifically, has become a target for both main US political parties in recent years: Republicans believe the networks bar content excessively, while Democrats point to the platforms’ neglect of the content that circulates on them.

“We will not fix this without the help of Congress,” Frances said at the hearing. “These problems can be resolved. I believe in Facebook’s potential to connect without hurting our democracy. But Facebook won’t make that change on its own. I accepted the personal risk of going public because I believe we still have time to act.”

The former employee’s suggestions could become the first step towards tougher regulation of technology companies – the United States still has no privacy legislation along the lines of Brazil’s General Data Protection Law (LGPD) or Europe’s General Data Protection Regulation (GDPR).

Among her suggestions for regulation, Frances argued that Facebook should raise its platforms’ minimum age to 16 or 18. Regarding Section 230, she argued that legislation should pay more attention to how algorithms work than to the content on the platforms, since the tech giants have control over those systems. She also proposed creating an independent government agency staffed with experts to study and audit the impact of large digital platforms.

“Adjustments to outdated privacy regulations will not be enough,” Frances said. In her view, a starting point would be to give external researchers full access to the platform’s data: “Based on this, we can build sensible rules and standards to deal with consumer harm, illegal content, data protection, anti-competitive practices, algorithmic systems and much more.”

“Today, no regulator has a menu of solutions to fix Facebook because the company didn’t want them to know enough about what’s causing the problems,” Frances stressed. “Otherwise, there would be no need for a whistleblower.”

Asked about the need to break up Facebook, which has acquired companies such as Instagram and WhatsApp over the last decade, she said she was against such measures. “The problems involve algorithmic design and artificial intelligence,” Frances said, signaling that executives could continue to make the wrong decisions even if Facebook and Instagram were separate companies.

“If Facebook is separated from Instagram, most of the money invested in advertising is likely to go to Instagram, but Facebook will continue to be that Frankenstein that is endangering lives around the world,” she said.


On the 27th, Facebook announced that it was pausing its social network project for children, Instagram Kids, after the fallout from the documents leaked to The Wall Street Journal and pressure from federal regulators. The project, which aims to build a platform specifically for children under 13, will probably remain underway, says Frances, as it is a way to lock in the growth of young users on the networks in the future.

“I would be surprised if they stopped the development of Instagram Kids. To maintain ‘success’, they need to ensure that the next generation is as engaged as the current one, and for that they want to hook these kids early on,” she explained.

During the session, in an attempt to question Frances’ authority on the subject, Andy Stone, one of Facebook’s policy communications directors, posted on Twitter that the former employee had not worked directly on child safety, the surveys or Instagram.

Aware of the tweet, Republican Senator Marsha Blackburn read the message aloud at the hearing and suggested that Stone come forward and give his version of events. “I’ll just say this to Mr Stone: if Facebook wants to discuss its targeting of children and privacy violations, I’m extending an invitation to you to step forward, take an oath and testify before this committee.”

Atmosphere in the US Senate was friendly and receptive to the former Facebook employee

Lena Piesch, another of the company’s policy communications directors, likewise emphasized that Frances’ work at Facebook was outside the areas covered by the study and that the former employee had spent less than two years at the company.

“We don’t agree with Frances’ characterization of many of the issues she testified about. Despite all this, we agree on one thing: it’s time to start creating standard rules for the internet. It’s been 25 years since the rules for the internet were updated, and instead of expecting the industry to make societal decisions that belong to lawmakers, it’s time for Congress to act,” Lena said in a statement.

In an interview with the American channel CNN, Monika Bickert, the platform’s vice president of content policy, said Frances’ allegations that Facebook inflames discussions on its social networks were not true.

“We actually do the opposite. If you look at our transparency center, you can see that we’ve reduced the visibility of artificial engagement and clickbait. Why would we do that? One big reason is the long-term health of our services. We want people to have a good experience.”

At the end of the hearing, Frances promised to hold talks with lawmakers.

War declared on Zuckerberg

Although Mark Zuckerberg has avoided the spotlight of the controversy, senators highlighted the Facebook founder’s role in the company’s conduct regarding the surveys and the neglect of its users. Senator Blumenthal said that the policy of not taking responsibility, not apologizing and not giving explanations is harmful to users, and that since Zuckerberg is the company’s chief executive, he should answer for the company’s actions since the research came to light.

Democratic Senator Edward Markey also addressed a message directly to the head of Facebook, suggesting that the businessman come testify before US senators. “Here’s my message to Mark Zuckerberg: Your time to invade our privacy, promote toxic content and prey on children and teenagers is over.”

In her testimony, Frances also stated that Zuckerberg has more than enough power to be aware of the state of the company’s algorithms, and that he is in a position to approve or veto any decision the company takes to change social media policy or to ignore the findings of internal research.

“There is no one currently holding Mark responsible except himself,” she said.


Unlike the combative hearing with Antigone Davis and previous hearings with the CEOs of tech giants, the mood in the US Senate on Tuesday was friendly and welcoming toward the former Facebook employee.

Senator Edward Markey even called Frances a “heroine”. While Antigone had been interrupted several times last week and pressed to answer “yes or no” questions, senators told the former Facebook employee that “it would be great” if she answered at length. In addition, Congress was interested in Frances’ views on regulatory changes.

Although the documents revealed by the ex-employee encompass several topics linked to content moderation and algorithms, the Senate hearings have been focusing mainly on the topic of children and adolescents on the social network.

The dossier of documents also shows how Facebook has been negligent on issues such as drug trafficking, in addition to allowing celebrities and public figures such as former US President Donald Trump to violate the platform’s rules of use. The material shows that, even under platform rules prohibiting hate speech and attacks, some people are deliberately never subjected to moderator scrutiny.