Mark Zuckerberg, founder and CEO of Facebook
Frances Haugen’s testimony before the US Senate on Tuesday marked a turning point in Washington’s relationship with Facebook. Haugen, who worked on the problem of civic misinformation at the platform, didn’t tell the politicians anything they didn’t already know. What she brought were clues about the way forward on regulation. And that has immense value. From her testimony, at least two very important conclusions emerge.
The first is that Haugen helped Republicans and Democrats find a common path. On paper, she appeared as a whistleblower. Within the company, she held a management position responsible for understanding, and offering solutions to, the problems that Facebook causes in society.
The issues have been mapped out and are not new. Journalists who cover misinformation, sociologists investigating the impact of digital life on people, and political scientists who study the intersection of radicalism and the internet already knew them well and in depth. The difference is that now we know the same conclusions were being drawn inside Facebook.
We also know that Facebook knows how to at least partially solve these problems. Take the way the networks incite political hatred, for example. Or their heavy impact on teenage girls’ self-image. The solution, however, would come at a cost to growth. And that, Mark Zuckerberg does not want. Any solution has to come at no cost to Face’s growth.
But real problems are one thing; politicians’ problems are quite another. Democrats and Republicans both see Facebook as a problem. For Democrats, hatred and misinformation are what need to be resolved. For Republicans, the issue is different: censorship of right-wing voices on the networks. The result is that they don’t understand each other and, thus, no regulation gets passed.
Haugen offered the way out: transparency. Companies like Facebook would have to make their algorithms transparent. Why did this post reach you and not another one? What criteria did the program follow in choosing what we see more or less often?
For the first time in a long time, something like this was seen in the US Senate: Republicans and Democrats following the same line of inquiry, interested in the same angles, realizing that something new had emerged. Algorithmic transparency, letting everyone understand how the algorithms reach their conclusions, solves the problem for both groups.
In fact, the potential of laws enforcing transparency could be even greater. If the explanation is clear, those who use the networks will understand how the platforms work to keep us glazed over, reaching deep into our brains for the unconscious triggers that produce something similar to chemical dependency. The impact on their image would be so great that the companies themselves would have to compromise. To let go. To allow us to reclaim time in our lives away from the addictive little screens, and away from this constant state of irritation with everyone who is different, who thinks differently.
But there is a second inevitable conclusion. For the good of the company, Facebook’s founder has to leave. The drive to grow that Mark Zuckerberg imposed in the early years of Face’s life created a formidable and powerful company. Zuck is brilliant. But the kind of CEO a company needs when it has 3.5 billion customers is different from the kind a startup needs.
The problems Facebook is causing in the world will impose a heavy toll on its image for years to come. Bill Gates was slow to understand that he had to leave Microsoft, but it was good for the company he created. Zuck hurts Face in the long run.
*The author is a journalist