Frances Haugen was a product manager with Facebook for two years before she became disillusioned with the social media behemoth. On her way out the door, she combed through the company’s internal social network and left with a bevy of bombshell documents. In doing so, she may have finally given Congress a road map to end Facebook’s total lack of accountability.
The documents Haugen absconded with became the basis for The Wall Street Journal’s “Facebook Files” series, which details how the company has long known about the harm its platforms can cause to people and our social fabric. After weeks of secrecy, Haugen revealed herself as the Facebook whistleblower Sunday on CBS’ “60 Minutes.” On Tuesday, she testified at a Senate hearing — and made me more hopeful than I have ever been that our lawmakers might be up to the task of regulating social media.
Most of what Haugen told the Senate Commerce Committee’s subcommittee on consumer protection has been detailed in the media already. She confirmed that Instagram, which Facebook acquired in 2012, was well aware that its platform was “toxic for many teen girls.” She repeatedly noted that Facebook has devoted minimal resources to countering the use of its platforms to incite ethnic violence in Ethiopia and other developing countries.
But the most important message she gave the subcommittee can be summed up like this: Forget the content; focus on the algorithms.
For too long, the question of what to do about Facebook has been framed as a choice between limiting free speech or letting violent rhetoric spread unchecked. Instead, Haugen argues, the answer lies in stopping Facebook’s practice of letting computers decide what people want to see.
As it stands, Facebook’s primary algorithm uses “engagement-based ranking” to help determine what pops up in your news feed. In other words, if you like, comment on or share a piece of content, artificial intelligence programs pick up on what makes that content special and find things they think are similar to show you.
In 2018, the company shifted the news feed’s algorithm to focus on what it called “meaningful social interaction” — downplaying news articles and boosting the number of posts from friends, family members and like-minded users at the top of people’s feeds. The idea was to calm things down after the tumult of the 2016 election. The result, as BuzzFeed co-founder Jonah Peretti noted in an email to Facebook, was that Facebook became a demonstrably angrier place, where the worst content bubbled to the top and was shared more aggressively:
Company researchers discovered that publishers and political parties were reorienting their posts toward outrage and sensationalism. That tactic produced high levels of comments and reactions that translated into success on Facebook.
“Our approach has had unhealthy side effects on important slices of public content, such as politics and news,” wrote a team of data scientists, flagging Mr. Peretti’s complaints, in a memo reviewed by the Journal. “This is an increasing liability,” one of them wrote in a later memo.
The point of the current formula is to get people to stay on the site longer by showing users content that Facebook already knows they’ll engage with, whether it’s from a friend or a former schoolmate or an influencer with tens of thousands of followers. “It’s not even for you. It’s so you will give little hits of dopamine to your friends so they create more content,” Haugen explained to the committee.
Facebook co-founder, CEO and Chairman Mark Zuckerberg is aware of all of these factors. And as the owner of almost 58 percent of Facebook’s voting shares, he is uniquely positioned to foster changes in the system he built. Instead, Haugen argued, the company has been fixated on metrics and short-term growth rather than the big-picture aftereffects of its actions.
Unlike many advocates, though, she doesn’t believe Facebook should be broken up. That would just leave the same algorithms in charge of what people see, with less money to devote to actually solving the problems the company faces. She also doesn’t agree with conservatives who would strip Facebook of its liability shield under Section 230 of the 1996 Communications Decency Act.
When asked what she would recommend, Haugen suggested amending Section 230 to make Facebook liable for its algorithms, including any that promote hateful or misleading speech to its users. As part of an expanded oversight capacity, she suggested specific research that Congress could demand from the company. Haugen also pitched the senators on a regulatory body that could adjudicate Facebook’s actions and called for the company to make its internal data public by default for independent scholars and academics to review.
Surprisingly, the senators seemed not only receptive to her suggestions but also much more engaged with the issue than usual. They asked pointed questions about algorithmic tweaks and the ways Facebook’s executives made those decisions. When Sen. Ted Cruz, R-Texas, tried to pivot to political censorship, he found himself disarmed by Haugen’s responses. “A lot of the things I advocate for are for changing the mechanisms of amplification, not picking and choosing winners and losers in the marketplace of ideas,” she said. Cruz sounded genuinely interested when he asked her to explain what that meant and how to get more transparency from Facebook.
This sudden savviness on Congress’s part couldn’t come at a worse time for Facebook. The New York Times’ Kevin Roose argues that Haugen’s trove revealed the actions of “a company worried that it is losing power and influence, not gaining it, with its own research showing that many of its products aren’t thriving organically.” If legislation targets its “increasingly extreme lengths” to “stop users from abandoning its apps in favor of more compelling alternatives,” that could only multiply Facebook’s problems.
There’s no guarantee that Tuesday’s hearing will be the catalyst for new laws and regulations. And as Monday’s outage showed, Facebook is still integral to the internet’s infrastructure. But thanks to Haugen and the documents she has made public, Congress is finally asking the right questions about how to rein in the power Facebook has accumulated.