SumOfUs protest outside the US Capitol in Washington on Thursday, Sept. 30, 2021.

Blowing the whistle on Facebook is just the first step

We know Facebook is hurting individuals and whole societies; but now we know that Facebook knows it, too.  

On Sunday, October 3, the Facebook whistle-blower, whose trove of internal research documents has made global headlines for weeks, revealed her identity on 60 Minutes, the prestigious American television news show. Frances Haugen, who quit her job at Facebook in May 2021, said “the version of Facebook that exists today is tearing our societies apart,” and that Facebook’s senior executives know this but refuse to act. Haugen said she took the documents to prove beyond doubt that Facebook’s research team and senior executives know exactly what damage the company inflicts but choose to prioritize advertising revenue. She said: “Facebook has realized if they make it safer, people spend less time on the site, click on less ads, they make less money.” She brought proof of what many outsiders have said for years: Facebook prioritizes revenue over public safety, and Instagram makes children, especially teenage girls, mentally ill.

Who is Frances Haugen? She is a 37-year-old American data scientist and Harvard Business School graduate who has worked in Big Tech for 15 years, including at Google, Yelp, and Pinterest. At Facebook, where she worked for two years, Haugen led a small team working on counterespionage for civilians targeted by hostile states, such as Taiwanese or Uyghur people being spied on by the Chinese government. Her team of just seven people was expected to provide protection for people around the world, and requests for more resources were refused. But Haugen says her personal trigger for blowing the whistle came when she lost a friend who was radicalized by social media into far-right racism.

From early 2021, Haugen assiduously copied tens of thousands of pages of Facebook’s own research about the harms it creates. She has worked with the Wall Street Journal, which reported on Instagram’s harms to teenage girls in September, and will testify to the US Senate Commerce Subcommittee later this week. Just as importantly, Haugen has filed several complaints with the US Securities and Exchange Commission, the regulator that oversees publicly traded companies. She alleges that Facebook is lying to the public and to shareholders when it says it’s making progress on how it deals with disinformation and hate. One internal Facebook study said “we estimate that we may action as little as three to five percent of hate… and 0.6 percent of violence and incitement on Facebook.” Facebook replied a full two weeks after the Wall Street Journal’s reports based on Haugen’s leaked documents, claiming its own research was unreliable. (To be fair, one of the Instagram studies had a tiny sample size of just 40 teenagers.) But it did not address many of the factual claims made about the real, ongoing and fully known harms Facebook is perpetrating against children, and against democracy itself.

Of course, none of the claims in Haugen’s leaks are new. Independent researchers have pointed out for years that Facebook profits from extremism and hate. Facebook’s 2018 change to the algorithm that decides which content users see was designed to monetize even more aggressively the fact that anger drives clicks. Another whistle-blower, Sophie Zhang, said in April 2021 that Facebook systematically under-resources the teams working to counter state manipulation of the platform by autocrats around the world. For the US presidential election, election integrity was a priority and, as Haugen says, rules and system changes were introduced and heavily resourced, at least temporarily. But for countries like Honduras, Albania and Azerbaijan? According to Zhang, not so much. Protecting election integrity from disinformation and organized hate in lower-priority countries “felt like trying to empty the ocean with an eyedropper.” And academic researchers have for years provided evidence that Instagram drives eating disorders, suicidal ideation and self-harm. We know Facebook is hurting individuals and whole societies; but now we know that Facebook knows it, too.

Whistle-blowing is vital because it provides documentary evidence not just of harms, but of the culpability of those doing the harm. That’s why Haugen’s tens of thousands of pages are important. No doubt, Senators’ research staff and SEC investigators will be poring over them, searching for the smoking gun, for whose finger was on the trigger and when. But it is vital that, despite the attraction of focusing on the whistle-blower herself, we concentrate on what the documents say, and about whom. Attention must go to the independent researchers who have, all along, generated credible evidence of Facebook’s harms. That includes researchers whose access to data Facebook had promised to share has been revoked, among them researchers at NYU just weeks ago, and social scientists around the world whose access to data via APIs was blocked in 2018. Focusing attention on independent researchers is crucial, because they provide context and depth for the claims of harm. Also, every time Facebook has a scandal, it promises to “do better” and be more transparent, but once the media attention relents, it pulls the plug.

Haugen has taken a great risk with her future career, and has provided the documentation that regulators and policymakers need. For this we should be grateful. But she is not the arbiter of what should be done. So far, when asked about solutions, she’s made vague gestures toward “regulation,” but in the context of her belief that “the version of Facebook that exists today is tearing our societies apart.” To this way of thinking, there is a reachable version of Facebook that would do less harm and be OK. This incremental approach is no surprise. Haugen has spent 15 years working for companies whose names are synonymous with surveillance capitalism. She doesn’t have a problem with the basic business model of extracting people’s data to sell ads. She just has a problem with Facebook being the most egregious of a very bad bunch.

I’ve written before about the Prodigal Tech Bro, the generic guy who got rich working for Big Tech, but then saw the light and left, to decry its failings. The Prodigal Tech Bro converts his social and actual capital into big platforms from which to question how technology is used. It’s not just galling; it’s dangerous. Centring people who made the problems pushes aside the people, so often women of colour, who’ve been making independent, good-faith critiques for years, with little status or money. And it spotlights incremental, milquetoast “solutions” that don’t fundamentally alter the structures and incentives of Big Tech: “The prodigal tech bro doesn’t want structural change. He is reassurance, not revolution. He’s invested in the status quo, if we can only restore the founders’ purity of intent.” Haugen is far, far more courageous than the prodigal tech bro. For choosing to be a whistle-blower, she will lose the rest of the career she prepared for and must have planned. She has literally put her money where her mouth is, and I applaud her. But what we need from her now is context, insider knowledge, facts and examples of how Facebook does what it does. We don’t need her to set the frame for what the solutions should be.

I have a lot of empathy for Mark, and Mark has never set out to make a hateful platform. But he has allowed choices to be made where the side-effects of those choices are that hateful polarizing content gets more distribution and more reach.
–Frances Haugen

Insider critiques are uniformly based on the feeling that “Mark” or “Sheryl” either don’t really understand the harms they do, aren’t sufficiently informed about them, or just want to do the right thing but are trapped in a system of wrong incentives. “It’s one of these unfortunate consequences,” Haugen says. “No one at Facebook is malevolent, right? But the incentives are misaligned.” But Facebook created its own incentives from nothing, hiring Sheryl Sandberg to build its data-extractive, advertising-based business model. Its focus on growth above all else is what made its platform an extreme amplifier of disinformation and hate, simply because that’s what drives clicks. And the amount of money the trillion-dollar company spends on moderating content and following up on the direct incitements to violence it generates is minuscule.

Facebook does what it does because that is who it is. It doesn’t change because, as Haugen encapsulates it, “Facebook has realized if they make it safer, people spend less time on the site, click on less ads, they make less money.” Haugen’s simple, pithy summary of why we are where we are is the starting point for real change. The documents she has leaked and her upcoming Senate testimony will focus attention on the fundamental problems the company created. Now we need to listen to a wide range of people and gird ourselves for a course of radical, outsider-driven change.