Whistleblower Gets It Wrong: Facebook Problems Won’t Go Away Until Facebook Goes Away

By Tom Gilson Published on October 5, 2021

Former Facebook employee turned whistleblower Frances Haugen spoke with 60 Minutes last Sunday, telling how toxic Facebook has become, both inside and out. A New York Times editorial sees her revelations as a sign that Facebook may be cracking apart from the inside.

I hope they’re right. Yes, The Stream is on Facebook, and we benefit some from it. To cobble a line from C. S. Lewis, a good Facebook presence is necessary if only to counter all the bad presence there. If Facebook folded I’d consider it a net gain for us and our mission.

So yes, Haugen is right on much of what she had to say. Facebook is bad for the world. It amplifies dissent, it encourages argument, and it reinforces the world’s growing polarization. Her interview last Sunday reveals it’s even worse than we’d thought.

But she’s wrong, too, in at least three major ways. She hopes for a Facebook that’s impossible to begin with. Even if it weren’t impossible, it would still be horrific. In fact it already is, and there’s no realistic hope it will get any better.

“Safety” and “Misinformation”

View the 60 Minutes interview and you’ll hear her saying “safety” over and over again. She wants a Facebook that won’t cause people harm, either by delivering “misinformation” or by feeding anger and hatred.

She’s wrong already. Freedom of speech was never about safety. It’s about freedom of conscience, which implies the ability to get things wrong. The safety we need is to be able to speak our minds – even if we speak misinformation.

I put that word in scare quotes earlier because Facebook’s unprecedented systems for managing “misinformation” are skewed far to the left. We’ve seen it here at The Stream. Articles that are thoroughly researched, thoroughly evidenced, and thoroughly documented get flagged as “inaccurate” there for not being thoroughly lock-step with liberal talking points.

So the accuracy of Facebook’s accuracy-checking is in doubt, to say the least. Solutions for that are hard to come by. No fact-checker is neutral. Our best and perhaps only recourse is the give-and-take of ideas challenging one another.

Facebook Fosters Anger, and They Know It

In 2018, according to Haugen, Facebook changed its algorithms, the mathematical formulas by which its computers determine what you see on the page, to maximize those challenges. It identified the content each user interacted with most, and pushed it to the top for them. The company’s motive in that wasn’t truth-seeking, though, according to Haugen. It was revenue.
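
Haugen doesn’t spell out the formula, but the mechanism she describes, ranking each post by how much interaction it draws, can be sketched in a few lines of Python. This is a minimal sketch under assumed weights and field names; nothing here is Facebook’s actual code.

    # Minimal sketch of engagement-based feed ranking, the mechanism
    # Haugen describes. Weights and field names are illustrative
    # assumptions, not Facebook's actual system.

    def engagement_score(post):
        """Score a post by how much interaction it has drawn."""
        return (1.0 * post["likes"]
                + 5.0 * post["comments"]   # comments and shares weighted
                + 10.0 * post["shares"])   # heavily: arguments produce both

    def rank_feed(posts):
        """Push the content users interact with most to the top."""
        return sorted(posts, key=engagement_score, reverse=True)

    feed = rank_feed([
        {"id": "vacation photos", "likes": 40, "comments": 3, "shares": 0},
        {"id": "political rant", "likes": 12, "comments": 85, "shares": 30},
    ])
    print([p["id"] for p in feed])  # the rant lands on top

Notice that nothing in the scoring measures truth or civility. Only the volume of interaction counts.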

Their own internal research showed users interacting mostly to disagree with each other. That’s exactly what their algorithm was feeding. Facebook was in fact pushing disagreement upon us all. And again, their own research showed the effect:

Users hardening into their positions, groups polarizing against each other, society as a whole becoming angrier. Facebook was able to identify their own contribution to that anger and hate. They even put numbers on it. Globally, no less.

But of course this was completely predictable. I can’t imagine why they needed research to reveal it. For decades now, social scientists have known the strong influence of negativity bias. People speak up more when they have something negative to say. Customers complain more often, for example, than they offer appreciation. Open-ended survey questions reliably draw more negative comments than positive ones.

Therefore any algorithm that recognized and promoted heavy interaction would be sure to promote disagreement, anger, and negativity. I’ll bet you’ve noticed it: Facebook is a less pleasant place to be than it used to be.
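
If that sounds abstract, a toy simulation makes it concrete. Assume only that negative posts draw more interaction, say two and a half times as much (an assumed figure for illustration, not measured data), and an interaction-ranking feed fills its top slots with negativity:

    import random

    # Toy illustration of negativity bias meeting an engagement ranker.
    # The 2.5x bias factor and post counts are assumptions for the sake
    # of the example, not measured data.
    random.seed(1)
    NEGATIVITY_BIAS = 2.5

    posts = []
    for _ in range(100):
        tone = random.choice(["positive", "negative"])
        base = random.randint(1, 50)  # interactions a post would earn anyway
        boost = NEGATIVITY_BIAS if tone == "negative" else 1.0
        posts.append({"tone": tone, "interactions": base * boost})

    # Rank by interaction count, exactly as an engagement-maximizing feed would.
    top_ten = sorted(posts, key=lambda p: p["interactions"], reverse=True)[:10]
    negative = sum(p["tone"] == "negative" for p in top_ten)
    print(f"Negative posts in the top ten: {negative} of 10")

Run it and nearly every top slot goes to a negative post, even though positive and negative posts are equally common in the pool.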

The Real Problem That Haugen Misses

Haugen’s take on it seems to be that Facebook needs to fix its algorithms. That’s her one big mistake. She’s misidentified the problem. Let’s review the facts she put on the table to see why.

  • They’re concerned about safety in ways that have nothing to do with freedom of speech.
  • They want to limit misinformation, even though she must know there’s no neutral way to accomplish that.
  • Their revenue model requires lots of interaction on the page, but promoting interaction means promoting negativity.

And there’s also a fourth item:

  • They’re a global company influencing billions of people’s daily lives.

And the question she seems to be asking is this: Isn’t there some improved algorithm by which Facebook could do a better job of managing global human interaction?

The answer is no. I feel my own negativity heating up inside me, and I’m okay with it. Again I say, No: emphatically, strongly, vehemently No! The question itself is disgusting. Can Facebook manage us all better? No! They’ve got no business managing us in the first place! What kind of power-hunger would it take even to think that’s supposed to be their job?

And what algorithm could do it for them? Again, no! To think some improved formula could manage us all better? That we’re such compute-able creatures that all we need is better computers running our lives? Such insolence! Such rank human ignorance!

The Real Problem at Facebook Is Facebook

No algorithm could do what Facebook wants, or even what Haugen apparently thinks they should want. It’s impossible in the nature of what Facebook is: It’s an artificial environment. We have artificial “friends” there. We engage in artificial arguments with artificial people, defined only by the profile they want us to see.

In real debates people put not only their thoughts but their character and reputation on the line. Facebook anonymizes us to the degree that this hardly matters. And people behave accordingly. Character and reputation don’t matter, so how they treat others doesn’t matter, either.

And Facebook does all this with unprecedented global effect, unprecedented global power.

No, the real problem at Facebook isn’t how to manage Facebook better. The real problem at Facebook is Facebook. It’s in its very design. There is no ending the problems without ending Facebook. That’s why, when I read the New York Times piece about Facebook cracking apart from the inside, I thought, “Maybe it will happen. Maybe Facebook will fall apart.” And thinking that, I smiled.

 

Tom Gilson (@TomGilsonAuthor) is a senior editor with The Stream and the author or editor of six books, including the recently released Too Good To Be False: How Jesus’ Incomparable Character Reveals His Reality.
