Mark Zuckerberg Being Sued for Way Too Little

By Tom Gilson | Published on May 27, 2022

Mark Zuckerberg, founder of Facebook (now called Meta), may be in a heap of trouble. Good. For the man who’s already ripped open the technological age’s biggest Pandora’s Box, and is trying to open a whole container-full more, I’m not sure any heap could be big enough.

Karl A. Racine, the District of Columbia’s attorney general, is suing him for his part in Facebook’s huge “Cambridge Analytica” security breach. Facebook has already paid $5 billion in fines to the Federal Trade Commission, which amounts to about one percent of the company’s current market value.

Cambridge Analytica obtained data on 87 million users through a “psychological profile” app released in 2013 and built on a Facebook platform called Open Graph. It used that data, reportedly, to influence more than one race in the 2016 election. Now Racine says on Twitter that Zuckerberg “knew there were privacy issues” and was “personally involved in failures” that led to that breach.

The Problems Are Systemic

That’s bad enough, but there’s more. The problem, in one word, is Facebook. On one level Facebook is a human gathering place, or rather a diverse collection of gathering places. Some of them are very good, just as many gathering places are very good in the real world.

Where it’s bad, though, it’s exceptionally bad, and it’s systemically bad. Anonymity encourages bad behavior, but that’s the least of it. Facebook has mushroomed from a human gathering place to a centralized hub with enormous power to influence communication, an unimaginably massive database of information on individual humans, and a powerful commerce engine. Mark Zuckerberg (“Zuck”) controls all this power with virtually no accountability.

He and some friends founded Facebook in college to track photos of classmates. World domination was surely the furthest thing from their minds as they played “hot or not” games with photos of female classmates. Now such domination is exactly what Zuck is reaching for — if not in the real world, then in the next best thing, his company’s Metaverse. He intends it to be at the forefront of “basically immersive digital worlds [that will] become the primary way that we live our lives and spend our time.”

Creator, Megalomaniac

Translation: He’s creating a whole new universe for us to live in. Let me repeat that: to live in. You will move there and live there, he thinks. And who’s in charge of this new reality? The answer is the same as in the real universe: its creator, which in this case means Zuckerberg. I’ve said it before: This is not good.

The real universe was created and is ruled by a good, holy, loving, righteous God. His rule extends into the Metaverse, obviously. But for a time, at least, there will be the fiction that it is a new and separate creation. How good is its creator?

One YouTube commentator has labeled the Metaverse “pure evil” — and he makes a persuasive case for it. “What you see, hear and feel [in the Metaverse] is entirely controlled by Mark Zuckerberg,” he says. “Facebook is doing everything it can to create a monopoly on human existence, where society and culture is imported into a universe run by a megalomaniac.”

Strong language, but fitting. If building a whole new reality for everyone to live in isn’t megalomania, what is? Should we trust Zuckerberg with this? Let’s look at some of his company’s track record, focusing on how it manages online content.

Content Management: The African Sweatshop

The first thing everyone knows about content management is that it’s controversial. Sweatshops are not: everyone agrees they’re wrong. Yet “sweatshop” is how TIME magazine described one of Facebook’s content management shops in a damning story several weeks ago. Workers in Nairobi are doing supposedly “dignified digital work” for $1.50 an hour. It’s not dignity; it’s brutality. They spend their days filtering out Facebook content the rest of the world shouldn’t be allowed to see. That means they have to see it. All day long.

Conditions are so bad there that the company (a Facebook contractor called Sama) has hired mental health counselors to help workers cope. It has also offered workers other means of easing the pain, including blurring images or viewing them in black-and-white. Offers like these are empty, however, according to TIME, because management gives workers no opportunity to slow down and take advantage of them. One employee says, “I cannot blink. They expect me to work each and every second.”

Content Management: Pornography

Things still slip through. Little things. Things like virtual strip clubs in the Metaverse, where kids join in, along with other “creepy” behavior. Facebook says it’s “looking to make safety improvements.” That’s “safety improvements” in online strip clubs, mind you.

Pornography isn’t safe in any form. It destroys relationships, it destroys careers, it ruins souls. Porn in VR is porn with its effects multiplied far beyond the flat images that do so much damage already. Facebook’s answer of “safety improvements” is just as obscene as the environment it describes.

Content Management: Genocide Is Just Fine

Granted, there are foolish people in the world who think porn is just fine. What would they say about genocide? Facebook has approved ads — and accepted payment for them — that call for genocide. This isn’t mere metaphor. It’s actual genocide of an actual people group, the Rohingya in Myanmar. And it’s ads with copy such as, “The current killing of the Kalar is not enough, we need to kill more!”

According to the AP, that ad was purchased on Facebook by a rights group called Global Witness, testing to see whether Facebook had reversed an earlier policy with truly deadly consequences. The company failed again. As the AP put it, “Facebook failed to detect blatant hate speech and calls to violence against Myanmar’s Rohingya Muslim minority years after such behavior was found to have played a determining role in the genocide against them.”

A “determining role in … genocide.” That’s Facebook’s content moderation for you, helping make the world a better place to live in.

Algorithmic Justice?

The problem is systemic, inherent in Facebook’s very design. In an ideal world, social media content moderation would work like this: accurate, just, and allowing the maximum freedom possible while inhibiting truly dangerous content. That is far beyond humans’ ability to monitor, so a new movement is arising to call for computers programmed to manage it all with “algorithmic justice.”

Algorithmic justice is a great idea. So is world peace. Let’s be realistic, though. If it’s going to moderate misleading content, the algorithm has to know what’s true. A “just algorithm” is necessarily an omniscient algorithm. You can lay that dream aside.

Algorithms Are Still Human Creations

Not only that, but algorithms can’t be “just” in the first place. They don’t have that capacity. They’re nothing more than mathematical abstractions, realized as voltages and logic states on silicon chips. They don’t decide anything, not even their own outputs, and they sure don’t care what gets done to whom.

More to the point, they’re human creations, subject to human control. Even if some fantasy world existed where perfect algorithms could give perfect answers, humans could still tweak them to produce the output they wanted instead.

That’s essentially what Facebook does. It controls communication according to the message desired by the people in control. Ultimately those decisions rest with Zuckerberg. That gives him power on a scale never before imagined: the power to influence both communication and commerce globally, with very little accountability.

Playing God

You’ve heard the phrase often enough: “So-and-so is playing God.” You’ve never seen anyone trying to play God like this before. If the evil is not widely apparent yet, I predict it will be. It isn’t just Zuckerberg, though. This evil is systemic. If he fell — or better yet, repented! — someone else would replace him at the top. If that person went down, there would be another.

Each one would still bear individual responsibility — especially the man who started it all.

The only way to stop it is to dismantle it. God has a way of doing that, going as far back as Babel, in Genesis 11. Congress has the means to do it, at least in the U.S., if it can muster the will and the wisdom.

In the meantime, the best way to protect ourselves is to not entrust ourselves to Mark Zuckerberg. Be cautious with Facebook. And stay out of the Metaverse.

Tom Gilson (@TomGilsonAuthor) is a senior editor with The Stream and the author or editor of six books, including the recently released Too Good To Be False: How Jesus’ Incomparable Character Reveals His Reality. Jonathan Gilson contributed research to this article. 
