A Word On Behalf of Facebook and YouTube (So to Speak)

Pray for them. Seriously.

By Tom Gilson Published on September 23, 2022

Now here’s a headline I never expected to write: “A word on behalf of Facebook and YouTube.” (It’s almost as unlikely as my last one.) I have little good to say about these companies. In fact, I’ve been collecting material to deliver a completely different word about them: Evil. That’s no overstatement. There is real evil in these sites, and in how they’re run.

Still, these are real people managing these companies, and you have to admit, the job they're trying to do is impossible. I'm talking about content moderation. It isn't just a social problem, a worldview problem, or a political problem; it's an impossibility problem. The Guardian reported last week on YouTube's struggles to keep their kids' material age-appropriate. Bottom line: It can't be done. They can't even get parents to cooperate by keeping their kids off channels they shouldn't be on.

Maybe you think the problem is that these companies are run by bitter leftists who don’t mind using their unelected platforms to manipulate the public mood. You’re right, but only partly. It goes deeper. I’ll state it as simply as I can: The problem is these companies are run by people. And … they’re not run by people. And yet they are run by people. And more people.

Simple? Yes. Impossible to solve, but just that simple to state. Here’s what I mean.

By the People, Not by the People, By the People

The first “run by people” problem is the obvious one. Decision makers at these companies’ corporate level are human beings with the same failings we all have.

But a whole lot of decision-making is also done by computers, whose algorithms determine what gets featured on the page, what shows up when you do a search, what gets ignored, and what gets “moderated” out. That’s the part that’s “not by people.”

Behind all that (“by people,” again), advertisers and other paying parties influence decision-making in a big way. And millions of users (“more people”) upload content by the gazillabytes, all of which these human and computer decision makers have to sort through and “moderate.”

Impossible Standards

What's lacking in all this is any semblance of a structure to sort it all out. That's a bigger problem than I've seen anyone discuss much. A quick and obviously fanciful thought experiment will bear that out.

Suppose the Walmart corporation decided next week that every one of its 2.3 million employees would report directly to the company's home office in Bentonville, Arkansas. The greeter who works in Tampa would have no boss there or anywhere in Florida, only in Arkansas. The accounts payable manager at the Mexico City store would have no one there (no one in all Mexico, even) watching how she spends her time or how she spends her store's money. Her only boss is in Bentonville.

Impossible Enforcement

It’s patently ridiculous, yet it’s instructive anyway, so let’s follow it a step or two. Suppose they chose to do something like that: How would they go about it? Among other things, they’d have to set global policies defining how employees should work. They would need the means to monitor everyone’s compliance with policy, and they’d have to have ways to enforce discipline where required. (There’s more, but these three carry over nicely to the social media question.)

The monitoring question goes straight to an obvious answer: cameras, microphones, and tons of computer power to watch and listen to it all. It'd be a huge technological challenge, but we've solved big tech challenges before, right? Wrong, at least for challenges like this one. Making a record of sights and sounds is easy, but what about analyzing it? You'd have a problem there: the same one YouTube faces.


Our fictional Walmart might program a theft-detection module into their software. You can bet their employees would have the creativity to get around it, though, much like the YouTubers I mentioned above. What do you do? Build a system to detect the things you aren't detecting? Maybe a whistleblower hotline could do that, but you'd need a system to detect malicious, false whistleblowing.

Walmart isn't that stupid, but you do begin to see the kind of mess in which YouTube and Facebook find themselves. They have no structure. Just policy-makers and computers, and a few humans (some of them reportedly badly mistreated) trying to keep up with it all.

The Policy Problem

We jumped right on past the policy problem, though. It's no easier than the last one. Suppose, for example, a male employee at our fictional Walmart crosses one leg over the other knee while sitting in the break room. Is he relaxing comfortably, or should the computer ramp up and fire him on the spot? The answer may not be so obvious.

Across much of the Middle East, to show someone the bottom of your shoe is to send him a grave insult. If it happens that this knee-crossing employee is an American, and there’s a recent immigrant from the Middle East sitting next to him, you’ve got to decide whether he’s sitting there casually, innocently, and ignorantly, or whether he’s intentionally delivering a massive insult. Remember: No local bosses. The computers and the team at Bentonville have to sort it out. With policies that will make sense in millions of potential situations. Good luck with that.

The Human Nature Problem

The analogy to Walmart may seem silly, but it’s instructive: Some things can’t be done, not with policies, and not with technology. And I haven’t even gotten to the biggest problem: human nature.

Look at the mess Facebook and YouTube have created for themselves. (They may not see it as a mess, but it absolutely is.) Both platforms wield enormous power in global culture. Neither of them is accountable to anyone except their paying customers and stockholders. Every paying customer is an influence-peddler, buying time and space to persuade viewers to purchase something, vote for someone, support something. Every stockholder wants them to make money.


The same goes for other social media, but these are the true giants, the Leviathans of the business. They and their parent corporations, Alphabet (Google) and Meta (Facebook, Instagram, and more), hold massive power. They can use it to sway entire countries' political decisions, and they've been doing it for at least twelve years. They control massive amounts of money, to the extent that Meta's Mark Zuckerberg was able to absorb a $70 billion loss in net worth and yet remain one of the 20 richest people in the world.

Power, Accountability, Integrity?

Internal staffers may not be as rich, but they have powerful motivations pushing them to fit in to their own corporate value system and to impose it on the world through content moderation policies. They have all this power, with virtually no accountability. To keep one's integrity under such conditions is hardly humanly possible. "With Christ all things are possible," but these corporations don't exactly keep Him at the center of all they do.

These companies have money, they have power, and they have virtually no accountability except to advertisers, that is, to the money suppliers, some of whom are undoubtedly not merely innocent onlookers. Keeping your integrity in those conditions has got to be next to impossible.

A Word of Sympathy (Not to Be Taken Too Seriously)

So pity these poor social media companies. If fair, just, and equitable content moderation is their task, it’s literally an impossible one. I almost feel for them, don’t you?

What then is the rational thing to do when you're trapped in a literally impossible situation? When it's not only putting the squeeze on you, but doing harm to millions of others through your inability to manage the situation with justice and integrity?

The rational answer is to get out. Leave. Quit. Shut it down. Close up shop, and say goodbye. It may not be easy. The monster will still turn around and try to eat you. But maybe then you’ll realize you’d better turn to Christ, with whom all things are possible, including rescuing people from the trap of their own power.

Pray and Decide

That’s for the people running these companies. Pray for them. Seriously. Some of them undoubtedly want a way out, and can’t imagine what that could be. God is big enough to answer.

Meanwhile the rest of us have a choice to make regarding social media usage. We know these sites are never going to get it right, so what shall we do? At least three of my closest family members have left Facebook for keeps. I respect that decision; in fact, I look at it almost longingly. I could be very happy living a Facebook-free life again.

Some of us stay so we can inject some modicum of truth and sanity into the mess. That’s why The Stream remains on Facebook, and why I still keep a (mostly inactive) personal account there. If that’s your choice, great. Do what good you can do through it. But manage your expectations. It’s not going to get any better.

 

Tom Gilson (@TomGilsonAuthor) is a senior editor with The Stream and the author or editor of six books, including the recently released Too Good To Be False: How Jesus’ Incomparable Character Reveals His Reality.

