Don't Use Social Media

I urge you all to stop using social media such as Facebook, Twitter, Instagram and the like, or at least to use them sparingly and for specific purposes. The reasons are laid out in detail in the Netflix documentary The Social Dilemma and in Jaron Lanier's book Ten Arguments for Deleting Your Social Media Accounts Right Now. Below are what I believe to be the most compelling arguments.

The Goal of Social Media Companies

Social media companies have one goal: To make profit. That's all.

The Technology behind Social Media Companies

Social media companies couple vast arrays of computing power with machine learning algorithms to optimize for that goal. That is, they throw enormous computing resources at a single objective: to make profit.

Machine learning technology has become vastly more powerful over the last 10 to 15 years. In the early pioneering days of "Artificial Intelligence", the best results were pretty much toys. Today's machine learning systems learn in ways that even their creators don't fully understand, and they have unparalleled power to optimize whatever goal they are given.

Unfortunately, the machine learning systems used by social media companies are asked to optimize one goal: Profit. These algorithms do not understand the effects they can have on people's mental health. These algorithms are not taught to treat suicide as a cost. These algorithms are not taught to protect democracy. These algorithms are not taught that genocide is a bad thing. As a result, when they optimize for profit, the harm they cause simply does not enter into the calculation.

Put bluntly, if genocide in Myanmar increases Facebook's bottom line, then Facebook's algorithms will do their level best to incite such a genocide.

Social media companies sell access to the vast power of these machine learning systems. When that access is used to sell you shoes or sunglasses, the harm to society is minimal. Unfortunately, that access is used by bad actors to promote hatred, damage elections and create chaos. Human society is not going to be destroyed by a malevolent AI bent on our actual destruction. It's going to be destroyed by machine learning systems that simply don't care what they are doing as long as they maximize profit. The algorithms themselves are blissfully ignorant of the harm they are causing, and the engineers who created the algorithms seem content to give them free rein.

Worse, it is apparent that some social media companies deliberately made their products more addictive, and more harmful, in order to increase profits. Tim Kendall, Facebook's director of monetization from 2006 through 2010, testified before the US Congress:

"The social media services that I and others have built over the past 15 years have served to tear people apart with alarming speed and intensity. At the very least, we have eroded our collective understanding-at worst, I fear we are pushing ourselves to the brink of a civil war."
[...] we sought to mine as much attention as humanly possible and turn into historically unprecedented profits. To do this, we didn't simply create something useful and fun. We took a page from Big Tobacco's playbook, working to make our offering addictive at the outset.

Here is Kendall's full testimony (PDF).

But Knowledge is Power?

OK. So you've watched the Netflix documentary and you understand the algorithms. Surely you can resist them?

No. No, you can't. Machine learning is simply too powerful and is capable of hacking the brains of anyone, including those who realize they're being hacked.

Watch a skilled magician perform a trick. Even after the trick has been explained to you, you will still be amazed and fooled when the magician performs it again. Our brains are hackable even when we know better.

What can we do?

Quit social media. If you can't do that (for example, most amateur comedy gigs are booked on Facebook), limit who and what you follow, use the platform strictly for your own purposes, and avoid being drawn into anything else.

Don't use Google or other search engines driven by similar machine-learning profit algorithms. Use something like DuckDuckGo instead. You can make DuckDuckGo your default search engine in Firefox and Chrome. (But don't use Chrome; it feeds Google's machine-learning engines.)

What can governments do?

Social media companies are very unlikely to change their algorithms to optimize for anything other than profit. Therefore, the only way to change the outcomes is for governments to make the damage caused by social media eat into those profits, for example through regulation, fines or legal liability for the harm the algorithms cause.

The algorithms are very good. They will quickly adjust their behaviour to once again maximize profit, except that this time the path to maximizing profit will also require minimizing harm.
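
To make that concrete, here is a toy sketch in Python. It is purely illustrative and entirely hypothetical (the post names, scores and the harm_penalty knob are invented, and this is not any real platform's ranking system), but it shows how the same profit-maximizing optimizer starts demoting harmful content the moment harm carries a cost:

    # Toy illustration only: a hypothetical feed ranker, not any real platform's algorithm.
    # Each candidate post has a predicted engagement score (a proxy for ad revenue)
    # and a predicted harm score (misinformation, incitement, and so on).
    posts = [
        {"id": "cute-dog-video", "engagement": 0.40, "harm": 0.01},
        {"id": "outrage-bait",   "engagement": 0.90, "harm": 0.80},
        {"id": "friend-update",  "engagement": 0.55, "harm": 0.02},
    ]

    def rank(posts, harm_penalty=0.0):
        # With harm_penalty = 0, the ranking is pure profit and ignores harm entirely.
        # A fine or liability per unit of harm is equivalent to harm_penalty > 0.
        def score(post):
            return post["engagement"] - harm_penalty * post["harm"]
        return sorted(posts, key=score, reverse=True)

    print([p["id"] for p in rank(posts)])                    # outrage-bait ranks first
    print([p["id"] for p in rank(posts, harm_penalty=1.0)])  # outrage-bait ranks last

With the penalty at zero, the outrage bait ranks first because it has the highest predicted engagement; once harm carries a cost, the same code pushes it to the bottom with no other change to the machinery.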

If social media companies do not submit voluntarily to harm-reduction measures, then governments should ban them outright.

Update 2020-10-02: Facebook's Rebuttal

Facebook has posted a rebuttal (PDF) of the film "The Social Dilemma". This rebuttal is no rebuttal at all. It's spin. Let's take a look.

"Our News Feed product teams are not incentivized to build features that increase time-spent on our products. Instead we want to make sure we offer value to people, not just drive usage."

According to the people interviewed for the documentary, the above statement is a flat-out lie. It certainly sounds like brand-spin... "offer value to people" WTF?

"For example, in 2018 we changed our ranking for News Feed to prioritize meaningful social interactions and deprioritize things like viral videos. The change led to a decrease of 50M hours a day worth of time spent on Facebook. That isn't the kind of thing you do if you are simply trying to drive people to use your services more."

Standard tactic: Talk about a big number as if it's impressive without mentioning the much bigger number. Facebook has about 1.4 billion daily active users. Taking 50 million hours away from that pool means each person spends on average 2.14 minutes less time per day on Facebook. Whoop. De. Do.
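
For anyone who wants to check that arithmetic, here is the back-of-envelope calculation in Python, using the same approximate figures quoted above (1.4 billion daily active users, 50 million hours):

    # Back-of-envelope check of the "2.14 minutes" figure, using the numbers cited above.
    daily_active_users = 1.4e9      # roughly 1.4 billion daily active users
    hours_removed_per_day = 50e6    # Facebook's claimed 50-million-hour daily reduction

    minutes_less_per_user = hours_removed_per_day * 60 / daily_active_users
    print(round(minutes_less_per_user, 2))   # prints 2.14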

"We collaborate with leading mental health experts, organizations and academics, and have our research teams devoted to understanding the impact that social media may have on people's well-being."

Which experts? Post links to research results. Be transparent about exactly what you're researching and what you're doing with the research.

"But even when businesses purchase ads on Facebook, they don't know who you are."

Maybe they don't know your name, but Facebook's ad segmentation features allow advertisers to fine-tune their targeting to an incredible degree: they know you're a 40-year-old woman who likes dogs, leans Democrat, has a slight weight problem, and lives in a Tucson neighbourhood with slightly higher than average family income.

"Algorithms and machine learning improve our services."

This is absolutely true. Machine learning does improve Facebook if by "improve" you mean "increase Facebook's revenue." Because that is the goal that the machine-learning algorithms are given.

"The truth is that polarization and populism have existed long before Facebook and other online platforms were created."

Yes, they did. But did they spread as easily and as effortlessly? Did they paralyze the most powerful nation on Earth by fracturing its society to the point where it can't even respond to a pandemic? No, they didn't. We have the Internet in general, and social media in particular, to thank for that.

"We've removed more than 100 networks worldwide engaging in coordinated inauthentic behavior over the past couple of years [...]"

Again, the tactic is to mention an impressive-sounding number out of context. They've removed 100 networks. Is that 100 out of 100? Or 100 out of 10,000? Nobody knows, and Facebook isn't saying (or perhaps they don't even know themselves).

"The idea that we allow misinformation to fester on our platform, or that we somehow benefit from this content, is wrong."

Nope. It's 100% correct. A casual perusal of Facebook reveals mountains of misinformation and plenty of engagement with it, and that engagement feeds directly into Facebook's revenue.


Copyright © 2024 Dianne Skoll