They’re Watching You: How Social Media Helps Shape Your Reality

Photo by Glen Carrie @ Unsplash.com

Facebook, YouTube, Twitter, Instagram, and other social media all want the same thing: to hold your attention for as long as they can. The more time you spend with their services, the more ads you see—and the more money they make.

So they program their computers to feed you content they think you’ll like. The computer program that decides which of your friends’ posts appear in your Facebook news feed is called an algorithm.

These algorithms are now famously powerful: they draw you in. Consider the case of Caleb Cain, a young college dropout with a liberal bent. He was aimlessly looking for distraction when YouTube recommended a video by Stefan Molyneux, a Canadian talk show host offering self-help advice, but one who also espoused an “anarcho-capitalist” view that advocated men’s rights and criticized feminism as a form of socialism.

Nothing too extraordinary. But after Caleb watched that video, YouTube’s algorithm began cueing up videos of far-right commentary and then racist videos (some from channels that have now been blocked). He watched some 12,000 videos over a period of four years.

Eventually Caleb became convinced of white-supremacist notions such as the claims that Western civilization is threatened by Muslims and that IQ disparities explain differences among races.

“I just kept falling deeper and deeper into this, and it appealed to me because it made me feel a sense of belonging,” he told the New York Times. “I was brainwashed.”

He said it felt as if he were chasing uncomfortable truths, and that doing so gave him “power and respect and authority.”

Like YouTube, Facebook is also intent on holding our attention by putting content in our news feeds that will appeal to us. I have 769 friends, but Facebook shows me only the posts they think will interest me. How do they know what that is? They track everything I do: every post, every like, every comment, and even data from my other internet activities.

In the beginning, Facebook was fun. It was mainly a person’s real friends, and the news feed simply showed their most recent posts. But as the service grew, and the number of posts became voluminous, Facebook changed their algorithm in 2007 so that it began selecting specific posts to show users.

The criterion was simple: select those posts most likely to keep you engaged, whatever that might be. And that’s the rub. Those posts could be fun or infuriating, holding your attention through delight or keeping you locked in vitriolic argument. Either way, they were reinforcing your mindset.
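To make that concrete, here is a minimal sketch of how an engagement-based feed ranker might work, assuming the system has already predicted, from your tracked activity, how likely you are to react to each post. The signals, weights, and names below are invented for illustration; Facebook’s actual system is far more elaborate.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    p_like: float     # predicted chance you'll like it, based on tracked activity
    p_comment: float  # predicted chance you'll comment
    p_share: float    # predicted chance you'll share it

def engagement_score(post: Post) -> float:
    # Weight comments and shares more heavily than likes, since they keep
    # people on the site longer. These weights are purely illustrative.
    return 1.0 * post.p_like + 4.0 * post.p_comment + 8.0 * post.p_share

def rank_feed(posts: list[Post], limit: int = 10) -> list[Post]:
    # Show only the posts predicted to be most engaging. Whether a post
    # delights you or infuriates you makes no difference to the score.
    return sorted(posts, key=engagement_score, reverse=True)[:limit]

feed = rank_feed([
    Post("Ann", "Vacation photos", p_like=0.30, p_comment=0.02, p_share=0.01),
    Post("Bob", "Inflammatory hot take", p_like=0.10, p_comment=0.25, p_share=0.15),
])
print([p.author for p in feed])  # ['Bob', 'Ann'] -- the argument-starter wins
```

Nothing in that score cares whether a post is true or good for you to see; delight and outrage are interchangeable as long as they keep you engaged.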

This selectivity has its downsides, of course: it exposes us to a narrow slice of reality. Studies have found that roughly one-third of Americans get their daily news from Facebook, and that news is heavily skewed toward each user’s existing point of view.

In 2010 Facebook launched their groups feature. You could now join groups, including private ones, that included people who shared a specific interest. Eventually Facebook’s algorithm not only selected specific posts for you to see, but also began recommending groups for you to join.
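Group recommendations tend to follow the same engagement logic. Here’s a toy sketch of one common approach, suggesting the groups your friends have joined most often; the group names and the logic are hypothetical, not Facebook’s actual method.

```python
from collections import Counter

# Toy "suggested groups" logic: recommend the groups your friends belong to
# most often, excluding groups you're already in. Real systems use far richer
# signals, but the reinforcing pull is similar.
def suggest_groups(my_groups, friends_groups, limit=3):
    counts = Counter(
        group
        for groups in friends_groups.values()
        for group in groups
        if group not in my_groups
    )
    return [group for group, _ in counts.most_common(limit)]

friends_groups = {
    "alex":  {"Gardening Tips", "Birdwatchers"},
    "blake": {"Birdwatchers", "Uncensored Politics"},
    "casey": {"Uncensored Politics"},
}
print(suggest_groups(my_groups={"Gardening Tips"}, friends_groups=friends_groups))
# e.g. ['Birdwatchers', 'Uncensored Politics']
```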

One result was that people like Caleb were drawn into increasingly extreme views and ended up in groups such as QAnon, famous for promoting wild conspiracy theories.

Facebook’s algorithms can not only shape our reality in this way; they can also affect our mood. In 2012 Facebook ran an experiment on 689,000 randomly selected users to determine whether the news feed algorithm could influence how they felt.
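For the curious, here is a rough sketch of the kind of feed manipulation that experiment reportedly used, assuming each post already carries a sentiment label. The sample posts and the omission probability are made up for illustration.

```python
import random

def filter_feed(posts, suppress, omit_probability=0.5):
    """Withhold a random fraction of posts whose sentiment matches `suppress`."""
    shown = []
    for post in posts:
        if post["sentiment"] == suppress and random.random() < omit_probability:
            continue  # this post is omitted from the user's feed
        shown.append(post)
    return shown

posts = [
    {"text": "Great day at the beach!", "sentiment": "positive"},
    {"text": "Everything is going wrong.", "sentiment": "negative"},
    {"text": "Meeting moved to 3 pm.", "sentiment": "neutral"},
]

# One test group saw fewer negative posts, another fewer positive ones;
# researchers then measured the tone of what those users posted afterward.
print(filter_feed(posts, suppress="negative"))
```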

They found that moods were contagious. People to whom the algorithm fed more positive posts responded by writing posts that were more positive. Those who were fed more negative content tended to be more negative in their own posts.

Overall, social media seems to have fostered divisiveness in the U.S. and helped create an epistemological crisis in which we can’t find common ground on basic facts.

Thankfully, things are starting to change. Facebook has blocked a range of extremist groups, and said they would stop suggesting groups for users to join. They’ve also decided to change their algorithm so that fewer posts related to politics show up in news feeds. People were complaining that Facebook was becoming increasingly unpleasant because of all the conflict related to politics. (And yes, Rick, I did recently block you on Facebook.)

YouTube has also changed its algorithm so that it doesn’t draw viewers into ever more extremist videos.

Despite these changes, the deeper issues will certainly remain. It’s an interesting time, and definitely a source of cognitive dissonance. How can we come to agreement on the nature of reality? Can we figure out a way to live together? Can we understand why people feel so unfulfilled that they’re drawn to extreme points of view?

At the least, I think simply confronting these questions can make one a better person—and be a step in the right direction.

Find column archives at JimKarpen.com.