Facebook is Radicalizing Us


Jewish tradition shows us how to resist.

“Our algorithms exploit the human brain’s attraction to divisiveness.”

That was the frank assessment of an internal report at Facebook’s corporate headquarters in 2018. The algorithms employed by Facebook – deciding what ads and posts we see based on our past behavior on the site – were designed to inflame our passions. If a user clicked on a political post, for instance, Facebook would suggest further reading with a similar point of view, but with one crucial difference: the new suggested content would be ever more extreme in nature. This ensured that Facebook users would see “more and more divisive content in an effort to gain user attention and increase time on the platform.”

The report drew on two years of internal research and painted a troubling picture of social media users being manipulated to become ever more extreme and radical. In 2017, Facebook formed a committee called “Common Ground” made up of engineers and researchers to assess how divisive content was being shared on Facebook. What they found was deeply troubling. The algorithms governing Facebook’s user experience were designed to maximize “user engagement”. Success was judged by the length of time users spent on Facebook, the number of posts and articles they shared, and the number of “likes” and other reactions they clicked. The most effective way to do this, the committee found, was to feed users ever more extreme content.


Instead of acting on the internal report, Facebook shelved it. Changing the algorithms, the thinking went, would weaken the addictive hold the site had on many of its users. The report was buried, and came to light only recently through the Wall Street Journal.

This means that for years, the billions of people who regularly use Facebook have been unwitting subjects in an experiment of mass radicalization. Our emotions have been manipulated, our dislikes and hatreds amplified, and we’ve been fed an ever-increasing diet of content designed to provoke outrage. What’s the result? Just look around: it’s not hard to find division and hatred permeating every facet of modern society. While it’s hard to know exactly where all this bad feeling came from, a steady stream of angry, screaming social media posts can hardly help.

It’s not only Facebook: a whole host of other media outlets have found that stoking hatred and extreme views is a winning formula, racking up user engagement while eroding civility and deepening division. Just eight years ago, in 2012, researchers at Emory University in Atlanta polled US voters and found that fewer than half felt deep anger toward candidates or voters from the other party. By 2016, that had changed: nearly 70% of Americans reported feeling deep anger at those who supported candidates from the opposite political party.

“We find that as animosity toward the opposing party has intensified,” noted Stanford University researchers in 2018, “(anger) has taken on a new role as the prime motivator in partisans’ political lives… today it is out-group animus (hatred towards one’s political opponents) rather than in-group favoritism (supporting one’s favored candidate) that drives political behavior.”

Online hate is a “disease” that spreads, according to Princeton Professor Joel Finkelstein, who studies extremism on the internet. He has examined hate speech on American online message boards and found startling similarities between some popular American sites and radical Islamist message boards. In both cases, users exposed to divisive messages can become radicalized. In some cases, Dr. Finkelstein has found, radical content viewed online can spill over into the “real” world, motivating some people to commit hateful acts after viewing extreme content online.

The shooter in the 2018 attack on the Tree of Life Synagogue in Pittsburgh, who murdered eleven Jewish worshippers, and the attacker who shot fifty people in two mosques in New Zealand a few months later, both acted after viewing extremist content on popular social media sites. “Both attackers were enmeshed in online communities that exposed them to content designed to make them hateful and potentially violent,” Dr. Finkelstein found.

Extremist content is poisoning the current atmosphere of protests and riots sparked by the murder of George Floyd. “On Twitter and Facebook,” the New York Times recently reported, “hundreds of posts are circulating” providing distorted, hateful and flat-out wrong information and conspiracy theories, fanning the flames of mistrust and hatred. Social media posts about George Floyd surged in the days after his death. Nearly nine million posts mentioned him on a single day: that’s more than the number of daily posts that mentioned the pro-democracy protests in Hong Kong (1.5 million) or the Yellow Vest protests that rocked France last year (just under a million mentions per day at their peak). Many of these posts were inflammatory, stirring up yet more hatred and ill feeling.


With the aid of social media, we’re increasingly talking past each other, demonizing our perceived opponents and embracing extremism and anger. While it’s not easy to resist the siren lure of ever more extreme social media, there are a few concrete steps we can take today that offer a powerful antidote to the current stew of anger, recrimination and outrage that many of us view daily online and elsewhere.

1. Get off social media.

Heavy social media use is associated with poor mental health. One large study found that heavy social media users were three times as likely to be depressed as “light” users; social media use was “significantly” associated with increased depression. Another study found that young people who interacted with social media for two hours a day or more were much more likely to rate their mental health as “poor” than those who used social media only occasionally or not at all.

Interacting with people in the real world is far more satisfying – and it shields us from extreme content online. Try turning off social media; consider taking a week-long detox. Shabbat offers a built-in weekly reprieve from social media and other electronics.

2. Be a more discerning media consumer.

“When people are fearful they seek information to reduce uncertainty,” explains Stanford Communications Professor Jeff Hancock, who has studied the role of extreme and misleading social media posts in the current coronavirus pandemic. “This can lead people to believe information that may be wrong or deceptive because it helps make them feel better, or allows them to place blame about what’s happening” elsewhere, he warns. With so many people fearful and anxious about the state of the world today, radical social media posts can offer a reassuringly simple lens through which to view current events.

Instead of blindly accepting extreme posts, take the time to check where they came from. Are they from a reputable source? Can you verify them elsewhere? A few weeks ago a friend of mine posted a long article on Facebook purporting to be from a medical expert at a major hospital; to my friend’s chagrin, it turned out the author didn’t exist, and much of the information in the article was wrong. Sadly, that’s the case with many social media posts.

Prof. Hancock suggests subscribing to a few reputable news sources and getting our news there. If a story looks interesting on social media, check it out on mainstream news sites to make sure it’s true and to get the background behind it.

3. Take the time to listen to other people’s points of view.

Rabbi Jonathan Sacks likes to tell the story of attending a conference years ago. After the first day, his wife asked him how it was going. “The speaking is brilliant,” he told her; “the listening is nonexistent.” It’s all too easy to assume that we know what other people are going to say, or to dismiss our interlocutors. To truly learn and grow, however, we have to take the time to listen to others – to hear their stories and strive to understand their points of view.

This isn’t easy to do in the universe of social media, of course, where loudness and brio are prioritized over real listening. But if we truly want to engage with other people, we have to pay close attention to what they have to say and spend time imagining the world from their point of view.

4. Watch your words.

Online words can have real-world consequences. ConnectSafely, a Silicon Valley organization that helps monitor social media use, has noted that Facebook and other social media sites frequently contain offensive words and phrases used to demonize ethnic minorities and other social groups. Words like “animals”, “garbage”, “trash” and “invaders”, or comparing people to insects or diseases, dehumanize entire groups of people online.

Make a decision not to use these terms and to speak about people only with respect. This goes a long way toward keeping online exchanges civil and toning down the outrage in social media posts.
