As the Trump presidency steams ahead, it’s no secret how polarized the country has become around political and cultural issues, with the increasing formation of “echo chambers” made up of people holding similar beliefs. Trump’s recent claim that three to five million people voted illegally, for example, is yet another fault line in the ongoing divide.

But what drives this polarization?

Social media platforms like Facebook and YouTube let users interact with content that often falls on one side or the other of hot-button topics, whether related to political races, abortion, or the environment. The algorithms these sites use to promote content can amplify divisive messages by reinforcing points of view users already hold, making them more likely to reject or attack alternative perspectives.

At the same time, the content itself, and how people interact with it, can be polarizing regardless of any algorithm in place, making digital echo chambers louder still.

To understand the factors that go into polarization related to online content, I conducted research with Italian collaborators Alessandro Bessi, Fabiana Zollo, Michela Del Vicario, Michelangelo Puliga, Antonio Scala, Guido Caldarelli, and Walter Quattrociocchi. Specifically, we studied how 12 million people consumed digital content (videos) related to science and conspiracy on Facebook and YouTube.

We found that how content is delivered helps form and amplify echo chambers, but the content itself is especially influential in the polarization process. This contradicts the more optimistic view that online content broadens people’s views and makes them more open to, or accepting of, other people’s preferences.

Feeding Division

People already generally choose to read online information that supports their beliefs, and to form groups of like-minded people, or what we think of as “echo chambers.” That alone is sufficient to promote polarization and conflict between groups with clashing views.

In fact, we found that 93.6 percent of Facebook users and 87.8 percent of YouTube users are polarized: they concentrate their online reading and interaction (at least 95 percent of it, as we defined it) around one specific narrative related to a given controversial topic, such as climate change.
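
To make that measure concrete, here is a minimal sketch of how such a threshold could be applied to a user’s interaction log; the function and narrative labels are hypothetical illustrations, not the study’s actual code:

```python
# Toy sketch of the polarization measure described above (illustrative only;
# not the study's actual code). A user counts as "polarized" when at least
# 95% of their interactions (likes, comments) fall on a single narrative.

from collections import Counter

POLARIZATION_THRESHOLD = 0.95  # the 95 percent cutoff described in the text

def is_polarized(interactions):
    """interactions: list of narrative labels, e.g. ["science", "conspiracy", ...]."""
    if not interactions:
        return False
    counts = Counter(interactions)
    # Share of activity devoted to the user's dominant narrative.
    dominant_share = counts.most_common(1)[0][1] / len(interactions)
    return dominant_share >= POLARIZATION_THRESHOLD

# Example: 19 of 20 interactions on one narrative -> 0.95 share -> polarized.
print(is_polarized(["conspiracy"] * 19 + ["science"]))  # True
```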

So echo chambers are already crowded.

But social media algorithms can play a role in intensifying the beliefs of polarized groups even further. Specifically, algorithms are used to promote specific content to specific users. The Facebook News Feed algorithm, for example, delivers to users content read or liked by the Facebook friends they interact with most. YouTube’s Watch Time algorithm prioritizes videos a given user is most likely to watch for longer, given past viewing patterns.
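
To illustrate the general mechanism, here is a minimal, hypothetical sketch of engagement-based ranking; the scoring rule and names are assumptions for illustration, not Facebook’s or YouTube’s actual algorithms:

```python
# Hypothetical sketch of engagement-based feed ranking (illustrative only;
# the scoring rule is an assumption, not Facebook's or YouTube's real logic).
# Items matching narratives the user already engaged with get boosted, so the
# feed drifts toward one side over time.

def rank_feed(candidates, user_history):
    """candidates: list of (item_id, narrative) pairs.
    user_history: dict mapping narrative -> count of past engagements."""
    total = sum(user_history.values()) or 1  # avoid division by zero
    def score(item):
        _, narrative = item
        # Boost proportional to the user's past engagement with this narrative.
        return user_history.get(narrative, 0) / total
    return sorted(candidates, key=score, reverse=True)

# A user with mostly one-sided engagement sees that side ranked first.
history = {"pro_climate_action": 12, "climate_skeptic": 2}
feed = rank_feed([("a", "climate_skeptic"), ("b", "pro_climate_action")], history)
print(feed)  # [('b', 'pro_climate_action'), ('a', 'climate_skeptic')]
```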

So, if someone reads or watches content that promotes conspiracies related to Hillary Clinton or Democrats more broadly — or to Donald Trump and Republicans — Facebook is more likely to show them similar content they haven’t yet seen, and less likely to offer content that contradicts their view, reinforcing (and even exaggerating) currently held beliefs.

It’s fair to say, then, that the algorithm amplifies the echo chamber.

Content’s to Blame

While algorithms do play a role in polarization, we find that content is ultimately “king” when it comes to dividing people.

When we looked at patterns of digital content consumption on both Facebook and YouTube, we found, not surprisingly, that some users read and comment only on content that fits a specific narrative around a controversial topic — such as rejection of climate change — reflecting early polarization.

But others were more willing to interact with content reflecting both sides of an issue: arguments for and against climate change, in this example. That might be taken to mean that a significant proportion of people are less susceptible to polarization.

Sadly, that’s not true. We found that the vast majority of the group initially open to both kinds of content would eventually move to consuming just one type of information, becoming polarized toward one of the narratives. So they might consider both sides of the climate change controversy at first, but ultimately favor one side over the other.

In fact, we were able to reliably predict this evolution toward polarization, and we observed similar patterns across Facebook and YouTube.

A Dangerous Echo

The formation of echo chambers seems to be a natural consequence of how humans interact with content, and of how digital content is delivered to us.

The danger of echo chambers, however, is that they can disengage those within them from more mainstream society and established practice, fueling potentially dangerous beliefs and behaviors, such as in the case of widespread anti-vaccine content. In fact, the World Economic Forum considers massive digital misinformation a major threat to society.

In line with this, polarization, as promoted by echo chambers, can have highly negative consequences. A stark example came in late 2016, when a man opened fire in the DC-area pizza parlor Comet Ping Pong after hearing false rumors linking Hillary Clinton to the restaurant. Similarly, it has been argued that Putin’s Russia took advantage of growing polarization in the US in an attempt to influence the presidential election through the release of damaging material. Short of such dramatic outcomes, echo chambers prevent mutual interest and understanding among groups, promoting conflict and mistrust at both the collective and individual level.

While our research was not designed to offer solutions to the growing issue of polarization, we believe this represents a mounting problem for society, one that will likely only intensify. Public and private decision-makers would be well-advised to pay close attention to the factors driving polarization, along with those that promote greater inclusiveness and healthy diversity, keeping in mind what our research shows.

________________

By Brian Uzzi, a professor at the Kellogg School of Management at Northwestern University and a globally recognized scientist and speaker on leadership, social networks, and new media. Professor Uzzi is a co-chair of the annual International Computational Social Science Summit.

Originally published at medium.com