Facebook is Creating Micro-Bubbles, Giving Us a False Sense of Political Reality

Your News Feed is prioritizing what you want to read over what you should be reading about the election...

The divisiveness in American politics today is almost entirely attributable to the existence of epistemic closure. Simply put, epistemic closure is an advanced and extreme form of confirmation bias -- or, in a colloquial sense, it's what we often refer to as bubbles or echo-chambers.

Epistemic closure is most often illustrated by Fox News Channel and AM talk radio, whose viewers and listeners are walled off in their own universes of misinformation, where anything originating from outside the bubble, no matter how true, is regarded with suspicion. Fox News and the lengthy roster of talk radio screechers give their bubble-dwellers only the information that confirms their ideology. Nothing else. And by demonizing information outside the bubble, the message can be sufficiently distorted to the point of virtual brainwashing.

Without any tethering to reality, bubble people suffer from highly distorted worldviews -- Obama is a Kenyan terrorist, Hillary killed Ambassador Stevens (among others), the climate crisis is a hoax and Donald Trump is absolutely qualified to be president. No matter how much contradictory information is thrust under the noses of rage-addicted bubble people, they simply write it off as part of the liberal media conspiracy.

From there, we get nightmarish Tea Party governors and members of Congress who, likewise, legislate based on the contents of the bubble. For example, it's a given within the bubble that fetuses can feel pain at 20 weeks when medical science puts the threshold at 27 weeks. But, obviously, the scientists are liberals and therefore the enemies of conservative values so the information is discarded with prejudice.

The internet was supposed to democratize both journalism and the availability of information, broadening our understanding of the world and bringing us together in the spirit of communication and a healthy exchange of ideas. Social media manipulation, on the other hand, has begun to wall us off into our own little micro-bubbles.

The other day, I noticed that I was getting an unusually high concentration of links to favorable articles about Batman v Superman: Dawn of Justice in my Facebook News Feed. For a moment I wondered whether it was because the movie was perhaps more popular than critics let on. But after a second or two it occurred to me that it was the Facebook News Feed algorithm. As a comic book movie nerd, I've clicked on quite a few articles about the movie, mostly positive reviews and promotional videos. 

It turns out, Facebook was prioritizing what it thinks I want to read, like, share and click.

In lay terms, an algorithm is a set of rules that, implemented in back-end code, solves a problem or churns out a desired and customized series of results. In the context of Facebook, your News Feed is rigged so you're only getting information based upon what you've already liked, shared or clicked. In its February update, Facebook's algorithm was further refined to weigh both your past engagement and your predicted future engagement. In other words, Facebook predicts which articles and statuses you're going to enjoy and automatically prioritizes them at the top of your feed.

According to Wallaroo Media:

With this latest update, Facebook will gauge the likelihood that users will highly rate a post or the probability that users will interact with a post by liking, commenting, or sharing.  These posts will then be placed at the top of the user’s News Feed.

Hence, micro-bubbles. Again, Facebook is allowing us to fabricate our own bubbles based on our specific click preferences, making sure that we generally see information that fits neatly with our interests, while de-prioritizing unwanted information.
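To make the mechanism concrete, here's a minimal sketch of engagement-based ranking. The scoring formula, topic labels and weights are all invented for illustration -- Facebook's actual model is proprietary and vastly more complex -- but the basic loop is the same: score each post by how much you've engaged with similar content before, then sort the feed by that score.

```python
# Hypothetical sketch of engagement-prediction feed ranking.
# All weights and field names are assumptions for illustration only.

def predicted_engagement(user_history, post):
    """Score a post by the user's past engagement with its topic."""
    past = user_history.get(post["topic"], {"clicks": 0, "likes": 0, "shares": 0})
    # Weight stronger signals (shares) more heavily than passive clicks.
    return past["clicks"] * 1 + past["likes"] * 2 + past["shares"] * 3

def rank_feed(user_history, posts):
    """Put the posts the user is most likely to engage with at the top."""
    return sorted(posts,
                  key=lambda p: predicted_engagement(user_history, p),
                  reverse=True)

# A user who has only ever engaged with pro-Bernie coverage:
history = {"pro_bernie": {"clicks": 40, "likes": 25, "shares": 10}}

posts = [
    {"title": "Delegate math favors Hillary", "topic": "delegate_count"},
    {"title": "Bernie surges in new poll", "topic": "pro_bernie"},
]

feed = rank_feed(history, posts)
# The pro-Bernie post rises to the top; the delegate-count story, which
# the user has never engaged with, scores zero and sinks -- the
# micro-bubble in miniature.
```

Note that nothing in the sketch checks whether a post is accurate or important; the ranking optimizes only for predicted engagement, which is exactly why unwanted-but-necessary information gets de-prioritized.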

Now, add the current presidential election to the mix. 

This is merely theoretical on my part, but it's possible that the divisiveness within each party -- Trump vs Cruz, or Bernie vs Hillary -- could be based on the fact that supporters of each candidate are only seeing information favorable or, in some cases, unfavorable to their candidate of choice. 

  • If Bernie supporters have only been liking/sharing/clicking on pro-Bernie articles, the algorithm might prioritize favorable Bernie articles, plastering them at the tops of pro-Bernie News Feeds and presenting an inflated sense of Bernie's strength. So, when he's attacked or pundits offer a negative prognosis of his chances for the nomination, it doesn't appear to coincide with the algorithmic reality that pro-Bernie Facebook users are seeing in their feeds. Resentment and conspiracy theories ensue. 
  • Or with Hillary supporters, if they're clicking on anti-Hillary articles in order to debunk them or to counterattack Bernie people, they're probably seeing a glut of unfavorable articles at the tops of News Feeds. This might be contributing to the feeling that Hillary is being unfairly or perhaps misogynistically criticized -- after all, look at all these anti-Hillary articles on Facebook!  Resentment and conspiracy theories ensue.   
  • Or, due to like/share/click engagement, some Bernie people might be getting a lot of anti-Bernie articles in their News Feeds, giving the impression that the news media has an anti-Bernie bias.  Resentment and conspiracy theories ensue.   

While it's always possible one of these narratives resembles the truth, reality more likely sits somewhere in between. Unfortunately, neither side is getting a clear view of it, thanks to the algorithm. The algorithm also contributes to the viral prevalence of obvious propaganda sites like USUncut, which give readers only what they want to read rather than what they need to read. Facebook almost single-handedly fuels the popularity of these sites -- USUncut is bolstered by a whopping 1,496,257 likes and counting. When reality, in this case the delegate count, doesn't play out according to the wild predictions of USUncut's Mark Provost or Huff Post's H.A. Goodman, supporters think they've been hoodwinked by the system rather than by the cherry-picked information itself. Again, resentment and conspiracy theories ensue.

Regardless, bubbles are dangerous, and Facebook regulars are being walled off and insulated from reality by the algorithm. It might be a positive thing to have more Americans engaged in civics and political conversation, but that upside is undermined by epistemic closure -- these individual micro-bubbles -- leaving fewer people tethered to objective reality. Instead, distorted viewpoints give way to rage and frustration when the bubbles are burst by the truth.