Facebook’s Simple Approach To Bursting Bubbles

With people living in political bubbles, Facebook's simple approach could burst them


One of the biggest stories of the past 12 months has been stories themselves. That is to say, the stories people read that feed only into their pre-existing worldview, in turn leaving them more susceptible to falling for fake news, simply because it seems reasonable in context.

For instance, a recent fake news story claimed that one of Hillary Clinton’s assistants had been found dead in suspicious circumstances and insinuated that she was responsible. Snopes researched the story and found that, of course, it was completely fabricated, but it was believable for those who dislike Hillary Clinton because it fit the worldview they subscribe to. These bubbles are largely fed by social media feeds and by reading only specific news sites, rather than anything that might challenge your worldview.

It has caused considerable problems, with many blaming the phenomenon for the rising populism in western countries that led to the election of Donald Trump and to the vote for Brexit in the UK. It is undeniable that these bubbles have an impact: 62% of Americans get at least some of their news from social media feeds, which naturally means that bubbles form and ideas become entrenched. It also explains the huge level of vitriol towards ‘the other side’ whenever these things are discussed; it is difficult to understand how the other side can believe what they do when everything you have been shown is contrary to that belief.

As somebody who works in digital publishing, on websites with thousands of visitors every day, I am acutely aware that everything has a slight bias to it. This is simply how things are. My beliefs, as much as I try to keep them out of my writing and editing, will affect what I do. The team will always double-check every quote, question every statement, and verify every stat, because that is the only way to counterbalance this bias. For a comparatively small digital publishing company, that is relatively simple.

When you are the size of Facebook, Twitter, or even LinkedIn, it becomes increasingly difficult. There are roughly 4 million ‘likes’ every minute on Facebook alone, which shows the scale of the problem the company faces when vetting content.

However, to combat these bubbles forming, Facebook has begun testing a simple change to the way people consume media on the site. Instead of following specific pages that cover topics, people will be able to follow the topics themselves. This move will expose people to multiple perspectives on the same topic. TechCrunch, which broke the story, reported that the topics being tested are fairly innocuous at present: ‘For now it almost seems that Facebook purposefully strayed away from controversial or polarizing topics. The most incendiary one I saw might have been “Ocean Science & Conservation.” But you can imagine how the feature might work for topics like “Donald Trump,” “Healthcare,” “Refugees,” “Taxes,” “Terrorism” or others.’
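The difference between the two models can be illustrated with a toy sketch. This is not Facebook's implementation, and all names and data here are hypothetical; it simply shows why a topic feed surfaces outlets a page feed would filter out.

```python
# Toy comparison of page-following vs topic-following feeds.
# All outlets, topics, and headlines are invented for illustration.
from collections import namedtuple

Post = namedtuple("Post", ["page", "topic", "headline"])

posts = [
    Post("Outlet A", "Healthcare", "Reform bill praised"),
    Post("Outlet B", "Healthcare", "Reform bill criticised"),
    Post("Outlet A", "Taxes", "Cuts welcomed"),
]

def page_feed(posts, followed_pages):
    """Page-following: readers only see outlets they already chose."""
    return [p for p in posts if p.page in followed_pages]

def topic_feed(posts, followed_topics):
    """Topic-following: every outlet covering the topic appears."""
    return [p for p in posts if p.topic in followed_topics]

# A reader following only "Outlet A" never sees Outlet B's take,
# but following the "Healthcare" topic surfaces both perspectives.
print([p.page for p in page_feed(posts, {"Outlet A"})])
print([p.page for p in topic_feed(posts, {"Healthcare"})])
```

The bubble-bursting effect comes from the second function: filtering on the topic column rather than the page column means the reader's prior choice of outlets no longer constrains what they see.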

It is a simple solution to a problem that many digital publishers, aggregation sites, and social media outlets are facing. Whether it will work remains to be seen, but in theory it is a strong move from the California-based social behemoth. However, with the popularity that this kind of polarization has brought to sites like Breitbart and Salon, is it in their interests to change it?
