When we’re surrounded only by opinions we already agree with, our view of the world becomes distorted.
If we never see the opposite perspective, how can we possibly make unbiased decisions? Living in a so-called online ‘filter bubble’ of cherry-picked information, created by algorithms built on our personal preferences and browsing history, means we only see what we – and others who share our views – want to see.
Share and share alike
Skim. Click. Comment. Share. We do it day in and day out, often without a second thought as to whether the information we’re sharing is actually accurate, or whether there’s an opposing view we should take into account. Inside our filter bubble, we see only the version of the world we want to see – the one the preference algorithm creates for us.
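To see why this narrowing happens, consider a deliberately simplified sketch of how a preference-based ranker might work. This is not any real platform’s algorithm – the scoring rule, topic labels and data below are all illustrative assumptions – but it shows the feedback loop: content similar to what you engaged with before scores higher, so it’s what you see next.

```python
# A minimal, hypothetical sketch of a preference-based feed ranker.
# Nothing here reflects any real platform's algorithm; the scoring
# rule and topic labels are illustrative assumptions only.

from collections import Counter

def rank_feed(posts, click_history):
    """Order posts by how often the user clicked similar topics before."""
    # Count past clicks per topic, e.g. {"politics-left": 12, ...}
    topic_affinity = Counter(click_history)

    # Score each post purely by the user's affinity for its topic:
    # the more you clicked a topic, the higher its posts rank.
    return sorted(posts,
                  key=lambda post: topic_affinity[post["topic"]],
                  reverse=True)

# A user who has mostly clicked one viewpoint...
history = ["politics-left"] * 12 + ["politics-right"] * 2

posts = [
    {"title": "Op-ed A", "topic": "politics-left"},
    {"title": "Op-ed B", "topic": "politics-right"},
    {"title": "Op-ed C", "topic": "politics-left"},
]

# ...sees that viewpoint pushed to the top of the feed. Each click on
# the top items feeds back into the history, tightening the bubble.
for post in rank_feed(posts, history):
    print(post["title"], "-", post["topic"])
```

Run repeatedly, a loop like this converges on a single viewpoint: every click strengthens the very affinity that decided what was shown in the first place.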
The danger arises when we don’t realise that what we’re seeing is filtered. This state of ‘intellectual isolation’ can deepen political and social polarisation and promote extremism in otherwise balanced people, because when we don’t see both sides of an argument, we leave ourselves open to propaganda and manipulation.
Trust me, I’m telling you stories
We may think of filter bubbles, and the ‘fake news’ they spread, as a relatively new phenomenon perpetuated by social media. In fact, human beings have always lived in the age of ‘post-truth’. Lies and fictions surrounded us long before Facebook or Trump’s tweets, as author Yuval Noah Harari notes in his book ’21 Lessons for the 21st Century’.
“Homo sapiens is a post-truth species, whose power depends on creating and believing fictions. Ever since the Stone Age, self-reinforcing myths have served to unite human collectives. Indeed, Homo sapiens conquered this planet thanks above all to the unique human ability to create and spread fictions. We are the only mammals that can cooperate with numerous strangers because only we can invent fictional stories, spread them around, and convince millions of others to believe in them.”
Are filter bubbles eroding democracy?
For democracy to prevail, we all need to be equally informed. But how can this happen if we’re only seeing a filtered view of the world?
It’s not just the internet that’s to blame – biased media sources such as TV and newspapers play their part too, swaying voters in elections to suit their own political leanings.
However, the internet has a much more dramatic effect, thanks to preference algorithms. During the 2016 US election and the Brexit vote, filter bubbles left people intellectually isolated from alternative viewpoints. With everyone in their own Facebook or Google bubble, half of the US didn’t realise until the results came in that the other half of the country was frustrated enough to elect Trump. And by then it was too late.
The same, of course, could be said of Brexit. There is even growing evidence that some of the falsehoods circulated during the 2016 US election were part of a Russian propaganda campaign to sway the result. (1)
Voters should always be wary of the political messages they see online. Unregulated political messaging presents a difficult trade-off between protecting free speech and preventing the manipulation of undecided voters.
But while filter bubbles undoubtedly limit political diversity, it has to be noted that part of the bubble is the user’s own choice. No matter what algorithm Facebook uses to build our news feed, we’re still naturally more likely to friend or follow people who share our beliefs – that’s just human nature. And even when people are given links to contrasting views, they still tend to prefer their usual sources, as a study by Facebook’s data scientists has shown. (2)
“61 percent of millennials use Facebook as their primary source for news about politics and government…” – Pew Research
So with the recently announced launch of ‘Facebook News’ in the UK, the tech giant’s algorithm is set to deliver a range of ‘personalised’ news based on the reader’s interests. The move is an attempt to end Facebook’s long-running friction with news publishers, as advertising spend has continued to shift to big tech rather than to smaller, independent or local news outlets.
The stated aim is to support the long-term viability of newsrooms: smaller news organisations with low reader numbers that sign up to the platform will have the chance to put their content in front of new audiences. Data from the US shows that more than 95% of traffic on Facebook News comes from people who have not read those particular publications before. (3)
Will this really give us a more objective view of the news? Or will the personalisation settings simply keep us firmly in our filter bubbles? Only time will tell.
How do we make sure we’re getting an objective view of the world?
You have to have your wits about you in the era of the filter bubble.
Be aware of micro-targeting, don’t blindly believe everything you see or read, and investigate why you are seeing particular adverts, for example by clicking the ‘why am I seeing this ad?’ button on Facebook.
Delete or block browser cookies, clear your search history, and read news sites that aim to offer a wide range of perspectives.
Combating the spread of ‘fake news’ takes effort in a so-called ‘post-truth’ era, in which the public has lost trust in a range of institutions. It’s often hard to judge the authenticity of the information we’re presented with – is it merely satire, or a malicious hoax?
Question, question, question.
Because only then can we see the full picture.
1. Isaac, M. and Wakabayashi, D. “Russia influence reached 126 million through Facebook alone.” The New York Times, 30 October 2017.
2. Bleiberg, J. and West, D. M. “Political polarization on Facebook.” Brookings Institution, 24 May 2017.
3. BBC News, report on the launch of Facebook News in the UK: https://www.bbc.co.uk/news/technology-55799656