“Google it” or “Ask Alexa” are phrases ingrained in the English language. We have become reliant on the world wide web and sometimes wonder how we ever functioned without instant access to information. It is certainly useful to have such an amazing data source at our fingertips, but we need to take a step back and ask: do we have access to the full dataset, and is that dataset accurate, or are we merely receiving a trickle of data that an algorithm somewhere has decided we would like to see?
A significant number of people in my social circle have no idea they live in an online filter bubble. How many times have you looked at an item of clothing online and from then on been constantly bombarded with ads for it every time you went back online? As far back as 1994, Jeff Bezos, Amazon's CEO, was one of the first to realise the potential of such algorithms: he wanted to match books to customers based on their interests (Pariser, 2014). Amazon and many others are still pushing to extract ever more data from us. Every time we use the internet, algorithms work non-stop to target users, for reasons that are perhaps not so transparent. Some onlookers may say, “What harm is that? It is hurting no-one.” When we take the time to dig a little deeper into filter bubbles, it becomes obvious this is most certainly not the case.
[Image: Eli Pariser (Wikimedia Commons, 2011)]
Wikipedia (2021) defines a filter bubble as “a state of intellectual isolation that can result from personalized searches when a website algorithm selectively guesses what information a user would like to see based on information about the user, such as location, past click-behaviour and search history.” This is a very clear and fair explanation; it is, however, rather frightening, and gives off a vibe of “Big Brother is watching you!”. You have to wonder whether enough attention is being given to moderating and investigating the impact these filter bubbles are having on us as a society. Why do we hear so little about it in the mainstream media? Two examples of where filter bubbles are commonplace are personalised Google search results and personalised news feeds on Facebook.
Exposure to filter bubbles can cause people to become detached. The lack of exposure to information that disagrees with their viewpoints effectively isolates them in their own cultural or ideological bubbles. No one set out to create algorithms that distort our online experiences in this way; they have evolved over time and are now part of our reality. A simplistic view of filter bubbles might be, “isn't it great to be shown content you like without having to go searching for it?”, for example seeing people on your social media feeds whom the platform suggests you contact because of similar interests or because they live in your locality. In some instances this is great, but in most it is not. This personalisation of social media can effectively trap users, reinforcing their belief systems by presenting misleading or inaccurate content that the social media provider has never verified as coming from a reliable source. Continued exposure to this can lead to less tolerance for an opposing point of view.
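The reinforcement loop described above can be sketched in a few lines of code. This is a hypothetical toy model, not any platform's actual algorithm: the topic names, scores, and learning rule are illustrative assumptions. All topics start with equal affinity, every click strengthens one topic, and ranking by affinity gradually pushes everything else out of view.

```python
from collections import defaultdict

class ToyFeed:
    """Toy model of interest-based ranking: every click deepens the bias."""

    def __init__(self):
        # Every topic starts with the same neutral affinity score.
        self.affinity = defaultdict(lambda: 1.0)

    def click(self, topic):
        # Reinforce whatever the user engages with.
        self.affinity[topic] += 1.0

    def rank(self, items):
        # items: list of (title, topic) pairs.
        # Highest-affinity topics float to the top of the feed.
        return sorted(items, key=lambda item: self.affinity[item[1]], reverse=True)

feed = ToyFeed()
items = [("Local news", "politics"), ("Match report", "sport"), ("New phone", "tech")]

# The user clicks sport stories three times; nothing else changes.
for _ in range(3):
    feed.click("sport")

ranked = feed.rank(items)
print([title for title, _ in ranked])  # sport content now leads the feed
```

Even this crude rule shows the dynamic the essay describes: the user never asked to see less politics or tech, yet a few clicks are enough to reorder what they are shown, and each reordering makes further clicks on the same topic more likely.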
In a Quartz article, Bill Gates (2017) says of technologies such as social media that each “lets you go off with like-minded people, so you’re not mixing and sharing and understanding other points of view … It’s super important. It’s turned out to be more of a problem than I, or many others, would have expected”. Worrying examples of filter bubbles at work include Donald Trump winning the US presidential election in 2016 and the Brexit referendum. In the case of Brexit, those who voted to leave were an older demographic, less active online, while those who voted to remain were younger and more active online. Who was correct can be debated, but the real issue is that for democracy to function, everyone needs to be equally informed, both on and offline.
Kelly (2021) wrote an article on research carried out at Princeton University by Jacob Shapiro into voting and biased search rankings, for example Russia paying for Facebook ads to alter opinion before Donald Trump's election as president. The results were interesting, if not worrying: the voting preference of undecided voters could be shifted by 20% or more. One of the main findings, however, was that people showed no awareness that their search results were being filtered, a worrying trend.
The extent of the power of filter bubbles was acknowledged by Barack Obama in his farewell address, delivered as Donald Trump prepared to succeed him as president. Obama (2017) spoke about his concerns, stating: “We retreat into our own bubbles, … especially our social media feeds, surrounded by people who look like us and share the same political outlook and never challenge our assumptions. … And increasingly, we become so secure in our bubbles that we start accepting only information, whether it’s true or not, that fits our opinions, instead of basing our opinions on the evidence that is out there”.
Jackson (2017) believes the real danger of these filters is that people do not realise the content has been customised based on their interests and beliefs. He states, “Some of these problems our fellow citizens are having kind of disappear from view without our really even realising”. Fake news is then presented, which people for the most part take to be true.
What can we do to protect ourselves from the bubble we find ourselves in? fs.blog (2021) offers some useful advice:
- Use ad-blocking browser extensions
- Read news sites and blogs that offer a wide range of perspectives
- Use incognito browsing, delete your search histories
- Delete or block browser cookies (read the terms rather than just clicking “accept all”)
What excuse do we have for allowing this to continue when Eli Pariser warned us of the dangers a decade ago? Facebook made a token effort by asking users to flag fake content and employing third-party fact-checkers. Pariser went on to found Upworthy to try to counteract these algorithms and reduce the power of the filters, believing in the power of the web to be a place “tremendous for empathy”. It is a statement I agree with when we look at GoFundMe campaigns and the support people give and receive. All is not lost; who knows what the next few years will bring? Hopefully more awareness, and users taking back power over their own data. Never underestimate the power of a bubble!
Sources
Obama, B., 2017. “I’m asking you to believe. Not in my ability to bring about change — but in yours.”. [online] Available at: <https://obamawhitehouse.archives.gov/farewell> [Accessed 14 December 2021].
Wikipedia, 2021. Filter bubble. [online] Available at: <https://en.wikipedia.org/wiki/Filter_bubble> [Accessed 14 December 2021].
fs.blog, 2021. How Filter Bubbles Distort Reality: Everything You Need to Know. Available at: <https://fs.blog/filter-bubbles/> [Accessed 14 December 2021].
Jackson, J., 2017. Eli Pariser: activist whose filter bubble warnings presaged Trump and Brexit. [online] the Guardian. Available at: <https://www.theguardian.com/media/2017/jan/08/eli-pariser-activist-whose-filter-bubble-warnings-presaged-trump-and-brexit> [Accessed 14 December 2021].
Kelly, R., 2021. Shapiro: Tracking and reacting to Russian attacks on democracy. [online] Princeton University. Available at: <https://www.princeton.edu/news/2018/01/16/shapiro-tracking-and-reacting-russian-attacks-democracy> [Accessed 14 December 2021].
Pariser, E., 2014. The filter bubble. 1st ed. New York: Penguin Books.
Pariser, E., 2019. What obligations do social media platforms have to the greater good?. Available at: <https://www.ted.com/talks/eli_pariser_what_obligation_do_social_media_platforms_have_to_the_greater_good> [Accessed 14 November 2021].
Wikimedia Commons, 2011. Eli Pariser. [image] Available at: <https://commons.wikimedia.org/w/index.php?search=Eli+pariser&title=Special:MediaSearch&go=Go&type=image> [Accessed 14 December 2021].