In the spring of 2011, Eli Pariser introduced the phenomenon of filter bubbles in his book The Filter Bubble: What the Internet Is Hiding From You: algorithms, search engines, and social media applications tend to present you with a distorted picture of the world around you. The promise of staying connected to “real-time events and world happenings” that many social media platforms claim to offer turns out to be less true than it seems. The main aim of such platforms is to present users with content they are more likely to interact with, which drives higher user engagement and generates ad revenue for the platform.
What are Filter Bubbles and Echo Chambers?
The filter bubble theory holds that algorithms, social media networks, and search engines personalise what you see on your screen by taking into account factors like geolocation, search history, gender, and more. Eli Pariser states that Google looks at at least 57 signals — which browser you use, what type of computer you are on, even what language you predominantly speak and where you are located — and uses these signals to tailor your query results. How, then, can we know that the information we see on the internet is standard, unbiased, and the same for everyone?
Personalization Technology
Pariser identified personalization technology as the primary mechanism behind filter bubbles. By design, it strengthens individual preferences for seeking out opinion-reinforcing information and excludes opinion-challenging information (Frey 1986; Garrett 2009).
As a result, you find yourself in a bubble: an enclosed space, or personalised matrix, that filters information based on your past behaviour and preferences. Algorithms decide what type of information you will see inside it — primarily content that reinforces your existing beliefs and opinions rather than challenging them. This means social networks do not give you a balanced flow of information.
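The filtering described above can be sketched as a toy ranking function. This is purely illustrative — not any platform's actual algorithm — and every name, topic label, and scoring rule here is a hypothetical simplification: posts on topics the user has engaged with before are scored higher, so opinion-reinforcing content rises to the top of the feed while unfamiliar or challenging content sinks.

```python
# Illustrative sketch of engagement-based feed personalization.
# All post data, topic labels, and the scoring rule are hypothetical.

def rank_feed(posts, user_history):
    """Order posts by how often the user engaged with each post's topic."""
    # Count past interactions per topic.
    topic_weights = {}
    for topic in user_history:
        topic_weights[topic] = topic_weights.get(topic, 0) + 1

    # Score each post by the user's familiarity with its topic, then
    # sort descending: familiar (opinion-reinforcing) topics come first.
    scored = [(topic_weights.get(post["topic"], 0), post) for post in posts]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [post for _, post in scored]

posts = [
    {"id": 1, "topic": "politics_a"},
    {"id": 2, "topic": "politics_b"},
    {"id": 3, "topic": "politics_a"},
]
history = ["politics_a", "politics_a", "sports"]

feed = rank_feed(posts, history)
# The post on the unfamiliar topic ("politics_b") lands last,
# even though the user never asked to have it demoted.
```

The key point of the sketch is that the user never opts in: the demotion of unfamiliar viewpoints falls out of a ranking rule optimised purely for predicted engagement.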
Echo Chamber
An echo chamber — literally, a space where one’s sound reverberates — is a concept similar to the filter bubble but with certain distinctions. In news media and social media, an echo chamber is an environment or ecosystem in which participants encounter beliefs that amplify or reinforce their preexisting beliefs through communication and repetition inside a closed system, insulated from rebuttal (Riaz et al., 2023).
According to selective exposure theory, individuals prefer to consume opinion-reinforcing news sources over opinion-challenging ones (Frey 1986). Individuals therefore play an active role in curating their feed and timeline: they block those who challenge their views and follow those who reaffirm them. In turn, they are subjected to the same treatment by those who find their views opposing. The result is an echo chamber — a fragmented social group whose members affirm their existing beliefs and engage in confirmation bias.
It should also be noted that concepts like filter bubbles and echo chambers are often defined in the context of what they lack (diverse viewpoints and information) rather than what they produce or cause. In other words, they are often discussed as problems that result from the absence of a perfect information exchange rather than as phenomena with distinct characteristics and outcomes.
Filter Bubble and Echo Chambers – A Pressing Concern?
With a clearer understanding of filter bubbles and echo chambers, we can now ask why they are such a concern in the digital age.
Firstly, if we exist inside a filter bubble, surrounded by information we already agree with, how can we be trusted to make informed and objective decisions? Algorithmic personalization happens without much consent from the user, so you unknowingly fall victim to confirmation bias.
Echo chambers are primarily a concern because of group polarisation — the adoption of extreme views within a homogenous group of people who share the same opinion. Sunstein (2001) predicted that algorithmic filtering would lead to group polarisation on a larger scale. In an echo chamber, discussions can become increasingly radical, deepening the divide between ideological groups and blocking constructive dialogue and compromise.
Can a society thrive when its members are driven further apart by their own digital echo chambers and filter bubbles?
Filter bubbles and echo chambers exploit human psychological tendencies toward cognitive bias, are closely linked to social identity theory, and fuel group polarisation.
They often play a part in the proliferation of online hate. Within these confined spaces it becomes easy to share discriminatory views, spread hate speech, and incite violence. Anonymity and the online disinhibition effect allow people to engage in hate without considering the real-world consequences. And because their views are constantly affirmed inside the bubble, they see nothing wrong in this behaviour — which filter bubbles and echo chambers then further reinforce.
Conclusion
Psychology plays a pivotal role in the creation and persistence of filter bubbles and echo chambers. These digital environments exploit cognitive biases and social dynamics. When it comes to online hate and the dangers of the internet, filter bubbles and echo chambers serve as breeding grounds for extremist ideologies and can lead to the normalization of harmful behaviours. Addressing these issues requires understanding the psychological mechanisms at play and developing strategies to counteract their negative consequences.