Updated: Nov 9
Author: Giorgia Piccolo
Date of Publication: 28/07/2022
Filter bubbles are filtering logics introduced by algorithms. These algorithms are increasingly opaque: we often do not realize that they operate within social networks. They define the online mechanisms of information polarization, which is itself a product of the logic of social media algorithms and of search engines such as Google. Here the attention is on the filtering logic introduced by the affordances of platforms and digital environments.
How do filter bubbles work?
Filter bubbles filter the news we see based on the behaviors we perform within social networks, offering us news and content that match our passions and our searches. As a result, the algorithm rarely offers us something we have not already seen or looked for.
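The filtering loop described above can be sketched in a few lines. This is a minimal illustration, not any platform's actual algorithm: all function and field names are invented, and real recommender systems are far more complex. The point it demonstrates is that when ranking rewards only topic overlap with past engagement, unfamiliar topics never surface.

```python
# Hypothetical sketch of a filter-bubble ranking loop. All names
# (rank_feed, engagement_history, "topics") are invented for
# illustration; no real platform API is implied.

def rank_feed(candidate_items, engagement_history):
    """Score each candidate by topic overlap with past engagement."""
    seen_topics = {t for item in engagement_history for t in item["topics"]}
    scored = []
    for item in candidate_items:
        overlap = len(seen_topics & set(item["topics"]))
        if overlap > 0:  # items sharing no familiar topic are filtered out
            scored.append((overlap, item["title"]))
    # Highest overlap first: the bubble reinforces itself.
    return [title for _, title in sorted(scored, reverse=True)]

history = [{"topics": ["football", "cooking"]}]
feed = [
    {"title": "Pasta recipes", "topics": ["cooking"]},
    {"title": "Election analysis", "topics": ["politics"]},  # never shown
    {"title": "Match highlights", "topics": ["football", "sport"]},
]
print(rank_feed(feed, history))  # ['Pasta recipes', 'Match highlights']
```

Note that the "Election analysis" item is silently dropped: the user is never told it existed, which is exactly the selective invisibility discussed below.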
Moreover, algorithms play an essential role in spreading these cultural objects on the web. They also shape the tools we use to search for and retrieve them. Algorithms have become essential for analyzing and processing the huge amount of data generated by social media, data that also comes from tracing users' actions online: a participatory platform grows and evolves through its use.
We see what we like
Next-generation filters infer what we like from what we do and from what people similar to us are interested in. The next step is to extract information to make predictions and refine a theory about who we are and what we will do. Together, these filters create for each of us a specific universe of information, a "filter bubble", which alters the way we come into contact with ideas and information.
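The predict-and-refine cycle can be made concrete with a toy model. This is a deliberately simplified sketch under invented names: the "theory about who we are" is reduced to a counter of topics, and the prediction is simply the most reinforced interest, which narrows with every interaction.

```python
from collections import Counter

# Illustrative sketch (all names hypothetical) of the cycle the text
# describes: observe behavior, refine a model of the user, predict.

class UserProfile:
    def __init__(self):
        self.interests = Counter()

    def observe(self, topic):
        """Refine the model of the user after each click or like."""
        self.interests[topic] += 1

    def predict_next(self):
        """Recommend the single most reinforced interest."""
        topic, _ = self.interests.most_common(1)[0]
        return topic

profile = UserProfile()
for click in ["cats", "politics", "cats", "cats"]:
    profile.observe(click)
print(profile.predict_next())  # cats
```

Even though the user clicked on politics once, the model predicts only the dominant interest; every accepted recommendation then feeds back into the same counter.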
It starts from the concept of selective exposure but differs in:
Hyper-personalized context: the centrifugal process operated by the filters leads users to isolate themselves within the perimeter of their own selections;
Selective invisibility: the user ignores and does not understand the reasons why they see some results instead of others;
Passive entry into the bubble: we find ourselves within a context determined beyond our conscious choices, created by filtering logics that we do not know.
Algorithms are the engine of this selective action, making some contents more visible than others while keeping invisible the principles on which that selection is based.
Two types of personalization
Borgesius distinguishes between two types of customization:
Self-selected personalization: people tend to seek opinions consistent with their own while avoiding those that question their point of view;
Pre-selected personalization: people do not consciously choose which opinions to access, as in the case of the filter bubble.
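The distinction between the two types can be illustrated with a toy contrast. This is a hypothetical sketch, not Borgesius' own formalization: in self-selected personalization the user supplies and can inspect the filter, while in pre-selected personalization the platform applies a criterion the user never sees.

```python
# Toy contrast between the two types of personalization.
# All names and the article list are invented for illustration.

ARTICLES = ["local news", "opposing op-ed", "sports", "friendly op-ed"]

def self_selected(articles, user_filter):
    """The user consciously chooses the filtering criterion."""
    return [a for a in articles if user_filter(a)]

def pre_selected(articles, hidden_profile):
    """The platform filters; the criterion stays invisible to the user."""
    return [a for a in articles if any(k in a for k in hidden_profile)]

# Self-selected: the reader deliberately avoids challenging op-eds.
print(self_selected(ARTICLES, lambda a: "opposing" not in a))
# Pre-selected: the feed is narrowed by a profile the user never set.
print(pre_selected(ARTICLES, hidden_profile=["sports", "friendly"]))
```

In both cases the "opposing op-ed" disappears, but only in the first does the user know why; that asymmetry is what makes the pre-selected case a filter bubble.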
It is clear how self-selected personalization can introduce margins of freedom into the dynamics of content choice. Even pre-selected personalization can help users make more diversified choices than the fixed offer of a broadcasting channel. The study collects various pieces of research and explains that there is no empirical evidence that personalization produces a closing effect. Furthermore, the two forms of personalization tend to act simultaneously, producing counterbalances between exposure to different thoughts and orientations and opinions congruent with one's own.
This article is the result of the personal experiences of the writer.