Filter Bubbles: Effects on Society and Privacy Online

The author and activist Eli Pariser coined the term "filter bubble" to describe a phenomenon happening online.

The expression describes a situation in which an internet user encounters only information and opinions that match their existing beliefs, a result of algorithms that personalise the user experience.

When we surf online, everything we do influences our results, even if we cannot see it happening: the device we choose, the networks we connect to, the browsers we use, and even our geographical location.

Basically, your filter bubble is your unique source of information online. What is inside it depends on who you are and what you want to see, but you don't get to decide what enters your bubble and, at the same time, you are not aware of what lies outside it.

Even if we all searched for the same thing at the same time, from the same networks and locations, we would each get different results, because the internet shows us what it believes we want to see.

In principle there should be a balance in online results, but the moment you click on one link instead of another, you feed more information to the algorithms that alter that balance. From then on, the results you see will in some way relate to what you clicked on before.
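As a rough illustration of this feedback loop (a hypothetical sketch, not any real platform's algorithm), a click-based re-ranker might simply count past clicks per topic and sort future results by that count:

```python
from collections import Counter

# Hypothetical sketch of click-based personalisation: each click on a
# topic raises that topic's weight, so future results drift toward
# whatever the user has clicked on before.
class PersonalisedFeed:
    def __init__(self):
        self.clicks = Counter()  # topic -> number of past clicks

    def record_click(self, topic):
        self.clicks[topic] += 1

    def rank(self, articles):
        # articles: list of (title, topic) pairs; sort by past click
        # count so previously clicked topics float to the top.
        return sorted(articles, key=lambda a: self.clicks[a[1]], reverse=True)

feed = PersonalisedFeed()
feed.record_click("sports")
feed.record_click("sports")
feed.record_click("politics")

results = feed.rank([
    ("Election update", "politics"),
    ("Match report", "sports"),
    ("New telescope", "science"),
])
# Sports now ranks first; "science", never clicked, sinks to the bottom.
```

Even in this toy version, the never-clicked topic keeps losing visibility, which is exactly how a bubble hardens over time.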

The biggest problem with these algorithms is that they have no civic responsibility: you can never escape your filter bubble, because they never make you aware of the other points of view that exist online.

“It will be very hard for people to watch or consume something that has not in some sense been tailored for them”
– Eric Schmidt, Google

Filter bubbles also play an important role in companies' profits. Once our data is online, it becomes part of an economic system in which it is sold to third parties, violating our privacy and, at the same time, slowly taking away our freedom to decide what we want to see. Companies such as Amazon and Netflix have reported that recommendation systems are what benefit their sales most.

So, what are the effects of filter bubbles in our society?

  • Effects on democracy

The possible effects on democracy revolve around the idea that personalised communication might affect the public sphere. In a democratic society, exposure to different opinions is part of personal development and growth; without it, you risk entering a spiral of extreme viewpoints.
The fear is that people might forget that other opinions exist and become stuck in rigid positions. Moreover, if people isolate themselves, the common experiences that act as a "social glue" in a democratic society might diminish.

  • New gatekeepers and influencers of public opinion

The newer information intermediaries, such as search engine providers and social networks, differ from traditional gatekeepers, which used to safeguard public policy goals such as media diversity, public debate, and competition in the marketplace of ideas.
The distinction lies in the mechanisms these new intermediaries use to exert control. A good example is a Facebook experiment that encouraged users to vote in the 2010 US election, showing the significant impact of these new opinion influencers. The results indicated a substantial increase in voter turnout, both directly and indirectly through social contagion.

  • Autonomy-related concerns

Personalised communication is seen as potentially limiting people's autonomy by showing them only certain things, without their awareness of the influence. However, there is also a view that personalised communication can enhance autonomy, because individuals are free to express their preferences and personalisation can surface a broader range of relevant choices.

  • Lack of transparency

A core problem of filter bubbles and personalisation is the lack of transparency. If users are unaware that they are viewing personalised content, they may assume they are seeing the same content as everyone else. It is important to promote media transparency, allowing people ethical and accurate access to information.

In conclusion, both public policy and academic discussions show concern about filter bubbles and personalised communication, because they can polarise opinion.

Since the web was designed to bring people together and make knowledge freely available, there is a clear relationship between Eli Pariser's concept of filter bubbles and the principles of the Contract for the Web.

Principle 3 → Respect and protect people's fundamental online privacy and data rights

This principle asks governments to protect users' privacy, granting them the right to access, rectify, and erase their personal data.

Demands for data should be made under defined laws and subject to judicial authority, while also encouraging companies to minimise data collection in the public interest and promoting transparency through public registers of data-sharing agreements.

Online experiences should therefore rest on a legal foundation of user rights, and it is the government's responsibility to oversee how data is handled.

Principle 5 → Respect and protect people’s privacy and personal data to build online trust

This principle addresses companies. It is only fair to grant individuals control over their privacy and data: providing clear explanations of how the data will be used, giving people back the power to decide who can access it, and minimising the data companies collect.

It is also worth remembering that we do not actually know what happens to our information once it ends up online, which is why it is important to keep track of it and know who can access it.

Otherwise we will never be able to "escape" our filter bubble and access everything the web has to offer. We receive only a small part of the information we could possibly see, without knowing why, or which parts of our data are taken into consideration, and this causes a lack of information online.

Principle 8 → Build strong communities that respect civil discourse and human dignity

This principle focuses on citizens, and again on the importance of privacy and the creation of a more inclusive web. It will be our mission to educate the next generations about the use of the web, showing them its pros and cons and how important it is to protect their privacy online, in order to create a safe and organic community in which everyone can feel included.
