
How do filter bubbles shape human ideas?


What is a filter bubble?

Human beings naturally form groups in order to fit in. We gather around shared interests, and within those groups our interests and beliefs tend to intensify. This is especially visible on social media. The term ‘filter bubble’ was popularized by activist Eli Pariser, whose 2011 book The Filter Bubble described how platform algorithms show users either what they want to see or what advertisers pay for them to see. Social media sites want to keep their audiences engaged and coming back, so they tend to surface information that will please their users. Platforms can look at signals like age, race, gender, location, and browsing history to decide what to show. From ‘likes’ alone, a person’s political leanings can be inferred, and posts that share the same view can be prioritized in their feed. This can make it very difficult for people to encounter opinions that oppose their own.
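To make that mechanism concrete, here is a minimal, hypothetical sketch in Python of how ranking posts purely by predicted agreement can narrow a feed. The stance scores, the single-signal engagement model, and the function names are illustrative assumptions, not any real platform's system.

from dataclasses import dataclass

@dataclass
class Post:
    topic: str
    stance: float  # simplified "viewpoint": -1.0 (against) to +1.0 (for)

def predicted_engagement(user_stance: float, post: Post) -> float:
    # Score a post higher the closer its stance is to the user's own.
    # Real systems combine many signals (likes, demographics, browsing
    # history, watch time); this single-signal version is illustrative only.
    return 1.0 - abs(user_stance - post.stance) / 2.0

def build_feed(user_stance: float, candidates: list[Post], size: int = 3) -> list[Post]:
    # Keep only the top `size` posts by predicted engagement.
    ranked = sorted(candidates, key=lambda p: predicted_engagement(user_stance, p), reverse=True)
    return ranked[:size]

if __name__ == "__main__":
    posts = [Post("policy X", s) for s in (-0.9, -0.4, 0.0, 0.5, 0.8, 1.0)]
    for post in build_feed(user_stance=0.8, candidates=posts):
        print(post.topic, post.stance)
    # The feed is dominated by stances near +0.8 (0.8, 1.0, 0.5), while the
    # dissenting posts (-0.9 and -0.4) never surface: the filter bubble effect.

Real recommender systems are far more complex, but the same feedback loop applies: whatever earns engagement gets shown more, and whatever is shown more earns engagement.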

Real-world effects

While only seeing pleasing information and content on social media may sound appealing, it creates what are known as ‘echo chambers.’ These lead people to assume that everyone thinks as they do and that opposing views hardly exist. There are many accounts of filter bubbles influencing the Brexit vote and the 2016 United States election, both of which produced outcomes few expected: Britain voted to leave the European Union and the U.S. elected President Trump. Then-Twitter CEO Jack Dorsey suggested that Britons who voted to leave the E.U. were ‘victims of filter bubbles.’ A study of the 2016 U.S. election found that politically polarized people were more likely to turn to Facebook for election news. Another study, from 2015, suggested that more than 60% of Facebook users were entirely unaware of any curation on Facebook at all, believing instead that every single story from their friends and followed pages appeared in their news feed. More recent examples involve groupthink around new internet phenomena such as NFTs and artificial intelligence: the rise of NFTs saw many people jump on the bandwagon only to lose money in the long run. Misinformation about important topics like Covid-19 has also been amplified when filter bubble curation keeps people from seeing a wide range of opinions and studies.

Hope for the future

A study conducted in the UK, Germany, and France found that citizens actually encounter a substantial amount of political information online that they disagree with. Another study, by Elizabeth Dubois of the University of Ottawa, found that the “majority of people already reach outside their political comfort zone: they actively seek out additional sources that convey diverse views that do not match with their preconceptions.” Ideological echo chambers and filter bubbles on social media are the exception rather than the norm; however, that does not mean they are nonexistent. Even though many people seek out varying perspectives and reliable data, having even a small percentage of people living in filter bubbles can be dangerous. A democratic society depends on citizens with open minds and informed opinions. Filter bubbles hinder that education, leading people to believe their opinion is always the correct one and that opposing perspectives are either wrong or nonexistent. We must be careful about which sites we give our information to, stay aware of the bubbles we are placed in, and look past the bias created by social media.
