Yonsei University Graduate School of International Studies

The Echo Chambers of the Internet

“[Technologies such as social media] let you go off with like-minded people, so you’re not mixing and sharing and understanding other points of view … It’s super important. It’s turned out to be more of a problem than I, or many others, would have expected.” — Bill Gates

After the Brexit vote and Trump’s election, huge segments of the public were in utter disbelief at the results; many had simply assumed that things would turn out differently, as everyone around them had been predicting all along. Other large groups of people, meanwhile, were content that the outcome was just as they had predicted. The division between disbelief and contentment was stark, as if some invisible force had deliberately withheld information about the bigger picture.

Roughly a decade ago, Facebook, among other Internet innovators, became highly interested in content-based recommendation, a subcategory of information filtering. Filtering itself is made necessary by the ever-increasing flow of information, and as an activity it has existed throughout human history. When a government wants to exclude material it deems harmful from society, for example, it acts as a gatekeeper to the wider audience, filtering out what it judges unfit. The Internet, however, has made this phenomenon more hidden, more inconspicuous.

Facebook’s interest in filtering methods grew as the site and the content on the platform expanded. Shortly after introducing the newsfeed, the company created a filtering algorithm in response to the growing number of daily posts. Without a filter, a single Facebook account could receive more than 1,500 posts in its daily newsfeed, so the content a user might prefer to see would likely be buried in the mass. The algorithm was developed to direct the flow of posts so that those deemed most relevant would surface and those less likely to be “liked” would be hidden. While useful on the surface, such algorithms have often left people estranged from groups with different interests and opinions. As Mark Zuckerberg has reportedly put it, “the squirrel in your courtyard might be more relevant to someone than the dying people in Africa”.
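The ranking logic described above can be illustrated with a minimal sketch. This is not Facebook’s actual algorithm (which is proprietary and far more complex); the `Post` class, the topic-affinity scores and the `rank_feed` function are all hypothetical, chosen only to show how sorting candidates by predicted engagement and keeping a top slice naturally buries unpreferred content:

```python
from dataclasses import dataclass

@dataclass
class Post:
    source: str
    topic: str

def predicted_like_probability(post, topic_affinity):
    # Hypothetical score: how strongly has this user engaged with the topic before?
    return topic_affinity.get(post.topic, 0.0)

def rank_feed(candidates, topic_affinity, feed_size=10):
    """Sort candidate posts by predicted engagement and keep only the top slice."""
    ranked = sorted(candidates,
                    key=lambda p: predicted_like_probability(p, topic_affinity),
                    reverse=True)
    return ranked[:feed_size]

# A user who mostly likes pet content sees pets, not world news:
affinity = {"squirrels": 0.9, "world-news": 0.1}
candidates = [Post("local-page", "squirrels")] * 8 + [Post("reuters", "world-news")] * 8
feed = rank_feed(candidates, affinity, feed_size=8)
```

Note that the world-news posts are never rejected outright; they simply fall below the cut-off, which is why the hiding is so inconspicuous to the user.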

Because Facebook has become a main source of news for many of its users, their newsfeeds now largely shape their overall understanding of the world. It is not only the squirrel in the yard: the newsfeed determines how, and which, current issues are presented to users, and thereby creates the so-called “bigger picture”.

Journalist Veli-Pekka Hämäläinen ran a test of such filtering that showed how effectively and how quickly Facebook’s algorithms altered the information reality a user experiences. In the experiment, a profile created by the journalist was set to follow mainstream news sources and to “like” every new page Facebook suggested. After a while, the profile began searching for immigration issues, especially protests that had occurred. As it liked more racist groups and sources, within two weeks the newsfeed no longer showed anything the mainstream news sources were posting, despite the profile still following them. Instead, it was filled with racist groups, questionable sources and threatening posts.

It would be wrong to think this is only an issue for those who lean conservative in their political or social views. The shock that followed the Brexit and Trump results demonstrates that those on the left who thought of themselves as open-minded and diverse in their information consumption also have echo chambers formed by their newsfeeds.

As the results sank in, some began investigating why they had been missing information. However, the filtering is not a feature that can simply be turned off: whom you follow, who your friends are, which links you click and how long you stay on a page all feed an algorithm that tries to determine your preferences, regardless of who you are. The results of such filters, not only on Facebook but also on other services using similar algorithms (Google, Twitter and Spotify, for instance), depend heavily on the user’s earlier habits.
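The feedback loop behind these signals can be sketched as well. Assuming, purely for illustration, that each click or long read nudges an inferred interest score toward the observed behaviour (the update rule and topic names here are hypothetical, not any platform’s documented method), repeated one-sided engagement quickly polarises the profile:

```python
def update_affinity(affinity, topic, signal, learning_rate=0.2):
    """Nudge the user's inferred interest in a topic toward the observed signal.

    signal is 1.0 for strong engagement (click, long dwell time) and
    0.0 for ignoring the content; new topics start at a neutral 0.5.
    """
    current = affinity.get(topic, 0.5)
    affinity[topic] = current + learning_rate * (signal - current)
    return affinity

# Ten sessions of clicking one kind of content while scrolling past another:
profile = {}
for _ in range(10):
    update_affinity(profile, "protest-videos", signal=1.0)
    update_affinity(profile, "mainstream-news", signal=0.0)
# The gap between the two scores widens with every session, so a ranking
# filter fed by this profile skews the feed further in the same direction.
```

This is why the filtering cannot be “turned off” by the user: the profile is rebuilt continuously from ordinary browsing behaviour, and each skewed feed produces the very engagement data that skews the next one.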

Although it seems good in principle that people see what they prefer, the results can be disturbing in their effectiveness and their skewed sense of reality, as Hämäläinen’s test showed. In 2011, Eli Pariser, an ICT professional and active follower of politics, argued that creating such filter bubbles around unaware users breeds close-mindedness. Moreover, as Professor Tomas Chamorro-Premuzic argues, although humans more readily accept information that aligns with their beliefs, we need information that contradicts them in order to accept differences in a pluralistic society. Studies have also noted that children exposed to conflicting information are capable of more flexible and creative thinking, whereas children who see only information of a similar type, especially information parallel to their own opinions, struggle to understand other perspectives.

Alarmingly, it is easy for users to be unaware of how much their feed is filtered, why it is filtered, and how they can avoid these echo chambers. Moreover, as competition among service providers is intense, filtering material is an attractive method for a growing number of Internet-based services, from search engines to dating agencies. It has also been used to entice businesses and organisations, political or even racist groups among them, as a way to promote their commercial interests. So the user’s feed can become not only a bubble of similar minds discussing the same things, but also a path by which ads, clickbait content and fake news find their audience. Facebook and Google have developed a very successful way to make money and gain power from people who may be largely unaware of what rights they have or of how to keep filtering algorithms in check.

Recently, Facebook stated that it is taking measures to create broader, more balanced newsfeeds and to highlight authentic content. It has also offered users ways to manage the filtering. However, much damage has already been done, especially to those who have grown dependent on the information their newsfeed provides. Since people are skeptical of, and unwilling to believe, information that contradicts their views, an updated, more neutral newsfeed may even provoke emotional backlash. The shocking results of 2016 may have made us globally aware of, and more critical towards, the big Internet players. Yet end users, the actual individuals using social media, should also become aware of their filter bubbles and do their part to avoid being trapped inside an echo chamber that represents but a fraction of the bigger picture.
