Filter Bubbles

In a previous post, David brought the concept of the “filter bubble” to my attention. While I was partially aware of the ideas behind filter bubbles, I had never investigated their effects in detail.

A filter bubble is usually the result of systems that personalise the information shown to individual users. Filter bubbles are encountered when using services such as Google Web Search (Google), the Facebook social networking site (Facebook), YouTube (Google) and Windows 10 (Microsoft). These companies, along with many others, attempt to provide more “relevant” and “personalised” information and content based on what they know and assume about individual users. This filtering is usually done to make the experience more efficient and enjoyable for users, and to increase the chances that users encounter content they are likely to spend money on. Customers should be happier and corporations should be more profitable.
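To make the idea concrete, here is a minimal sketch of how such a personalisation filter might work. This is purely my own illustration, not any real company’s algorithm: items whose topics overlap a user’s inferred interest profile are ranked higher, so content outside that profile quietly drops out of the feed.

```python
# Toy personalisation filter -- illustrative only, not any real service's algorithm.
# Items are scored by how well their topics overlap a user's inferred interests,
# so content outside the profile rarely surfaces: a filter bubble in miniature.

def interest_score(item_topics, user_interests):
    """Fraction of an item's topics that match the user's interest profile."""
    if not item_topics:
        return 0.0
    return len(set(item_topics) & set(user_interests)) / len(item_topics)

def personalised_feed(items, user_interests, top_n=3):
    """Return the top-N items for this user, highest interest overlap first."""
    ranked = sorted(
        items,
        key=lambda item: interest_score(item["topics"], user_interests),
        reverse=True,
    )
    return ranked[:top_n]

if __name__ == "__main__":
    # Hypothetical content pool and user profile.
    items = [
        {"title": "Local election coverage", "topics": ["politics", "local"]},
        {"title": "New smartphone review",   "topics": ["tech", "gadgets"]},
        {"title": "Climate policy debate",   "topics": ["politics", "environment"]},
        {"title": "Football highlights",     "topics": ["sport"]},
    ]
    user_interests = ["tech", "gadgets", "sport"]

    for item in personalised_feed(items, user_interests):
        print(item["title"])
    # The political and environmental stories never appear for this user,
    # even though they exist in the pool.
```

Even a filter this crude shows the trade-off: the feed feels more “relevant”, but whole categories of content the user never asked to exclude simply disappear.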

Unfortunately, as outlined in the YouTube video below, this is not always what users want, nor is it in their best interests.

I find it extremely uncomfortable to think that corporations control what content I can and cannot see. Why should they decide which news stories or social causes I am made aware of? Corporations and governments have their own interests, which may conflict with mine. With that in mind, I don’t believe it is wise to rely on them to decide which content is most relevant to my needs.

Another interesting concept attached to filter bubbles is the “echo chamber”. “Echo chamber” is a metaphorical term for situations where information, beliefs and ideas are amplified through repetition within a closed environment. Filter bubbles create such closed environments by limiting the diversity of discussion users are exposed to within online communities. Studies have found that this promotes more extreme views and discourages open intellectual discussion (Del Vicario et al., 2016).

So how can we avoid filter bubbles? One suggestion is to use the anonymous or private browsing modes built into most web browsers. These can hide your identity from websites, limiting their ability to identify you and make assumptions about you. Unfortunately, online tracking has become far more sophisticated than many people realise, and there is no way to truly know whether you are still browsing within a filter bubble. As a student, I think it is important for me to become more aware of who is supplying the information I am exposed to, what their motivations are and what liberties they are taking to “personalise” my experience.

Update: I’ve just noticed that an earlier post by Brigitte touches on this same subject. Her analysis raises some excellent points.

Reference

Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., et al. (2016). The spreading of misinformation online. Proceedings of the National Academy of Sciences, 113(3), 554–559. http://dx.doi.org/10.1073/pnas.1517441113
