Google, Facebook, Yahoo News, the New York Times and many other services that provide web search, filtering or social networking are increasingly serving up results that align with your existing opinions and values. There’s an invisible shift in how information flows online. Eli Pariser’s TED Talk and a Guardian article brought this new phenomenon, and the dangers behind it, to my attention.
What is going on?
Algorithmic editing is being built into web searches and social networking feeds. Your clicks, internet behaviour and many other pieces of information about your online presence are collated in order to ‘provide you’ with a personalized feed. Two different people in two different locations searching the same terms in Google may get very different results depending on their political persuasion, habits, outlook on life, hobbies and so on.
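To make the mechanism concrete, here is a minimal sketch of how such personalization can work. This is not Google’s code or any real service’s; the profile-building, the topic tags and the two users are all hypothetical, invented for illustration. The point is simply that ranking by past clicks means two people typing the same query see different worlds.

```python
from collections import Counter

def build_profile(click_history):
    """Tally topic tags from a user's past clicks into a simple interest profile.
    (Hypothetical: real systems use far richer signals.)"""
    profile = Counter()
    for tags in click_history:
        profile.update(tags)
    return profile

def personalized_rank(results, profile):
    """Order results by overlap with the profile: the more a page matches
    what you clicked before, the higher it ranks -- and the less you see
    of everything else."""
    def score(result):
        return sum(profile[tag] for tag in result["tags"])
    return sorted(results, key=score, reverse=True)

# Two users issue the same search but have different click histories.
alice = build_profile([{"politics", "left"}, {"politics", "left"}, {"cooking"}])
bob = build_profile([{"sports"}, {"politics", "right"}, {"sports"}])

results = [
    {"title": "Progressive take on the budget", "tags": {"politics", "left"}},
    {"title": "Conservative take on the budget", "tags": {"politics", "right"}},
    {"title": "Match report", "tags": {"sports"}},
]

print([r["title"] for r in personalized_rank(results, alice)])
print([r["title"] for r in personalized_rank(results, bob)])
```

Run this and Alice’s top result is the progressive article while Bob’s is the conservative one, even though the query and the candidate pages were identical. Neither user is ever shown what was demoted, or told why.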
There is no ‘standard Google’. We are each sitting in a personalized bubble of information that filters out unexpected pages and people that don’t directly correspond to our internet consumption patterns. Our online ‘consumption’ is increasingly tailored to what companies think we want, which is not the same as what we may need as citizens, and here lies the terrible clash and ominous shift.
The Internet is becoming less a device for connecting you to the world and more a device for wrapping you up in a predictable bubble that reflects your own prejudices back at you. This is worrying.
It’s worrying because being a citizen in a democracy means we need to deliberate on shared issues and be exposed to opinions and information that don’t always agree with our existing beliefs. If everything we see merely confirms what we already think, we become increasingly blinkered and ignorant of opposing ideas and the diversity of views. The very foundations of liberal democracies rest on the idea that we deliberate and debate shared issues and are exposed to each other’s views, even when they directly challenge our own. Without this, no democratic deliberation can occur.
Sitting inside our customized filter bubbles, we never even get to see what has been edited out of our view; it disappears invisibly.
There’s a struggle going on underneath, Pariser says, between our aspirational future self and our present self. By showing you results that reflect your immediate choices, your ‘aspirational choices’ never get a chance to be seen by you. It’s as if your bubble keeps serving up the ‘junk food’ of information that your impulsive self clicks on straight away, and edits out the more complex or challenging options you might choose upon further reflection. This skews your choices towards ‘junk’.
Google and others are now algorithmic gatekeepers that limit what we are exposed to. But unlike newspaper editors, who were visible gatekeepers whose values you knowingly subscribed to, these gatekeepers are invisible and reinforce values you aren’t consciously choosing. This is likely to have deleterious effects.
We need to see points of view that are uncomfortable, challenging and represent a range of values. This is essential for a functioning democracy. We need to be aware of the filter bubble and its potential to lull us into a reality that may suit corporations or us as consumers, but doesn’t suit democracy and us as citizens.
We need algorithms that encode a sense of public life, with its diverse and sometimes challenging values, and of our civic responsibility, which can only function well when continuously exposed to such a range of views. We also need transparency about the editing process, and we need to get our agency back, so everyone can decide for themselves whether or not to live in a filter bubble!