Thursday, 12 September 2013

Filtering the Web (by Simon Wakeling)

The emergence of the internet has given its users previously unimagined access to information. A corollary of this deluge is information overload: the difficulty users face in finding, filtering and processing relevant information from amongst the sea of available content. The web services that have emerged to connect us with this information, from search engines to social networks, have in recent years increasingly turned to personalisation as a means of helping users navigate the web successfully.

In many cases this personalisation takes the form of algorithmic content filtering, in which implicit user feedback channels ranging from geography to click history to the activity of other users in the same social network determine the information presented to the user (indeed, it has been suggested that Google uses up to 57 different data sources to filter its content). An unintended consequence of this process, which it should be noted is rarely made explicit to the user, is to gradually and invisibly disconnect information seekers from contrary content. A much-used example concerns the political liberal whose regular consumption of left-wing sources and active online network of similarly minded friends eventually sees any conservative material excluded from his search results and news feeds. The result is a “filter bubble”, a term coined by Eli Pariser to describe the self-perpetuating results of the information intermediary’s algorithm. Pariser sees web users inhabiting “parallel but separate universes” in which the absence of the alien undermines objectivity.
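To make the mechanism concrete, here is a minimal sketch of implicit-feedback filtering of the kind described above. It is purely illustrative: the function names, topic labels, scores and blending weight are my own assumptions, not a description of how any real search engine or social network actually works.

```python
# A toy illustration of implicit-feedback personalisation.
# All names, topics and weights are hypothetical.
from collections import Counter

def personalised_ranking(results, click_history, personalisation_weight=0.5):
    """Re-rank results using the topics a user has previously clicked.

    results       -- list of (title, topic, base_relevance) tuples
    click_history -- list of topic labels the user has clicked before
    """
    topic_counts = Counter(click_history)
    total_clicks = sum(topic_counts.values()) or 1

    def score(result):
        title, topic, base_relevance = result
        # Share of past clicks matching this result's topic.
        affinity = topic_counts[topic] / total_clicks
        # Blend general relevance with personal affinity: the higher the
        # weight, the more the user's own history filters what they see.
        return ((1 - personalisation_weight) * base_relevance
                + personalisation_weight * affinity)

    return sorted(results, key=score, reverse=True)

if __name__ == "__main__":
    results = [
        ("Tax cuts explained", "conservative", 0.80),
        ("Welfare reform debate", "liberal", 0.75),
        ("Election polling round-up", "neutral", 0.70),
    ]
    history = ["liberal", "liberal", "liberal", "neutral"]
    for title, topic, _ in personalised_ranking(results, history):
        print(f"{title} ({topic})")
```

Even in this crude sketch the liberal reader's history pushes the conservative item to the bottom of the list, despite it having the highest general relevance, which is exactly the dynamic Pariser describes.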

It is interesting to note that philosophers have already begun to engage with this issue. From an epistemological perspective, Thomas Simpson has noted that in an age when search engines are increasingly seen as “expert”, an inbuilt and invisible confirmation bias increases the potential for beliefs to be formed on the basis of “reasons unconnected with the truth”. Others have begun considering the ethical implications. Is the filter bubble a form of censorship? Do we as users have a right to expect objectivity from the systems we use?
 
These are interesting questions for Information Scientists to consider. First, though, as scientists we might want to investigate whether the phenomenon actually exists and, if it does, how it can best be measured and its implications studied. Is it necessarily a bad thing? We might also begin imagining solutions. An example of work already undertaken in this area is Balancer, which presents a stick man leaning under the weight of the “liberal” or “conservative” news sources the user has accessed (a rough sketch of this kind of tally appears below). It’s certainly novel, despite its limitations and acknowledged imperfections. Perhaps, though, the significance of the Filter Bubble problem requires more revolutionary solutions. We just need to work out what they are…
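The sketch below shows the kind of tally a Balancer-style tool might keep. It is not the actual tool, just an assumed illustration of the underlying idea: label the sources a user reads and summarise the balance as a single lean score.

```python
# A toy, Balancer-inspired tally. Source names and labels are hypothetical.
SOURCE_LEANINGS = {
    "examplelefty.com": "liberal",
    "examplerighty.com": "conservative",
    "examplewire.com": "neutral",
}

def reading_lean(visited_urls):
    """Return a score from -1 (all liberal sources) to +1 (all conservative)."""
    liberal = sum(1 for u in visited_urls
                  if SOURCE_LEANINGS.get(u) == "liberal")
    conservative = sum(1 for u in visited_urls
                       if SOURCE_LEANINGS.get(u) == "conservative")
    labelled = liberal + conservative
    if labelled == 0:
        return 0.0
    return (conservative - liberal) / labelled

if __name__ == "__main__":
    visits = ["examplelefty.com"] * 7 + ["examplerighty.com"] * 2
    # Negative values mean the reading history leans left.
    print(f"Lean score: {reading_lean(visits):+.2f}")
```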

Monday, 9 September 2013

Decision making for microbes

A long, long time ago, in a life far, far away, I spent several years working as a researcher in the life sciences.  When I shifted into information science, I found myself looking at aspects of the subject from a biologist's perspective.

One of the aspects I found myself considering was decision making.  Living things benefit if they can make the best use of the resources that surround them.  To do so, they need information.  Or so I suggested in a paper I wrote ten years ago (Madden, 2004), in which I argued that when organisms evolved the ability to move (and possibly before), they evolved a need for information.

I recently presented some of these ideas at one of our discussion meetings and, as is always the case at these meetings, the talk went in some interesting directions.  Unsurprisingly, I was pulled up for my lax use of the term "World View".  I had quoted Checkland's assertion that "Judging from their behaviour, all beavers, all cuckoos, all bees have the same W[orld View], whereas man has available a range of W[orld View]s" (Checkland, 1984, p. 218).  It was, I conceded in the discussion, a questionable assertion that I had not questioned.


Checkland, P. B. (1984). Systems Thinking, Systems Practice (2nd edn). Chichester: Wiley.
Madden, A. D. (2004). Evolution and information. Journal of Documentation, 60(1), 9-23.