The internet — once hoped to infinitely expand our mental horizons and our exposure to challenging ideas — now seems more likely to confine us to our prejudices.
When you search in your web browser today, for any given term, your search engine (Google, for the overwhelming majority of Australians) retrieves pages that it thinks you will be most inclined to take an interest in, based on your personal search and browsing history.
Day by day, with each moment of online interaction, search engines are etching a more detailed portrait of our interests, and filtering out the world beyond those interests.
Personalised search means that when two people type an identical term into Google, the results displayed could be quite different. It means, on the plus side, that Google's results are ranked by specific and contextual relevance rather than just by what other people have clicked on.
Meaning, for example, that typing in 'I like' no longer autocompletes to 'I like to tape my thumbs to my hands to see what it would be like to be a dinosaur'.
On the other hand, it could mean that your web history influences your web present.
Personalised searches are not a deliberate attempt to censor information, although search engines have occasionally been found to engage in such practices (see, for example, Siri's suggestions for abortion clinics). Your search is personalised with the best intentions of filtering out the masses of irrelevant information and presenting you with the pages most relevant to you.
And indeed, the results that your search engine provides you with are dependent mostly on your tastes: the pages you have visited, the search terms you use, and links you have clicked.
More recently, Google also incorporates information from your social network (only Google+ at this stage) including links, photos and comments. Google cookies diligently squirrel this information away every time you use the web in an effort to bring you more relevant results next time you search.
It is not only your searches that are filtered in this way. On Facebook, posts from friends whose viewpoints you share or whose updates you dwell on are privileged to the exclusion of posts that do not interest you.
For me, this means I receive proportionally more content from a younger friend whose posts are so misspelled and grammatically confused that I spend minutes with my cursor hovering above the letters, attempting to make sense of her thoughts. I no longer see posts from conservative or devout friends, or those who constantly post about their children, because Facebook knows I don't care.
In fact, almost my entire news feed is populated with the latest polls, articles in trade journals, and features from Eureka Street.
The hard-working algorithms of the net are not trying to limit us. They are mirroring our behaviour and preferences, and encouraging us in our specific interests. The problem is that in having our tastes reflected back at us, we become more and more narrowly defined and cut off from the diversity of interests available to us, and the great potential of the web for sharing perspectives is lost.
I've contributed to the state of my news feed through my own actions: following political groups and journalists and clicking their links. But it's bad enough that I let my work get in the way of taking an interest in my friends and family — I don't need Facebook to do it for me.
Moreover, the state of public discourse — online or offline — is already fractured, with opinion and commentary substituting for journalism in most of our news media. The increasing personalisation of web content may sound the final death knell for news reported objectively to meet the needs of the broad population, as news can be segmented by target audience to maximise click rates.
In this environment, commentary reigns: one person's Annabel Crabb is another's Andrew Bolt. Google's role in this is minimal: it merely wipes Andrew Bolt from my news feed, deeming his opinion irrelevant to my interests. While for that I am forever thankful, the wider implications are disturbing.
When I digest information written in alignment with my own views and you with yours, we both lose the opportunity to have our views broadened, challenged or changed. Worse, exposed predominantly or exclusively to my own views and the views of those like me, my position is reinforced and perhaps tends to the extreme, and I become unsympathetic to alternative perspectives.
That is bad enough for a rather uninfluential private citizen such as me. Consider now that Annabel Crabb recently wrote a blog post noting that her research is likewise constrained by Google's interpretation of her interests.
At present, the effect of Google's personalised filters is not dramatic, and the option of disabling personalisation is available on both Google and Facebook.
But as the algorithms for tailoring personalised content become more sophisticated, as mobile devices become more pervasive and as content becomes more plentiful and specific, there is potential for the isolating effect of our own preferences to become greater.
A web advertising expert recently told me of efforts underway to develop retinally-projected digital media. Advertisements, targeted to your location and purchase history, will be projected directly to you, invisible to others, from the personal device through which you view your world.
How much of our concern for a shared society will we lose when we no longer share even a sensory experience of the world around us?
Edwina Byrne works in media and communications.