What to do about shareable lies


Last week, executives from Facebook, Google and Twitter appeared before three US congressional committees to answer questions about how their platforms were used to influence voters in the 2016 elections.

It is a significant moment. Other media like radio, newspapers and television have never been grilled like this, though Fox News and The New York Times surely influence voters, too.

Perhaps one difference is that the nature of legacy media influence is clear. It mimics political binaries in the US. It has also been regulated for decades, including degrees of transparency around advertising, classification and distribution. Political ads, in particular, are subject to Federal Election Commission rules, including disclaimers that declare who authorised and financed them.

None of these apply to social media. For most of its life, it has been perceived as neutral, a mere conduit for streams of thought. The rough convention was that users are the publishers, not the tech company.

No gatekeepers. Anonymity. Content was just content, not endorsements. Traffic or 'engagement' became the primary value. The more speech, the better.

Not surprisingly, it was women and minorities who first pointed out that this was not sustainable. The hardest push for 'Block' and 'Report' buttons came from groups vulnerable to abuse and threats online. This perversely put the onus on individuals to protect themselves, rather than on harassers to stop harassing, or on platforms to make their forums safe.

It demonstrates how the libertarian principle driving social media, that all speech is equal, means we could hear less and less from those we need to hear the most. As former Google engineer Yonatan Zunger put it: 'If someone can impose costs on another person for speaking, then speech becomes limited to those most able to pay those costs.' Technology ends up replicating socioeconomic differentials rather than dismantling them, as it claims to do. If you can pay, then you play.

 

"We would have to hope that teachers and students are being given latitude when it comes to learning how to engage critically with information online."

 

The hearings reveal how this model has been exploited in other ways. It is no longer in dispute that Russian troll farms, such as the St Petersburg-based Internet Research Agency, spread inflammatory content via social media. The initial sample of paid ads and metadata shows that both sides of politics have been studiously targeted.

Facebook handed over 3000 ads and removed at least 470 fake accounts. An estimated 15 per cent of Twitter accounts, or 48 million, are fake or automated. Google traced more than 1000 YouTube videos to the Internet Research Agency.

The political impact of shareable lies can be hard to extract from partisan feeling, especially when it seems to have benefited one candidate. But the truth is that election results can be a poor measure of anything, much less the success of mischief online. Too many variables are involved, not least of which are susceptibilities in the electorate to such messaging. These must be addressed, as they predated the election and will outlast the hearings on Russian 'active measures'.

So much has been made of foreign interference, and not enough of how the social amplification of misinformation could be so immediate and widespread.

The obvious counterpoint to this is education, both at civic and school levels. We would have to hope that teachers and students are being given latitude when it comes to learning how to engage critically with information online.

The more complex part of limiting responses to inflammatory material is good government. It is not as easy to get someone upset about their government, or politicians in general, when they can see that their healthcare, jobs, house and future are relatively secure. An inclusive discourse helps.

Our leaders essentially need to give voters reasons to see lies for what they are, instead of going, 'hey, that could be true', when a malicious meme appears on their Facebook feed.

Until this comes to pass, tech giants have choices to make about the content that is circulated on their platforms. They are being exploited by malevolent actors, too.

There is a political case to be made, which will be left up to Democrats and Republicans who have so far been sorely dissatisfied with vague commitments to do better.

An economic case could also be made that an editorial voice, as proposed by Zunger, would maximise user engagement and add value. It would certainly keep social media platforms in the mainstream, where the revenue is, rather than on the fringe.

But there is also a moral case. Should private companies continue to profit from a supposedly neutral model that puts individuals, and even democracy itself, at risk?

 

 

Fatima Measham is a Eureka Street consulting editor. She co-hosts the ChatterSquare podcast, tweets as @foomeister and blogs on Medium.

Topic tags: Fatima Measham, Google, Facebook, Twitter


 


Existing comments

Social media doesn’t kill democracy. People kill democracy.
Roy Chen Yee | 09 November 2017


Just like when we watch too much trash on TV, or talkback radio or gossip from friends - the best way to avoid trash talk is to switch it off! And to model this advice for our children is also the best way to deal with it - trash doesn't deserve to be analysed and respected. Just turn off the WiFi and let kids go outside and climb trees and dig in the garden.
AURELIUS | 10 November 2017


And people in every age have employed various contemporaneous means to strut their democratic stuff. We would be intuitively and socially as thick as six planks not to be alive to both the beneficial and harmful impacts of heavy social media use. We would be imperceptive indeed to avert the eyes from the rancid impacts. Thanks Fatima for incisive analysis and your evident commitment to the health and vigour of civil society.
Wayne Sanderson | 10 November 2017


There seems to be little discourse around the need for readers/users to be responsible for what they proliferate, pass on, and give space and energy to. I know, it is a long-term, hard subject, and it has to be taught from an early age. Unfortunately I seem to notice we are going backwards towards a culture of blaming, rather than educating. Historically, people in power have used their influence to advance their causes, by any means. No surprise there. Now, their voices are being amplified and made popular at great speed, by the use of social media in mindless ways. But this is only possible because readers/users are uncritically passing 'cool things' on, without concern about the validity, credibility and even feasibility of the 'news'. Choosing what we read, and putting in effort to find out whether it is authoritative, credible or false, is our individual responsibility as parents, teachers, journalists and so on, as well as readers/users. Not to give space, time and energy to gossip, but to energise what is in line with our principles, is everyone's responsibility. I like what you write. Thank you Fatima, I look forward to reading your thoughts on the above.
Antonina Bivona | 10 November 2017