Seargeant and Tagg on Fake News and Digital Literacy

Philip Seargeant and Caroline Tagg, authors of Taking Offence on Social Media, respond to the Digital, Culture, Media and Sport Committee's interim report on its investigation into 'fake news'.

On 29 July the Digital, Culture, Media and Sport (DCMS) Committee published its interim report on its investigation into ‘fake news’, with a lengthier version due in the autumn. Much of the report’s focus is on the reform and regulation of electoral practices in the era of social media. However, as language and education specialists, we are pleased that it also recognises the need for ‘a unified approach to digital literacy’, including changes to the school curriculum and a public information campaign. In this article we reflect on the findings – and limitations – of this aspect of the report, focusing specifically on the role that higher education can play in tackling the phenomenon.

The DCMS Committee’s enquiry was set up in January 2017 to look at ways of combatting the ‘widespread dissemination … and acceptance as fact of stories of uncertain provenance or accuracy’. As the enquiry developed, a particular concern for the committee became the way that data is used and shared, and its exploitation for purposes of propaganda. For this reason, much of the report focuses on the influence of technology – and the companies and organisations that work with it.

In an interview we conducted with the select committee’s chair, Damian Collins, he pointed to areas where he felt government was able to act to combat the current situation. These include transparency around online information, data protection laws to ‘check that [companies are] holding data in a way that complies with the law’, and ensuring that social media companies have a legal ‘responsibility to curate that space in a responsible manner’. These issues are now explicitly laid out in the report, with most recommendations involving changes to electoral law and regulation of social media companies.

But technology is only one part of the equation. It is also important to understand why people share false stories, and the effect this type of misinformation actually has on people’s actions. After all, the spread of misinformation online is related to how people use sites like Facebook – and this is shaped by the fact that Facebook is, first and foremost, a social space.

The report recognises this, citing the evidence we gave to the Committee in January that ‘to many people Facebook was not seen as a news media site, but a “place where they carry out quite complex maintenance and management of their social relationships”’. As our research shows, when people post to Facebook they potentially address a range of different social ties, from close family members to colleagues and acquaintances. Managing these various relationships all at the same time, without offending or upsetting anyone, can be a tricky process. Because of this, what someone shares or likes is often determined as much by the ties they have with their network as by a strict evaluation of the content’s credibility.

For this reason, as we argued in our own evidence to the committee, any solution to the problem needs to include educational measures alongside technological ones. In line with this, the report rightly recommends that digital literacy become ‘the fourth pillar of education, alongside reading, writing and maths’ in the school curriculum, and that this will require co-ordinated action between DCMS and the Department for Education, funded in part by an educational levy on social media platforms.

This is all very welcome. One limitation, however, is that the measures focus too narrowly on data management and technology’s role in the spread of information. Our research suggests that, alongside the current recommendations, education should also address what we call social digital literacies. In addition to traditional digital literacy skills, we need to foster greater critical awareness among the general public of how our social interactions and relationships play an important part in influencing our decisions about what to share or like – and how this in turn can contribute to the circulation and visibility of news in the online environment.

A second limitation of the report is that the educational measures it sets out fail, as yet, to envisage a role for higher education in equipping people with the digital literacy skills necessary for tackling fake news. This can hopefully be reconsidered, not only because UK universities have a great deal of experience in teaching digital literacy skills, but also because they are ideally placed to provide this type of support. One key area taught in higher education is precisely the set of critical reading and thinking skills that social media users require if they are to learn how to evaluate online news, identify false information, and appreciate how their judgements might be shaped by their social concerns. For the recommended ‘unified public awareness initiative’ to be successful, it needs to include this critical element, which will enable people to adapt their skills both to changes in technology and to the evolving attempts made to mislead them.

Dr Caroline Tagg is Lecturer in Applied Linguistics at the Open University, UK. Philip Seargeant is Senior Lecturer in Applied Linguistics at the Open University, UK.