#SocSciMatters


Seargeant and Tagg on Political Discourse on Social Media

Philip Seargeant and Caroline Tagg, authors of Taking Offence on Social Media, discuss the importance of social science research in understanding how social media use creates a ‘filter bubble’ effect.

When Apple updated its iPhone operating system in 2014, it introduced an enhanced form of predictive texting, which uses an algorithm to offer suggestions about which words you might want to type next. This algorithm goes so far as to make recommendations based on what it knows about the different styles you use in different contexts. As the company explains, it:


"takes into account the casual style you might use in Messages and the more formal language you probably use in Mail. It also adjusts based on the person you’re communicating with, because your choice of words is likely more laid back with your spouse than with your boss."


The influence that computer algorithms are having on the way we communicate became a major talking point in the latter half of 2016. The Brexit referendum in the UK and Donald Trump’s presidential victory in the US drew fresh attention to the power that social media now wields in mediating political debate, and the possible dangers this can produce, especially in terms of the way that misinformation and extremist partisan views get circulated. In the immediate aftermath of the US election, much of the media post-mortem focused on the role that personalisation algorithms, especially that used by Facebook, play in filtering information and determining the types of news and opinion people have access to via their feeds, thus creating so-called ‘filter bubbles’.

As we discuss in this article, however, an understanding of what people themselves do on social media – how they organise it as a space for social and, to an extent, political interaction – is equally important in determining the diversity of opinion and perspective to which people are exposed. Algorithms may be transforming the way we interact with technology, but the actions and agency of users are still a vital part of how social media platforms operate as forums for communication.

Personalisation algorithms are key to the consumption of information over the internet these days. It has been argued, however, that an unwanted effect of this is the creation of what Eli Pariser calls ‘filter bubbles’ – the ghettoisation of online communities due to the way an algorithm feeds a user content which they are likely to find appealing, while at the same time filtering out views with which they probably disagree. One of the consequences of this, so the argument goes, is that it facilitates the spread of fake news, because misleading, biased or downright false information circulates unchecked.
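Facebook’s actual ranking system is proprietary and far more sophisticated, but the basic mechanism Pariser describes can be made concrete with a minimal sketch: score each post by how much the user has previously engaged with its source, and drop anything that falls below a threshold. Everything in the snippet below – the function name, the threshold and the data – is hypothetical, intended only to show how such filtering narrows what reaches the feed.

```python
def rank_feed(posts, engagement_history, threshold=0.2):
    """Toy personalisation filter: each post is scored by how often the
    user has engaged with its source in the past, and posts scoring below
    the threshold (typically sources whose views the user ignores or
    dislikes) never reach the feed at all."""
    def score(post):
        return engagement_history.get(post["source"], 0.0)

    visible = [p for p in posts if score(p) >= threshold]
    return sorted(visible, key=score, reverse=True)

# Sources the user rarely engages with are filtered out entirely,
# narrowing the range of views the feed presents.
history = {"like_minded_page": 0.9, "opposing_page": 0.05}
posts = [
    {"source": "like_minded_page", "text": "A view you already agree with"},
    {"source": "opposing_page", "text": "A view you might disagree with"},
]
print(rank_feed(posts, history))  # only the like-minded post survives
```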

For some commentators, this is primarily a technology problem, requiring technological solutions. This has resulted in sustained pressure on Facebook to implement a raft of measures to try to mitigate some of these issues. But approaching it only from this perspective risks taking an overly technologically determinist position – assuming that the technology is, in effect, deciding our behaviour for us. We would argue instead that it is also important to consider the significance and detail of people’s interactions with the technology.

Within communications theory, an important principle is that you are always communicating with someone, and that you adapt and modify your communication (both its style and content) with an audience in mind. As the opening quote from Apple illustrates, this principle now even underpins the workings of the predictive texting algorithm. One of the notable elements of social media communication, however, is that in many contexts we don’t know for sure who the audience is. A post on Facebook can be viewed by anyone in our network. It could be shared beyond that network depending on how other people interact with it (and how much care we’ve taken with our security settings). And this is true of pretty much anything you send via the internet. As the journalist Olivia Nuzzi memorably puts it: ‘Dance like no one is watching; email like it may one day be read aloud in a deposition’.

Yet even with an unseen or unknown audience we still tailor our messages in the same way. We do this by projecting an idea of who the audience is likely to be and anticipating their reactions, based on our experience of them from past encounters. In other words, we imagine our interlocutor, and shape what we say and how we say it based on the version of their persona that we conjure up in our head. And in shaping the type of communication we engage in on Facebook, we play a part in shaping how Facebook operates as a communicative space.

How does all this relate to filter bubbles? Of relevance to these issues in the context of recent political events is our own research, presented in Taking Offence on Social Media, which suggests that the actions of Facebook users themselves help to create a filter bubble effect and, to a certain extent, close down open debate. One notable finding was that, contrary to popular belief, people were actually exposed to a great deal of diversity in opinions and values, because their social networks include people from all parts of their lives. In this respect, the influence of the algorithm in creating silos of like-minded opinion was not that salient. But an awareness of this diversity of opinions and values had an effect on how people shaped their own communication, so that they often tried to express themselves while at the same time not provoking argument or causing offence. In order to manage the communicative space in a convivial way, people often filtered out conflicting views by ignoring or blocking posts, or occasionally defriending people, when confronted with views they strongly disagreed with – rather than challenging or arguing against these views.

Why was this? One reason is that people fear the online context is liable to lead to misunderstanding, both because written communication lacks the non-linguistic cues of spoken communication and because the network they belong to includes a range of differing opinions and values. They also reported taking care over what they said themselves – as well as how they said it – so as not to antagonise contacts such as family members or work colleagues whose views differed from theirs, but whose friendship they wanted or needed to maintain.

The appeal of the algorithmically created filter bubble as an explanation for partisan political debate on Facebook is easy to understand: if the problem is a technological one, it should also be relatively easy to fix. But what social science research reveals is the part played by users themselves. To put the focus solely on what the technology does or should be doing, therefore, is to abdicate responsibility for how our own actions shape the role of social media in society. The evidence shows that a site like Facebook is very much what we, as users, make of it.


Dr Caroline Tagg is Lecturer in Applied Linguistics at the Open University, UK. Philip Seargeant is Senior Lecturer in Applied Linguistics at the Open University, UK.