Jo Adetunji is the managing editor at The Conversation.
There has been a long-running argument in media circles over whether big social media companies are platforms or publishers.
Social media companies take content, and with it much of the advertising revenue that would otherwise have gone to the publisher. That has significantly dented publishers' ability to make money from their own content, since platforms such as Facebook can profit from others' content for free.
Being considered a platform comes with benefits. It also means these companies do not take on responsibility for the content, including any legal liability, which remains with the original publisher.
So what happens when that content is produced by us, the users of these platforms? As trolling, hate speech and distasteful comments become increasingly common flashpoints, social media platforms are having to wade in and moderate content, something a publisher would typically do. So where does the buck stop for social media platforms when content is deemed hateful or antisocial? And who decides what breaks the rules?
We have never had a better time to air our thoughts and opinions so publicly. In the US, free speech is enshrined in the First Amendment. In the UK, freedom of expression was enshrined in law as part of the Human Rights Act 1998. This is not, of course, the right to say whatever you want, whenever you want. Blasphemy laws, obscenity laws, not to mention laws against slander and defamation, have all put limits on what the everyday person can say in public. The spread of the printing press had, by the end of the 18th century, brought political satire to the masses. But it is only today, enabled by social media, that we can all lampoon at will, share conspiracies, and push the boundaries of what is acceptable or healthy.
And social media is the new court of public opinion. We know what we like and dislike, what we agree and disagree with, but there may be someone out there who begs to differ. Some shout ‘free speech’ while others call for people to be ‘cancelled’. There are some issues we collectively agree on: there is no space for antisemitism, racism or homophobia, and I hope we reach a point where all of us recognise them when we see them. But when a line is clearly crossed, who should police it?
Facebook, Twitter, and YouTube recently signed up to GARM, the Global Alliance for Responsible Media, a cross-industry initiative to create a common set of definitions for hate speech and other harmful content and an independent audit of how the platforms deal with this type of content. It remains to be seen how far this will go in tackling the problem.
And yet last month Facebook introduced its independent Oversight Board, which its users, as well as those of Instagram, can appeal to if they believe their posts have been unfairly removed. It will be interesting to see what impact, if any, the board’s decisions have on Facebook’s content policies.
Culture wars and free speech clashes are not new, and we see them in offline spaces too – universities are no strangers to debates over who should and should not be allowed to, well, debate. The social media platforms did not create this, but they amplified it, and are now themselves in the midst of it.
Our need for fairness makes us want to give equal weighting to every voice. News outlets often strive to feature both sides of any argument, even if there is factual evidence that one side is wrong. And social media platforms have picked up that cause and appear to be running with it. However, this is a dangerous game to play as those with ulterior motives will shout far louder and for far longer than those who favour rational, balanced rhetoric.
Social media platforms must act responsibly. In addition to joining initiatives such as GARM, they must build tools that create better environments for users and avoid building algorithms that set the foundations for division.
But there is a wider social debate going on, one that sometimes seems drowned out on the platforms themselves. We need to be able to see the bigger picture: how we talk to each other, and how we debate difficult issues while staying respectful. Social media can bring us together, and it can drive us apart. But it is we, the public, who should decide what to expect, and demand that social media platforms do more to create healthier spaces.