The Draft Online Safety Bill (OSB) was published on 12 May 2021 and aims to make the UK the safest place in the world to be online while defending free expression. Among other things, the Bill establishes a new regulatory regime to address illegal and harmful content online, with the aim of preventing harm to individuals in the United Kingdom. It imposes duties of care on service providers in relation to illegal content, as well as duties to protect users’ rights to freedom of expression and privacy. The Bill is of significant interest to councils, and the LGA will be working with partners to ensure it is as impactful as it can be when formally introduced into Parliament.
Whilst social media companies have implemented some approaches to tackle abuse and intimidation online, there remain widespread calls for social media sites to do more to address online abuse, harassment and intimidation. These calls focus in particular on three concerns: that much of the current approach depends on tackling content after it has been posted; that significant responsibility lies with the person receiving the abuse and harassment; and that current approaches are perceived to have had limited success. One challenge of placing responsibility on users to deal with online abuse themselves is that, even if someone ignores or blocks an abusive account, this does not always stop that content being published, potentially stirring up ill feeling online.
Evidence has shown that users are served content that aligns with their existing views, creating ‘echo chambers’ online where abusive content can be shared and amplified. This kind of abusive discourse can then escalate into offline violence, as is believed to have been the case in the murder of five people in Plymouth in August 2021. Therefore, while tools to support users are important, it is essential to prevent harmful content, in particular violent and threatening content, from being published and shared in the first place by building safety into platform design.
The LGA therefore supports the recommendation of the Draft Online Safety Bill Joint Committee (the “Joint Committee”) to include in the Bill a specific responsibility on service providers to have in place systems and processes to identify reasonably foreseeable risks of harm arising from the design of their platforms and take proportionate steps to mitigate those risks of harm. The LGA calls for this to include explicit reference to users with protected characteristics.
The LGA also supports the Joint Committee’s recommendation that Ofcom should be required to produce a mandatory Safety by Design Code of Practice, setting out the steps providers will need to take to properly consider and mitigate these risks.

There have recently been calls for a ban on anonymity on social media to tackle online abuse. Proponents of a ban highlight that users can feel ‘protected’ by their anonymity and emboldened to say things they would not say in person, while the police can find it difficult to trace anonymous users. Whilst the LGA has sympathy with these calls, we recognise the benefits of maintaining the option of anonymous accounts, from whistleblowing to protecting the voices of those who are not safe to speak out under their own names, such as people suffering domestic abuse or LGBTQ+ young people living in unaccepting homes or communities.
The Joint Committee, following its scrutiny of the Bill, concluded that “anonymity and pseudonymity are crucial to online safety for marginalised groups, for whistleblowers and for victims of domestic abuse and other forms of offline violence. Anonymity and pseudonymity themselves are not the problem and ending them would not be a proportionate response.” It also made a range of recommendations to tackle the challenges posed by anonymous accounts, including a requirement that Ofcom include proportionate steps to mitigate these risks as part of its recommended mandatory Safety by Design Code of Practice.
For these reasons the LGA believes that focussing on preventing abusive content before it is posted, and ensuring appropriate responses to abusive content, is a more appropriate approach to tackling online abuse and harassment than banning anonymous accounts. The LGA therefore supports the recommendation of the Joint Committee that platforms should be required to take proportionate steps to mitigate risks posed by anonymous and pseudonymous accounts.
To improve the experience of all users online, users must be encouraged to be respectful of each other, including where there are opposing views. Improving media literacy is one part of this, ensuring users understand the impact of their posts on others, and are able to recognise the kinds of mis- and disinformation that can spark abuse of others. The other side is improving civility and respectful debate in wider society.
The Government, the Independent Press Standards Organisation and Ofcom should consider how to ensure the media and politics lead by example in relation to civility and respect. Local government recognises its own leadership role here and the LGA will continue to develop its Civility in Public Life programme.