The Online Safety Bill: a step towards digitally safe mental health support
The UK government’s Online Safety Bill has been a hot and contentious topic of discussion over the past year, but last week it was finally announced and included in the Queen’s Speech. For Mental Health Awareness Month, Tim Barker looks at why a new bill could be a step towards digitally safe mental health support.
There’s been a heavy reliance on technology during the pandemic for home schooling, home working and some semblance of social connection. Digital connectivity has also played a key role in supporting the NHS and helping the UK government deliver efficient public services. In an interview with Andrew Marr last summer, UK health secretary Matt Hancock stated that during the pandemic 50% of patients had turned to video conferencing for GP appointments and to online services. The same acceleration towards digital is evident in how people search for mental health support.
The Kooth digital mental health platform alone has experienced an 89% upsurge in usage by our adult service users, and usage by children and young people has increased by 42%. Over the past 12 months, user data shows that 17% of adults accessing professional intervention say they think about hurting themselves or feel suicidal nearly every day – a 40% increase on the previous year. Among 10 to 13-year-old service users, presentations of self-harm are up by 33% and suicidal thoughts are up 54% on last year.
The increasingly digitally connected world has put an even heavier focus on how the nation interacts with online communities, social media platforms and their content. This shift towards digital has both its advantages and challenges.
From a mental health perspective, we have a duty of care as a nation to ensure that online content is appropriate. This is not just for those with current or emerging mental health difficulties but for everyone – we are all at risk of developing poor mental health given the right set of circumstances and triggers. This is the reason the Online Safety Bill is so important to the industry.
A mentally healthier UK can only be achieved if we are proactive with our interventions – and this must include online and social media content. It’s therefore vitally important that social media platforms and online sites do not contain messaging that promotes harmful behaviours and attitudes – for example, in relation to suicide, self-harm and eating disorders – or abusive content of any kind.
Let’s also be clear: from a mental health perspective, the Online Safety Bill is not about censorship. Content moderation (and pre-moderation) is a matter of defining and applying a set of rules on what is acceptable and what is not. It is something that sits – or should sit – at the heart of any digital mental health service; any provider worth its salt will ensure that all community content is pre-moderated. Mental health platforms that are fully safeguarded can maximise the safety of individuals as well as the wider community.
However, beyond moderating content for appropriateness, one of the most pertinent aspects of moderation is the ability to identify user-generated content, comments or questions that signify a person in distress.
To truly embed safeguarding there needs to be responsive action, again as a duty of care. It’s not enough simply to block content from publication; platforms must also respond and signpost the content creator to the support they may need – the post could be a call for help.
It doesn’t end there. Collaboration is needed between all healthcare providers – including digital mental health platforms – social media platforms and others to support efforts to make online platforms safer. Providers of digital mental health services can be an invaluable source of information, with decades of experience in this domain and an in-depth understanding of the ‘digital disinhibition effect’ – which is important to consider alongside more targeted victimisation attempts.
The UK government’s Online Safety Bill – specifically the importance of content moderation – is only one part of a bigger discussion on how we support and protect the nation’s mental health, and how we ensure that no one is left behind.
To address the mental health and wellbeing needs of the nation, choice and diversity in how people access mental health support must be embraced – there is no longer a one-size-fits-all approach to care. There must be a focus on providing support outside of the clinic or the therapy room, meeting the needs of the nation by providing support when and where it is needed. Digital support should be embedded into wider support structures to provide a safety net, but we need to make sure that any content is safe and informative and signposts those in need towards help.
If the UK government’s Online Safety Bill protects even one person from turning to harmful content and online sites relating to suicide and eating disorders, then it has done its job.
About the author
Tim Barker is CEO of Kooth. He has over 30 years of experience in the B2B software industry, helping to build and scale SaaS industry leaders. In his journey from software engineer to CEO, Tim founded Koral (acquired by Salesforce) and led EMEA marketing at Salesforce as it scaled to become a billion-dollar business. He was previously CEO of DataSift, a privacy-by-design analytics and AI platform, acquired by Meltwater in 2018.