Ofcom, the UK’s communications regulator, has today set out more than 40 practical steps that services must take to keep children safer online.
These steps include a requirement for robust age checks on websites and apps to prevent children from seeing harmful content relating to suicide, self-harm and pornography, as well as the filtering or down-ranking of harmful material in recommended content.
Responding to the announcement, Operations Manager for Catch22’s The Social Switch Project, Richard Smith, said:
“The introduction of stringent age verification processes and the requirement for social media platforms to tame aggressive algorithms are pivotal steps towards protecting young users from harmful content. Measures that will prevent children from being added to group chats on platforms without their consent are particularly welcome. Giving under-18s the ability to block and mute accounts, and disable comments on their posts, empowers them to have more control over their social media interactions and protect themselves from abuse.
“Strong regulation is vital, but so is education. Social media isn’t going to go away and, alongside the very real risks it poses, there can be huge opportunities for young people on these platforms.
“Our Social Switch Project, funded by the London Mayor’s VRU, supports young people – and those who work and live with them – to navigate the challenges of the online world and realise its potential. Building online resilience, coupled with effective regulation, can help people of all ages flourish safely online.”