Catch22 is a charity and social business. We design and deliver services that build resilience and aspiration in people and communities, supporting over 160,000 people annually facing social disadvantage. We operate across the justice, education, social care, employability and skills sectors.
Catch22 supports thousands of children and young people impacted by violence and exploitation across the country every year. We deliver the County Lines Resilience and Support Service, specialist child criminal exploitation services, gangs and violence prevention programmes in prisons, embedded youth work in A&Es for young victims of violence, and our innovative Social Switch Project focusing on online harms.
The Social Switch Project focuses on addressing online harms by educating professionals, parents, and trusted adults on supporting young people’s safe online engagement. Through its CPD-certified Online Harms Training, it raises awareness of harmful online behaviour and its links to real-world violence. Funded by the London Violence Reduction Unit, the project also helps young people in London develop digital skills for the workforce via its Digital Skills & Employability Programme. Since 2019, over 2,300 professionals have been trained, and 79% of young participants have secured employment or further training. The project collaborates with key partners to promote safer online environments and resilience against online risks.
There are many structural drivers of violence impacting children and young people, such as poverty, school exclusion, and racial inequity. Catch22 therefore thinks that violence impacting children and young people is preventable – not inevitable. This requires a cross-governmental approach at both national and local level, in which communities are genuinely involved.
As such, we believe that any links between social media and violence should not be understood or approached in isolation, but should instead be addressed in their wider context. Preventing violence online and offline requires not only that structural risk factors are addressed, but also a long-term commitment to community-based, trauma-informed approaches delivered by trusted adults, and services which offer holistic, tailored support such as therapy, help with jobs and school, and safe, positive social media use.
To this end, Catch22 would like to make the following core recommendations to the Youth Select Committee:
1. How do young people experience violent content on social media, and to what extent does this exposure link to incidents of serious violence?
Given the complexity of the causes of violence impacting children and young people, Catch22 thinks it is important to understand social media as one of the interconnected factors relevant to violence prevention and early intervention. While we acknowledge the links between access to, and engagement with, violent social media content and offline violence, these should ultimately be understood and addressed in their wider context. Through our services supporting young people at risk of violence and exploitation, we see this played out particularly around feelings of safety and child criminal exploitation.
1.1 Feeling unsafe
- Experiences of feeling at risk of violence. Many children and young people carry a knife because they feel unsafe. Recent YEF research showed that 67% of teenagers feel at risk of violence, and 5% carry a knife, primarily for a sense of protection and/or because they have been asked to do so. Children and young people with additional vulnerabilities, such as school exclusion, are more likely to possess a weapon.2 Evidence from our own Redthread service, which delivers embedded youth work for young victims of violence in A&E, indicates that 80% of the 11-25 year olds it supported in 2023/24 reported living in an area where they witness or experience regular violence.
- Social media and fear of violence. Evidence shows that many children inadvertently access violent content pushed on platforms, and one in nine have come across content featuring e.g. zombie knives.3 Young people have also reported that weapon imagery in e.g. online violence prevention campaigns by the police causes fear, especially for those already traumatised by violence, which increases the chances of them carrying weapons out of a sense of self-protection.4 Indeed, children strongly recommended that the police stop using such imagery online.5
1.2 Online sales of knives and weapons – It is relatively easy for young people to buy knives online, both from UK-based and overseas online retailers. We therefore welcome further regulation of this, provided it avoids the risk of increased criminalisation of children and young people. Please see also our response to question 5.
1.3 Child Criminal Exploitation (CCE) – There are strong links between CCE and online platforms, including the use of social media and online gaming for grooming child victims, with evidence that increased time spent online results in higher vulnerability to CCE.6 Indeed, 97% of Catch22’s child criminal exploitation referrals have an online or social media element. However, recommendations to include CCE as a potential harm to children in the Online Safety Act7 have not been reflected in the legislation.
1.4 Other forms of relevant content – In our Social Switch Project, children and young people regularly report the ease of online access to (illegal) content linked to increased risks of offline violence, including:
- Dis/mal-information triggering violence (e.g. Southport riots)
1.5 Acceleration – Social media can play a role in accelerating offline intrafamilial tensions and violence because of the speed with which information can be shared, and distorted, amongst peers and/or ‘rivals’.8 Increasing young people’s awareness of safe social media use can therefore play a crucial part in violence prevention.
1.6 The role of algorithms – Algorithm-driven platforms play a significant role in amplifying violent or harmful content, increasing the likelihood of children accessing and engaging with it. By promoting content that generates strong emotional reactions, often favouring sensational or disturbing material such as violence and explicit imagery, these algorithms drive engagement.
- Effects on young people – According to Ofcom’s 2023 report, young adults aged 18 to 24 spend an average of 4 hours and 36 minutes online daily.9 Because algorithms work on recency, content type, virality, watch time, likes, shares and comments, young people are particularly vulnerable to algorithmic influence. During the 2020-2021 lockdowns, 70% of young people reported encountering violent or explicit material, such as videos of suicide or extreme violence.10
- Normalisation – Such algorithms increase exposure to harmful content and thus risk further desensitisation to, and normalisation of, violence, contributing to unsafe behaviours.
- Addiction – Children and young people engaging in our Social Switch Project tell us that much of their interaction with social media is unhealthy, but that they continue to watch negative content, which the algorithms contribute to. The Online Safety Act requires social media platforms to consider how algorithms affect exposure to harmful content, but not to prevent harmful addictive behaviours.
Against this background, Catch22 recommends:
- Investment in services, such as the Social Switch Project, to raise awareness amongst children and young people about how algorithms work, how this impacts them, and how to create healthy boundaries around social media use.
- Avoiding the use of violence-related imagery in online violence prevention campaigns to reduce the risk of perceptions of insecurity among children and young people.
- Ensuring that the Ofcom Online Safety Act Codes of Practice include detailed procedures required from online companies and social platforms to address child criminal exploitation.
Best practice example: Violence prevention, employability and digital resilience
Catch22’s Social Switch Project started from the belief that to truly have an impact on levels of violence affecting children and young people, we must help to tackle its causes, such as the lack of positive alternatives and access to meaningful careers.
The Social Switch Project therefore delivers a Digital Skills & Employability Programme, free for 16-30 year-olds living in London. Funded by the London Violence Reduction Unit, and working together with VCS and corporate partners, the programme aims to divert young people impacted by, or at risk of, violence towards fulfilling careers. Young people access extensive in-programme and post-programme support, tailored to each individual, with positive outcomes around employment, apprenticeships, and further training.
As part of the programme, Social Switch educates young people on e.g. safe online behaviours and algorithmic impact and supports the development of their digital literacy and resilience.
In 2024, over 1,000 young people signed up for the Social Switch Digital Skills & Employability Programme, gaining essential skills in coding, digital marketing, and career development.
For more information visit our website.
2. How do social media companies ensure that young people have age-appropriate and safe experiences on their platforms?
Social media companies have implemented various policies and tools aimed at ensuring that young users have age-appropriate and safe experiences on their platforms. These often include age restrictions, privacy settings, content moderation systems, and reporting mechanisms designed to flag harmful content. However, the effectiveness of these measures is inconsistent across platforms.
2.1 Challenges in enforcement – Social media companies face significant challenges in enforcing age restriction policies and content reporting mechanisms. While many platforms have age restrictions, the lack of strong verification systems often allows vulnerable young people to create accounts, including under-13s. Alongside the implementation of the Online Safety Act, there remains a real need for better verification technologies,11 as well as stricter repercussions for guideline breaches.
2.2 Challenges in reporting harmful content – Just one in six young people flag harmful content online, often citing unclear processes, past inaction, or fear of repercussions as barriers.12 While some companies claim to remove 99% of harmful content before it reaches 10,000 views, there remains scope for them to be more proactive.13 Furthermore, through our Social Switch Project, Catch22 sees that harmful pages can be removed and accounts revoked when reported, but that there tends to be a lack of follow-up with the users creating that content. Online companies and platforms should be held further accountable for ensuring such users do not get other opportunities to post similar harmful content elsewhere via other accounts.
3. How effective is the Online Safety Act 2023 in protecting young people from harmful content that promotes violence on social media?
4. To what extent does Ofcom, the communications regulator, have adequate powers and resources to regulate social media platforms, particularly with regard to young people?
Catch22 has welcomed the Online Safety Act 2023 and its stricter accountability for technology companies, its regulation of harmful content and algorithms, and its intention to better protect children. We see the Act as a significant step towards addressing harmful content, including violence-related content, on social media, but have real concerns about e.g. the pace of implementation, the lack of transparency, and the emphasis on self-assessment. Catch22 would therefore underline the need for timely and robust reviews of the effectiveness of both the Online Safety Act and Ofcom’s powers.
3.1 / 4.1 Pace of implementation – While we appreciate the care taken in consulting with stakeholders on the draft guidance and codes of conduct, and any delays due to the general election, we are disappointed with the pace at which Ofcom is developing the guidance and codes of practice to implement the changes set out in the Act, especially those around illegal content and protecting children from harmful content, including algorithms. We appreciate that this work has now progressed and welcome the requirement for platforms to have assessed risks of illegal content by March 2025. We were, however, disappointed to note that the Protection of Children Codes and children’s risk assessment guidance will now not be published until April 2025.14
3.2 / 4.2 Gaps in regulation – We are also concerned about gaps in the guidance and codes of conduct, particularly in relation to:
- A lack of robust measures for illegal content to be taken offline.
- The scope for strengthening the age assurance technology.
- The heavy reliance of the codes of guidance on self-assessment by online platforms and tech companies, with insufficient detail around transparency, auditing and enforcement.
- The lack of attention in the codes of practice to how social media is being used for child criminal exploitation, and how this should be addressed and prevented.
3.3 / 4.3 Review of effectiveness – Catch22 would like to see an early review of the impact of the implementation of the Online Safety Act and the codes of guidance. This should include independent auditing of online companies’ compliance. Where necessary, codes of conduct and Ofcom’s powers should be strengthened to improve impact, compliance and, ultimately, the safety of children and young people.
3.4 / 4.4 Increasing collaboration – Catch22 thinks there is scope for increased and better collaboration between the VCS supporting children and young people, government, Ofcom and online companies to continuously improve safety practices, prevent regulatory gaps, and ensure that children and young people are properly consulted.
5. How effective is current UK Government policy more widely, in protecting young people from harmful content and preventing the promotion of serious youth violence?
Current UK Government policy, particularly through frameworks such as the Online Safety Act, demonstrates a commitment to protecting young people from harmful content and preventing the promotion of serious youth violence. In our responses to the previous questions, we have already indicated why, and how, Catch22 thinks this could be enhanced in terms of online platform and harmful content regulation. However, this type of regulation alone will not be sufficient, and Catch22 thinks there are further ways in which policy could be improved to keep children and young people safer from the risk of violence.
5.1 Improvement of government policy – Beyond regulation, further policy development and investment will be required. Catch22 would particularly like to highlight the need for:
- Investment in digital literacy of young people – Please see also our response to question 6. In particular, Violence Reduction Units should commission violence prevention programmes which aim to increase digital resilience.
- Collaborative prevention strategies of online harm – The Government could do more to foster partnerships and collaboration between online companies and social media platforms, VCS, and statutory agencies to create cohesive prevention strategies and violence diversion programmes.
5.2 Preventing online sales and promotion of weapons
- Online sales – It is relatively easy for young people to buy knives online, both from UK-based and overseas online retailers. We know from our county lines, child criminal exploitation and violence prevention services that young people buy or rent weapons online via apps like Snapchat or Telegram, or sales platforms such as Temu.
- Regulation – Catch22 therefore welcomes further regulation of online sales, such as recent and pending legislation on zombie knives and ninja swords, and looks forward to the recommendations of the National Police Chiefs’ Council consultation, on behalf of the Home Office, on social media and knife crime.
- Enhanced Artificial Intelligence tools – Catch22 thinks there is scope for more effective use of algorithm development to flag weapon-related keywords, images, and accounts engaging in illegal activities. Ofcom has highlighted that current AI tools are insufficient for detecting all violent content,15 so more focus should be placed on improving the use of AI and machine learning to reduce young users’ exposure to harmful content.
- Awareness of regulation – Alongside stricter measures for online providers and platforms, any such changes will also need to ensure that children and young people become sufficiently aware of the legal risks of buying such weapons online. We think that online platforms have an important role in educating young social media users about these risks and legal consequences.
Catch22 therefore recommends:
- Stronger regulation and enforcement requiring social media platforms to detect and swiftly remove listings for illegal weapons, and to mandate the reporting of such content to law enforcement agencies.
- Collaboration between social media platforms, VCS, and relevant statutory bodies to prevent violence, and in running campaigns to raise awareness amongst children and young people about the risks and legal consequences of engaging with weapon-related content.
6. What role do parents, schools, social media companies, and young people themselves play in preventing young people from being exposed to violent content on social media? What barriers do they face in addressing this issue?
Parents, schools, social media companies, and professionals working with children and young people each have a role in tackling harmful content online and ensuring the online safety of children and young people. However, our learning from our exploitation, violence and online harm services shows that each of these cohorts faces significant barriers in doing so.
6.1 Parents – 52% of parents feel overwhelmed by online complexities, and 34% lack awareness of safety tools.16 Consequently, they tend to struggle with the use of parental controls, as well as with supporting their children around e.g. boundary setting and online risk awareness.
6.2 Schools – Schools can provide digital literacy education to teach children and young people about online safety, and how to recognise and report harmful content. However, only 42% of teachers feel equipped to address online safety, not least given limited resources.17 Moreover, Catch22 advocates for better awareness of violence and exploitation risk factors for children and young people, and of how to offer appropriate support and safeguarding.
6.3 Other professionals/trusted adults – To support safeguarding and violence prevention, Catch22 thinks that other professionals whom children and young people come into contact with should also become better aware of online harms and of how to have supportive conversations with young people about them.
6.4 Children and young people – Children and young people can do much to protect themselves online. With the right education, they can learn to identify harmful content and make responsible choices about their online activities. While they must be equipped with those skills, children should not be seen as bearing this responsibility themselves. Instead, parents, schools, social media companies, the government and Ofcom all have a role in creating safer online spaces for, and use by, children and young people.
6.5 Improving education and awareness of online harms – From our services and the children and young people we support, we have learned that there is enormous potential to prevent online harm by strengthening awareness and education, not only for parents and children, but also for other professionals.
- Support for parents – Our extensive experience shows that parents becoming more controlling about their children’s social media or phone use is less effective than developing the relationships and knowledge to have positive conversations about safe social media use. Parental awareness can be improved by providing more accessible and targeted resources to help parents understand the risks young people face online, particularly the dangers of exposure to violent content. Organisations like the UK Safer Internet Centre offer free materials to help parents and carers understand the online safety issues facing their children, with guidance on how to manage their children’s digital lives. Programmes like Catch22’s Social Switch Project are valuable in educating parents about how social media platforms work, how algorithms amplify harmful content, and how to set appropriate boundaries for their children.
Best practice example – Social Switch Online Harms Training
The Social Switch Project, delivered by Catch22, offers online and in-person Online Harms Training to those closest to children and young people: parents, trusted adults and professionals whom young people come into contact with.
The training is CPD-certified and focuses on helping participants to better understand young people’s usage of social media, equipping them with the tools to help manage young people’s digital lives and spot early signs of harm.
Funded by the London Violence Reduction Unit, Social Switch has trained over 2,300 professionals working with children and young people, including the police, health sector, and youth workers.
- 98% of participants reported that the training increased their understanding of young people’s use of social media
- 94% said it improved their safeguarding practices.
For more information, please visit our website.
- Support for children and young people – For children and young people there is a strong need to focus on digital literacy, which helps them understand not only how to use social media safely but also how to recognise, avoid and report harmful content. Education on how algorithms work and how to influence the content they are shown, including violent material, is also essential. For this to be achieved, parental support alone is not sufficient. Instead, Catch22 would recommend:
- Training of professionals, teachers, and trusted adults
- Better online safety and digital resilience education at schools
- Investment in, and roll-out of, specialist VCS provider programmes which teach children and young people how to manage their digital presence.
By improving digital resilience and self-regulation skills, young people become empowered to make safer choices online and to protect themselves from harmful exposure.
At local and national level, the government should invest in programmes that enhance digital literacy for parents, professionals, and young people – starting with its Young Futures programme, which aims to tackle knife crime.
Further information
For further information, please contact: Marike van Harskamp, Head of Policy and Public Affairs marike.vanharskamp@catch-22.org.uk