
This week, debates about young people and social media have flared up again in the UK, with renewed calls to ban access to platforms for under‑16s – following in the footsteps of recent legislation in Australia that sets a minimum age for social media use. While the intentions behind these proposals are understandable, a blanket ban is unlikely to work in practice and risks misunderstanding both why young people use these platforms and what drives harm online.
Experience from Australia shows just how complex enforcing age restrictions can be: reporting has highlighted that tech companies have already had to deactivate hundreds of thousands of accounts, yet young people often find ways to circumvent the rules or migrate to less regulated sites. Simply excluding young people from mainstream social media won’t address the monetised, algorithm‑driven toxicity that fuels disinformation, polarisation and harm – and it may even curtail the positive aspects of digital connection and civic engagement that many young people experience.

It is especially ironic that many of the same political leaders now advocating a ban also presided over years of cuts to youth services – from clubs to community spaces and structured support – leaving young people with fewer places for in‑person connection, support and development. The decline in youth hubs and services has been profound, with funding slashed and vital services disappearing from towns and cities across the UK. School closures and the lack of preparedness to support children and young people during the Covid‑19 pandemic further accelerated the breakdown of in‑person relationships and the reliance on digital interaction.

Social media does not exist in a vacuum. The drivers of online behaviour – loneliness, lack of support, limited offline opportunities – are rooted in real‑world policy choices. A narrow focus on age‑based restrictions will do little to curb toxic content or protect young people unless it is paired with better regulation of platforms, investment in digital education and literacy, and meaningful alternatives that offer safe, enriching spaces both online and offline.

In 2024, the Foundation made the decision to leave X because of its role in amplifying disinformation that fuelled the racist riots of July and August of that year – a reminder that false and harmful content thrives when regulation is weak and accountability is absent.

A ban will not fix these deeper problems. Instead, we need policies that regulate content, hold platforms to account, support children’s digital engagement, and rebuild the community infrastructures that help young people thrive.

Jabeer Butt OBE
Chief Executive
For media enquiries please contact Lauren Golding at comms@racefound.org.uk





