
Online safety laws tackle illegal content just as Netflix series makes TV history
Monday 14th April 2025
Adolescence has been crowned Netflix’s most-watched UK title ever, a record set within its first two weeks of streaming, with Prime Minister Sir Keir Starmer remarking that ‘there’s a reason why the debate has suddenly sparked into life.’
The gritty four-parter follows the arrest of a 13-year-old boy, radicalised online and suspected of murdering a female peer in an act of revenge. It addresses issues such as involuntary celibacy and the manosphere, revenge porn, and cyberbullying. It has resonated with many, as the UK has seen what lead actor Stephen Graham describes as a ‘spate of violent acts committed by teenage boys against teenage girls.’
Watching Adolescence was harrowing, upsetting and uncomfortable from start to finish, but the final episode caused particular distress for viewers. It followed the young boy’s family and depicted the consequences they faced from the outside world as a result of the influence online algorithms had had on their son. The questions the family return to repeatedly in this episode are: what did we do wrong, and why couldn’t we protect him?
Adolescence poses political, ethical and legal questions about online safety and the protection of children online. From a legal perspective, the gripping portrayal of young people’s search for a sense of self in toxic online subcultures could not have come at a better time, as both the UK and Europe see a wave of legislation designed to protect children from online dangers. The introduction of these laws has involved extensive debate on issues such as what constitutes ‘harmful content’, encryption, duties on Big Tech platforms, and freedom of speech.
Online Safety Act 2023
The Online Safety Act (OSA) received Royal Assent in October 2023, and its implementation is now in full swing, with duties expected to be rolled out in phases by 2026.
The Act puts a range of new duties on social media giants and search services, focusing on systems and processes to reduce the risk of illegal activity and content. The Act sets out a list of ‘Priority Offences’, reflecting the most serious forms of illegal content against which companies must take proactive measures.
The strongest protections are designed for children: platforms must now prevent children from accessing harmful and age-inappropriate content and provide parents with clear and accessible ways to report problems online when they do arise. Children must be prevented from accessing ‘Primary Priority Content’ such as pornography and images of self-harm, eating disorders and suicide. They should only be given age-appropriate access to ‘Priority Content’ such as images of bullying, hateful content, and content that encourages serious violence, dangerous stunts, or harmful substance abuse.
The phased implementation, at a high level, is as follows:
Phase one: The Illegal Harms Codes of Practice came into force on 17 March 2025. The Codes require platforms to:
- have structured risk assessment and mitigation processes;
- monitor and moderate high-risk features;
- take proactive measures against illegal content;
- provide user-reporting systems; and
- provide clear and accessible terms of service.
Phase two: Focuses on child safety, pornography, and the protection of women and girls. Ofcom published its Children’s Access Assessments Guidance in January 2025, and providers are required to complete children’s access assessments by 16 April 2025. Platforms publishing pornographic content must introduce robust age checks, and providers will have to assess the risk of harmful content (such as images of suicide or self-harm) being accessible to children on their platforms.
Phase three: Focuses on additional requirements for certain platforms to enhance transparency and accountability. The Threshold Conditions Regulations came into force in February 2025, clarifying which services are caught by the regime. Ofcom expects to publish the register of categorised services by summer 2025 and, where relevant, to provide guidance on the additional duties for categorised services by early 2026.
Data (Use and Access) Bill
Meanwhile, the Data (Use and Access) Bill had its third reading in February 2025 and is expected to receive Royal Assent later this year. Among other things, the Bill makes provision for independent research into online safety matters and for new offences concerning the creation and solicitation of purported intimate images.
In an attempt to tie up loose ends, members of the House of Lords tabled some final amendments, including:
- stronger protections for children’s data and more stringent data processing duties on platforms providing services likely to be accessed by children;
- making it an offence to create, or solicit another to create, a purported intimate image of an adult without their consent;
- removing the ‘reasonable excuse’ defence for a defendant who has intentionally created, or solicited another to create, an intimate image of a victim without their consent; and
- giving courts the power to imprison defendants convicted of such offences.
This area of the Bill will significantly change the consequences of creating and sharing such content online, while the Bill as a whole aims to unlock the secure and effective use of data in the public interest and for public safety.
What next?
The Netflix series illustrates how impressionable adolescents can fall victim to online content that lures them by appealing to their vulnerabilities and insecurities. One way to change this is through digital media literacy; another is through legislation. While the OSA has come into force, its long-term effects, Ofcom’s appetite to enforce against non-compliant platforms, and wider attitudes towards those engaging in illegal behaviour online all remain to be seen. In the meantime, the prominence given to Adolescence in the media – and in Number 10 – is hopefully a sign that as a society we are ready to address these issues and find solutions that will support the next generation.