A couple of weeks ago Ofcom issued a fine of over £1m under the Online Safety Act 2023 (“OSA”) – the biggest penalty issued under the Act since it came into force. This comes at an opportune time, as we look back at the development of online safety rules through the course of 2025. This was the year the OSA moved from statute to sustained regulatory action: key legal triggers came into force, Ofcom issued the first codes and guidance for illegal harms and children’s safety, and the regulator rapidly shifted to enforcement – opening multiple investigations and issuing fines under its new powers. In this article, we summarise the principal milestones for the OSA which affect online platforms in the UK, particularly those which host user-generated content. We also explain some of the key developments relating to UK online safety which are on the horizon for 2026.
Looking Back…
26th October 2023 – The OSA Receives Royal Assent
The OSA increases protection for users of online platforms in the UK (particularly social media platforms, gaming platforms and other websites and apps that host user-generated content) and gives Ofcom the power to enforce these rules. The Act received Royal Assent on 26th October 2023 following various consultations and amendments. However, the actual roll out of the OSA has been gradual, and most of the provisions didn’t take effect until this year.
Implementation of the OSA requires Ofcom to develop guidance and codes of practice setting out how online platforms must meet their duties under the OSA. On that basis, Ofcom set out a phased roadmap to implementation as follows:
- Phase 1 – Development of Illegal Content Duties for In-Scope Services
- Phase 2 – Development of Harmful Content to Children Duties for In-Scope Services
- Phase 3 – Development of Additional Requirements for Certain Categorised Services
More information on the OSA and its implementation phases is set out in our article here.
It is important to note that the OSA rules don’t just apply to UK companies; they also apply to companies with links to the UK (meaning platforms with a “significant number” of UK users or platforms with UK users as a target market/audience). The OSA (and this article) is therefore international in scope.
17th March 2025 – Phase 1 Roll Out of the Online Safety Act
17th March 2025 was a major milestone for the OSA, and the date when Phase 1 of the OSA roll-out roadmap was completed. From this date, online platforms became legally required to protect their users from illegal harm, and by this date they should have undertaken a risk assessment of their platforms for any illegal content* (the deadline for which was 16th March).
*Illegal content is defined in the OSA as “content that amounts to a relevant offence”, with “relevant offences” including offences such as terrorism, harassment, CSEA and fraud.
In addition, Ofcom’s illegal content codes of practice (which require providers to take the safety measures detailed therein or use other effective measures to protect users from illegal content and activity) came into force. There is a code specifically for user-to-user platforms and a separate code for platforms offering search services. The codes are accessible here and our separate (more detailed) article on these is here.
In terms of practical enforcement, since this date Ofcom has had the power to impose penalties of up to £18m or 10% of qualifying worldwide revenue (whichever is greater) against platforms that fail to protect their users from illegal harms. Ofcom also has the ability (in very serious cases) to obtain a court order for “business interruption measures”, such as requiring payment providers or advertisers to withdraw their services or requiring ISPs to block a provider’s services in the UK.
25th July 2025 – Phase 2 Roll Out of the Online Safety Act
Phase 2 was also completed this year, with the first protection of children code of practice coming into force on 25th July. From this date, platforms are legally required to comply with the children’s safety duties under the OSA. This means implementing the safety measures set out in the codes or other equally effective measures to protect children from content that is harmful to them.
By this date, in-scope providers should have undertaken an assessment of the likelihood of their service being accessed by children and, where applicable, a risk assessment for assessing the risk of their platforms containing content that is harmful to children (the deadline for this was 24th July 2025). They now need to ensure that services posing risks provide appropriate barriers to such content. To that end, a key introduction of the code is the requirement to provide “highly effective age assurance” on platforms which do not prohibit one or more kinds of “primary priority content” (defined under Section 61 of the OSA as content that “encourages, promotes or provides instruction for” acts such as suicide, injury or self-harm, eating disorders, and content that is pornographic). Such tools generally take the form of facial age estimation or ID verification checks.
In addition, services likely to be accessed by children are now required to put in place measures to protect children from encountering content that is abusive or incites hatred, bullying content, violent content, and content which encourages, promotes, or provides instructions for dangerous stunts and challenges, and self-administering harmful substances.
In-scope service providers are also now required to apply appropriate content moderation to protect children from harm through platform design and put in place complaints and reporting systems for prohibited content.
The roll out of Phase 2 pushes the scope of the OSA beyond the illegal harms targeted within Phase 1, ensuring that in-scope service providers have more robust protections in respect of children – who will be protected not only from illegal content but also from legal but harmful content, such as content classified as violent, bullying, or misogynistic. This aligns with the original intentions of the OSA, and stems from various campaigns by parents and charities concerned about children’s access to such content in this new online era.
8th October 2025 – QWR Regulations Introduced
One big question which arose when the concept of the OSA was first floated was: how will the UK pay for the regulator? This very question was addressed head-on in the OSA, which requires that fees (“Industry Fees”) be imposed on certain providers of certain in-scope services to cover Ofcom’s operating costs for the online safety regime. The providers who need to pay the fees are those who have Qualifying Worldwide Revenue (“QWR”) above a certain threshold (as set by the Secretary of State, following advice from Ofcom).
QWR is also relevant to the penalties and maximum fines which Ofcom can issue under the OSA, so the relevant QWR thresholds have been much anticipated and contribute to the real-life effects of OSA enforcement.
On 8th October, the Online Safety Act 2023 (Qualifying Worldwide Revenue) Regulations 2025 were introduced to clarify how QWR is determined.
25th November 2025 – Ofcom Issues Guidance on Online Safety for Women and Girls
As part of the Phase 2 roll out, further guidance was recently issued by Ofcom on how service providers can create safer online experiences for women and girls – with requirements for in-scope service providers to take on more responsibility to improve the safety of, and prevent harm to, women and girls online. These heightened protections for women and girls seek to tackle gender-based harms present in the digital world, which disproportionately impact women and girls. The Ofcom guidance sets out nine areas where in-scope providers can improve the online safety of women and girls.
4th December 2025 – Ofcom Issues Fine for Failing to Have Robust Age Checks under OSA
Earlier this month Ofcom issued a fine of £1m to AVS Group Ltd for not having robust age checks on its various adult websites in line with OSA requirements, with a further £50k fine for failing to respond to Ofcom’s legally binding information requests. Critics argue that the value of the fine is still too low in the context of the breach, while others dismiss financial penalties as ineffective on the basis that reportedly no OSA fine has ever been paid to date. For example, a recent fine imposed on online forum 4Chan has reportedly still not been paid (and is therefore racking up additional daily penalties). It is worth noting, however, that regardless of any fines or Ofcom penalties, the new OSA rules are already reportedly impacting consumer behaviour.
These recent developments provide some insight into how the OSA rules will be enforced in practice, and how effective they are. And we should hopefully learn more about the effectiveness of the OSA’s age assurance rules once Ofcom issues its Age Assurance Statutory Report (expected next summer). The recent fine also provides a stark reminder of the cost of non-compliance – even at this early stage of OSA implementation.
Looking Forward…
February 2026 – Super-Complaints Regime
We expect to see the Ofcom super-complaints regime come into force early next year, which will enable eligible entities to bring systemic online safety issues to Ofcom’s attention. Ofcom is currently working on the final guidance for this regime, which it expects to publish in February 2026. The consumer industry is no stranger to super-complaints regimes; the CMA and FCA both currently operate similar regimes. Introducing such a regime in the online safety context could potentially accelerate regulatory action on pervasive harms.
Spring 2026 – Ofcom’s Release of Media Literacy Recommendations
We also expect Ofcom to issue its recommendations to in-scope providers next year on how they can promote media literacy. These recommendations will build on Ofcom’s consultation earlier this year.
Summer 2026 – Implementation of OSA Phase 3
While Ofcom has issued some guidance and consultations to progress roll out of Phase 3, the additional requirements and guidance which will apply to the largest service providers (including the final Register of Categorised Services and corresponding duties) are not due until Summer 2026 at the earliest.
The roll-out of the Register of Categorised Services will place providers into categories (Categories 1, 2A, and 2B) relating to the level of risk their platforms pose to users. Category 1 providers will have additional, more demanding duties, including providing user empowerment features and user identification verification. Providers who fall within Categories 2A and 2B will only be required to comply with a sub-set of these duties. All categorised services will be subject to transparency reporting and disclosure requirements about the use of their service by a deceased child user.
The distinction between categories of platforms in Phase 3 will increase duties for many service providers, and in-scope providers should be prepared for this. Because of the impact that categorisation can have on providers, the Register of Categorised Services has not been without controversy. It was originally expected to be released in Summer 2025, but was delayed as a result of the first judicial review proceedings related to the OSA – namely the Wikimedia Foundation case. Wikimedia claimed that the proposed categorisation rules were flawed and incorrectly classified Wikimedia as a Category 1 service, meaning it would need to comply with the most stringent rules (including verifying the identity of its contributors). Wikimedia argued this was not appropriate for the type of service it provides and would undermine the privacy and security of its contributors. The court ruled against Wikimedia in August; however, it was noted in the judgment that although Wikimedia failed to prove the legal flaws in the categorisation process, the new rules do not give Ofcom and the government the “green light to implement a regime that significantly impedes” the operation of the relevant service. Whilst this was the first judicial review of the OSA, we very much doubt it will be the last.
Autumn 2026 and Beyond – More Reports, Guidance, and Consultation
Given online safety is such a politicised and fast-paced area, we expect to see much more guidance, reports, and consultations moving forward. Ofcom has made clear within their implementation timeline that we should expect further developments in the second half of next year – including Ofcom’s statutory report on Content Harmful to Children.
Conclusions
Against the backdrop of strengthened consumer protection law more generally (see our previous round-up article here), together with the OSA’s increased online protections, 2025 marked a clear shift towards a more stringent consumer protection regime in the UK. Beyond the scope of the OSA, there is an overall push towards more protections for the public online. The UK’s recent implementation of the DMCC and the DUAA, and the progress on the Cyber Security and Resilience Bill, highlight policymakers’ growing strategy to ensure that individuals are safe from both online content and the infrastructure that hosts it.
Ofcom has not been slow to start using its new online safety enforcement powers. Despite the OSA only really coming into effect in 2025, as of October 2025 it had already launched five enforcement programmes and opened 21 investigations. Its most recent fines were for a failure to comply with OSA requirements regarding age checks and for failing to respond to Ofcom’s legally binding information requests, demonstrating that Ofcom will use its powers for procedural breaches (which carry the same level of financial risk) as well as substantive safety failures. We can expect enforcement to intensify throughout 2026: while 2025 only saw enforcement in relation to egregious breaches, the net is likely to be cast wider in 2026.
Our recommendation to in-scope businesses for 2026 is to proactively audit their online safety compliance, keeping in mind Ofcom’s evolving and extensive guidance. Businesses that offer a search engine or enable UGC on their apps/websites should ensure they have completed comprehensive risk assessments, implemented mitigation measures to address the relevant risks, and continue to monitor for further guidance and developments from Ofcom. In-scope services – even if not categorised – should be ready for risk assessment audits and requests for information from Ofcom.
Article drafted with contributions from our trainee Derry Moore.