Introduced to tackle growing concerns over the safety of internet users – particularly children and vulnerable groups – the Online Safety Act (OSA) marks a significant shift in the regulatory landscape for businesses operating online platforms in the UK.
Passed in October 2023 and now being enforced in phases, the Act has introduced a wide range of new obligations, imposing stricter requirements for transparency, age verification and content moderation to create a safer online environment.
Under the Act, businesses operating online must now ensure transparency by regularly publishing their safety measures and reporting on their efforts to regulators. This means not only creating new policies where needed, but also providing evidence that these policies effectively mitigate risks associated with harmful content. The Act places specific emphasis on platforms accessed by children, requiring additional safeguards and age-appropriate design features.
To comply with these new regulations, digital platforms will be required to implement more stringent risk mitigation policies and to collaborate with Ofcom, the UK’s communications regulator, which will oversee the implementation of the Act and enforce penalties for non-compliance. Businesses must also maintain detailed compliance records, continuously updating and improving their safety measures to keep pace with evolving risks.
Effective Age Verification and Safeguards for Children
One of the most critical elements of the OSA is the protection of children and young people when they access the internet. From 2025, online platforms accessible to minors will be required to implement age checks that accurately determine whether users are children.