UK Enforces Online Safety Act, Introducing Tough Regulations for Tech Giants

Key Points

  • The U.K.’s Online Safety Act officially took effect, imposing strict regulations on tech companies.
  • Ofcom set a March 2025 deadline for platforms to complete risk assessments and implement safety measures.
  • Violators face fines of up to 10% of global revenue, service blocks, and potential jail time for executives.
  • Additional updates, including AI tools to combat harmful content, will follow in 2025. The act aims to align online safety with existing offline legal protections.

The United Kingdom officially enforced its comprehensive Online Safety Act on Monday, introducing rigorous measures to tackle harmful online content and imposing substantial penalties on tech companies such as Meta, Google, and TikTok. Ofcom, the nation’s media and telecommunications regulator, published its inaugural codes of practice and guidance, specifying steps platforms must take to address illegal activities, including terrorism, hate speech, fraud, and child sexual abuse.

The law mandates a “duty of care” for tech firms, holding them responsible for preventing and managing harmful content shared on their platforms. While the legislation was passed in October 2023, the enforcement of safety duties only began this week. Ofcom has given platforms until March 16, 2025, to complete risk assessments and implement necessary changes, such as improving content moderation, enabling user-friendly reporting tools, and incorporating built-in safety measures.

Melanie Dawes, Ofcom’s Chief Executive, emphasized the need for platforms to meet the strict safety standards outlined in the initial codes and hinted that further requirements will be introduced next year. Companies failing to comply face severe consequences, including fines of up to 10% of their global annual revenue. Repeat offenders may see their services blocked in the U.K. or lose access to payment and advertising systems. Individual senior executives could even face jail time for persistent violations.

The act also requires high-risk platforms to deploy hash-matching technology to detect and remove child sexual abuse material (CSAM). This system uses digital fingerprints linked to known CSAM from police databases, enabling automated filtering to identify and remove harmful content effectively.
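The hash-matching approach described above can be sketched in outline: compute a digital fingerprint of uploaded content and check it against a set of fingerprints of known illegal material. This is a minimal illustration only; the database contents, function names, and placeholder hash value below are assumptions, not part of any real system.

```python
import hashlib

# Hypothetical stand-in for a database of fingerprints of known illegal
# content (the act references police-maintained databases). The value
# here is a placeholder, not a real fingerprint.
KNOWN_HASHES = {"a" * 64}

def fingerprint(data: bytes) -> str:
    """Compute a SHA-256 digest of uploaded content as a hex string."""
    return hashlib.sha256(data).hexdigest()

def should_block(upload: bytes, known_hashes: set[str]) -> bool:
    """Return True if the upload's fingerprint matches a known one."""
    return fingerprint(upload) in known_hashes

# An upload whose fingerprint is not in the database passes through.
print(should_block(b"harmless cat photo", KNOWN_HASHES))
```

In practice, production systems use perceptual hashes (such as PhotoDNA) rather than exact cryptographic digests, so that matches survive resizing and re-encoding; the cryptographic hash above is used only to keep the sketch self-contained.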

Future updates to the law are expected, including provisions for artificial intelligence to combat illegal content and measures to block accounts responsible for sharing CSAM. British Technology Minister Peter Kyle stated the act bridges the gap between offline and online safety, urging platforms to take proactive measures. He assured the public that Ofcom would use its full authority to ensure compliance, including issuing fines and seeking court orders to block non-compliant sites.

EDITORIAL TEAM
The TechGolly editorial team is led by Al Mahmud Al Mamun, who worked as Editor-in-Chief at a world-leading professional research magazine. Rasel Hossain and Enamul Kabir serve as Managing Editors. Our team collaborates with technologists, researchers, and technology writers, and has substantial knowledge and background in Information Technology (IT), Artificial Intelligence (AI), and Embedded Technology.
