https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/time-for-tech-firms-to-act-uk-online-safety-regulation-comes-into-force/
Time for tech firms to act: UK online safety regulation comes into force

Online safety | Illegal and harmful content | News and updates

Published: 16 December 2024

* First codes of practice and guidance published, firing the starting gun on new duties for tech firms
* Providers have three months to complete illegal harms risk assessments
* Ofcom sets out more than 40 safety measures for platforms to introduce from March

People in the UK will be better protected from illegal harms online, as tech firms are now legally required to start taking action to tackle criminal activity on their platforms and make them safer by design.

Ofcom has today, four months ahead of the statutory deadline[1], published its first-edition codes of practice and guidance on tackling illegal harms - such as terror, hate, fraud, child sexual abuse and assisting or encouraging suicide[2] - under the UK's Online Safety Act.

The Act places new safety duties on social media firms, search engines, messaging, gaming and dating apps, and pornography and file-sharing sites.[3] Before we can enforce these duties, we are required to produce codes of practice and industry guidance to help firms comply, following a period of public consultation.
Bold, evidence-based regulation

We have consulted carefully and widely to inform our final decisions, listening to civil society, charities and campaigners, parents and children, the tech industry, and expert bodies and law enforcement agencies, with over 200 responses submitted to our consultation.

As an evidence-based regulator, we have carefully considered every response, alongside cutting-edge research and analysis, and we have strengthened some areas of the codes since our initial consultation. The result is a set of measures - many of which are not currently used by the largest and riskiest platforms - that will significantly improve safety for all users, especially children.

What regulation will deliver

Today's illegal harms codes and guidance mark a major milestone in creating a safer life online, firing the starting gun on the first set of duties for tech companies. Every site and app in scope of the new laws has from today until 16 March 2025 to complete an assessment to understand the risks illegal content poses to children and adults on their platform.

Subject to our codes completing the Parliamentary process by this date, sites and apps will then need to start implementing safety measures from 17 March 2025 to mitigate those risks, and our codes set out measures they can take. Some of these measures apply to all sites and apps, and others to larger or riskier platforms.

The most important changes we expect our codes and guidance to deliver include:

* Senior accountability for safety. To ensure strict accountability, each provider should name a senior person accountable to their most senior governance body for compliance with their illegal content, reporting and complaints duties.

* Better moderation, easier reporting and built-in safety tests.
Tech firms will need to make sure their moderation teams are appropriately resourced and trained, and are set robust performance targets, so they can remove illegal material - such as illegal suicide content - quickly when they become aware of it. Reporting and complaints functions will be easier to find and use, with appropriate action taken in response. Relevant providers will also need to improve the testing of their algorithms to make illegal content harder to disseminate.

* Protecting children from sexual abuse and exploitation online. While developing our codes and guidance, we heard from thousands of children and parents about their online experiences, as well as from professionals who work with them. New research, published today, highlights children's experiences of sexualised messages online[4], as well as teenage children's views on our proposed safety measures aimed at preventing adult predators from grooming and sexually abusing children.[5] Many young people we spoke to felt that interactions with strangers, including adults or users perceived to be adults, are currently an inevitable part of being online, and they described becoming 'desensitised' to receiving sexualised messages.

Taking these unique insights into account, our final measures are explicitly designed to tackle pathways to online grooming. This will mean that, by default, on platforms where users connect with each other, children's profiles and locations - as well as their friends and connections - should not be visible to other users, and non-connected accounts should not be able to send them direct messages. Children should also receive information to help them make informed decisions about the risks of sharing personal information, and they should not appear in lists of people users might wish to add to their network.

Our codes also expect high-risk providers to use automated tools called hash-matching and URL detection to detect child sexual abuse material (CSAM).
These tools allow platforms to identify large volumes of illegal content more quickly, and are critical in disrupting offenders and preventing the spread of this seriously harmful content. In response to feedback, we have expanded the scope of our CSAM hash-matching measure to capture smaller file hosting and file storage services, which are at particularly high risk of being used to distribute CSAM.

* Protecting women and girls. Women and girls are disproportionately affected by online harms. Under our measures, users will be able to block and mute others who are harassing or stalking them. Sites and apps must also take down non-consensual intimate images (or "revenge porn") when they become aware of them. Following feedback to our consultation, we have also provided specific guidance on how providers can identify and remove posts by organised criminals who are coercing women into prostitution against their will. Similarly, we have strengthened our guidance to make it easier for platforms to identify illegal intimate image abuse and cyberflashing.

* Identifying fraud. Sites and apps are expected to establish a dedicated reporting channel for organisations with fraud expertise, allowing them to flag known scams to platforms in real time so that action can be taken. In response to feedback, we have expanded the list of trusted flaggers.

* Removal of terrorist accounts. It is very likely that posts generated, shared or uploaded via accounts operated on behalf of terrorist organisations proscribed by the UK government will amount to an offence. We expect sites and apps to remove users and accounts in this category to combat the spread of terrorist content.

Ready to use full extent of our enforcement powers

We have already been speaking to many tech firms - including some of the largest platforms as well as smaller ones - about what they do now and what they will need to do next year.
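The hash-matching approach described above can be sketched in a few lines. This is an illustrative toy, not Ofcom's specification: production systems use perceptual hashes (such as PhotoDNA) matched against databases maintained by expert bodies, so that near-duplicates of an image still match; the hash list and function names here are hypothetical.

```python
import hashlib

# Hypothetical database of known-illegal content hashes. Real deployments
# would source these from an expert body and use perceptual hashing rather
# than exact SHA-256, which only catches byte-identical copies.
KNOWN_HASHES = {
    # SHA-256 of the bytes b"test", standing in for a real database entry
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_content(file_bytes: bytes) -> bool:
    """Return True if the file's hash appears in the known-content list."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HASHES
```

An upload pipeline would run such a check before making a file publicly available, and route matches to human review and reporting rather than silently dropping them.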
While we will offer support to providers to help them comply with these new duties, we are gearing up to take early enforcement action against any platforms that ultimately fall short. We have the power to fine companies up to £18m or 10% of their qualifying worldwide revenue - whichever is greater - and in very serious cases we can apply for a court order to block a site in the UK.

Dame Melanie Dawes, Ofcom's Chief Executive, said:

"For too long, sites and apps have been unregulated, unaccountable and unwilling to prioritise people's safety over profits. That changes from today.

"The safety spotlight is now firmly on tech firms and it's time for them to act. We'll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year. Those that come up short can expect Ofcom to use the full extent of our enforcement powers against them."

This is just the beginning

This first set of codes and guidance, which sets up the enforceable regime, is a firm foundation on which to build. In light of the helpful responses we received to our consultation, we are already working towards an additional consultation on further codes measures in Spring 2025. This will include proposals in the following areas:

* blocking the accounts of those found to have shared CSAM;
* use of AI to tackle illegal harms, including CSAM;
* use of hash-matching to prevent the sharing of non-consensual intimate imagery and terrorist content; and
* crisis response protocols for emergency events (such as last summer's riots).
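The maximum penalty mentioned above - the greater of £18m or 10% of qualifying worldwide revenue - is simple arithmetic, sketched below for illustration. The function name is ours, and how "qualifying worldwide revenue" is calculated is defined by the regime, not by this snippet.

```python
def max_penalty_gbp(qualifying_worldwide_revenue_gbp: float) -> float:
    """Upper bound on a fine: the greater of a fixed £18m floor
    or 10% of qualifying worldwide revenue (illustrative only)."""
    return max(18_000_000.0, 0.10 * qualifying_worldwide_revenue_gbp)
```

For a firm with £500m of qualifying revenue the 10% limb applies (£50m); for a firm with £100m of qualifying revenue the £18m floor applies.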
And today's codes and guidance are part of a much wider package of protections - 2025 will be a year of change, with more consultations and duties coming into force, including:

* January 2025: final age assurance guidance for publishers of pornographic material, and children's access assessments;
* February 2025: draft guidance on protecting women and girls; and
* April 2025: additional protections for children from harmful content promoting, among other things, suicide, self-harm, eating disorders and cyberbullying.

Technology Notices consultation

The Act also enables Ofcom, where we decide it is necessary and proportionate, to make a provider use (or in some cases develop) a specific technology to tackle child sexual abuse or terrorism content on their sites and apps. We are consulting today on parts of the framework that will underpin this power.

Any technology we require a provider to use will need to be accredited - either by Ofcom or someone appointed by us - against minimum standards of accuracy set by Government, after advice from Ofcom. We are consulting on what these standards should be, to help inform our advice to Government. We are also consulting on our draft guidance about how we propose to use this power, including the factors we would consider and the procedure we would follow. The deadline for responses is 10 March 2025.

ENDS

NOTES TO EDITORS

1. UK Parliament set Ofcom a deadline of 18 months after the Online Safety Act was passed, on 26 October 2023, to finalise its illegal harms and children's safety codes of practice and guidance.

2. The Online Safety Act lists over 130 'priority offences', and tech firms must assess and mitigate the risk of these occurring on their platforms.
The priority offences can be split into the following categories:

+ Terrorism
+ Harassment, stalking, threats and abuse offences
+ Coercive and controlling behaviour
+ Hate offences
+ Intimate image abuse
+ Extreme pornography
+ Child sexual exploitation and abuse
+ Sexual exploitation of adults
+ Unlawful immigration
+ Human trafficking
+ Fraud and financial offences
+ Proceeds of crime
+ Assisting or encouraging suicide
+ Drugs and psychoactive substances
+ Weapons offences (knives, firearms, and other weapons)
+ Foreign interference
+ Animal welfare

3. Information on which types of platforms are in scope of the Act can be found here.

4. Research was conducted by Ipsos UK between June 2023 and March 2024 and consisted of: 11 in-depth interviews with children and young adults (aged 14-24) with experience of sexualised messages online; 1 interview with parents of a child who had experienced online grooming; and 9 in-depth interviews with professionals working with children and young adults who have experienced receiving these messages online.

5. We commissioned Praesidio Safeguarding to run deliberative workshops in schools with 77 children aged 13-17.
© Ofcom 2024