The UK government has dramatically escalated its fight against non-consensual intimate imagery online. On 10 April 2026, ministers tabled an amendment to the Crime and Policing Bill that would make senior technology executives personally liable — including facing imprisonment — if their platforms fail to remove intimate images shared without consent.
For small business owners who operate websites, forums, community platforms or any service that accepts user-generated content, this is a wake-up call. The days of treating content moderation as someone else's problem are over.

What Has Actually Changed?
Back in February 2026, the government announced that platforms would be legally required to remove non-consensual intimate images within 48 hours of being flagged. The penalties at that stage were fines of up to 10% of global revenue, or having services blocked in the UK.
Friday's amendment goes significantly further. Senior executives can now be held personally liable if their platform fails to comply with enforcement decisions from Ofcom, the UK's communications regulator. The penalty? Imprisonment, a fine, or both — unless the executive can demonstrate a reasonable excuse.
This sits alongside a broader package of measures already introduced in 2026, including making it a criminal offence to create non-consensual intimate images (even if never shared), and criminalising the supply of so-called nudification tools — AI-powered applications designed to generate fake explicit imagery of real people.
Why This Matters to Small Businesses
You might think this only applies to the likes of X, Meta or TikTok. Think again. The Online Safety Act and its associated regulations apply to any service that allows users to post content, regardless of the size of the company behind it. If you run a forum, a community group, a review site, or even a business platform with a chat feature, you have obligations.
The Key Risks
- Personal liability: If you are a senior figure in a company that operates an online platform, you could be held personally responsible for failures to act on reported content.
- 48-hour removal window: Once non-consensual intimate content is flagged to you, the clock starts ticking. You must have processes in place to respond swiftly.
- Reputational damage: Beyond the legal consequences, hosting such content — even briefly — can destroy the trust your customers place in your brand.
- Financial penalties: Fines can reach up to 10% of qualifying worldwide revenue, which for a growing SMB could be existential.
The Grok Scandal: A Cautionary Tale
This legislative push was triggered by the Grok AI scandal in late 2025 and early 2026. Elon Musk's AI chatbot, integrated into the X platform, was used to generate millions of fabricated explicit images of real women and children. The Internet Watch Foundation reported criminal imagery involving children as young as 11.
The fallout was enormous. Multiple governments condemned the practice, Ofcom launched a formal investigation, and the UK government declared the situation a national emergency. The image-generation feature was eventually restricted, but not before significant harm had been done.
The lesson for smaller businesses is stark: if a company with the resources of X can get caught out, a small firm without dedicated content moderation teams is even more vulnerable.
What You Should Do Now
Whether you run a small e-commerce site with product reviews or a community platform, here are the practical steps you should take immediately:
- Review your content moderation policies. Do you have a clear, documented process for handling reports of harmful content? If not, create one today. Our resources section has templates to help.
- Implement a reporting mechanism. Users must have a straightforward way to flag harmful content. A simple contact form is not sufficient — you need a dedicated abuse reporting pathway.
- Set up response timelines. You have 48 hours from the point content is reported. Build internal escalation procedures that can meet this deadline, including out-of-hours cover.
- Train your team. Everyone who handles user content or customer complaints needs to understand what non-consensual intimate imagery is and what to do when they encounter it.
- Audit your AI tools. If you use any AI-powered features — image generation, chatbots, content creation — ensure they cannot be used to produce intimate imagery. Check with your technology providers about their safeguards.
- Document everything. In the event of an Ofcom investigation, being able to demonstrate that you acted promptly and in good faith is your best defence. Keep logs of all reports and actions taken.
The Bigger Picture: AI Regulation Is Accelerating
This amendment is part of a much wider regulatory trend. The UK government has committed to halving violence against women and girls within a decade, and online abuse is a central battleground. Alongside the intimate image provisions, the Crime and Policing Bill is also criminalising pornography depicting incest and adults roleplaying as children.
For small businesses, the direction of travel is clear: the regulatory burden around online content is only going to increase. Investing in proper content moderation and compliance infrastructure now is far cheaper than dealing with the consequences of getting it wrong later.
"The burden of tackling abuse must no longer fall on victims. It must fall on perpetrators — and on the companies that enable harm."
Those words from the Prime Minister should be pinned above every small business owner's desk. Whether you consider yourself a technology company or not, if your platform hosts user content, you are now squarely in scope.
Need Help Getting Compliant?
If you are unsure whether your business is affected or what steps you need to take, do not wait for an Ofcom enforcement notice to find out. Get ahead of the regulation and protect both your users and yourself.
Protect Your Business Today
Our compliance and content moderation packages are designed for UK small businesses. Get the policies, tools and training you need to meet your Online Safety Act obligations.
View pricing plans