Someone Shared My Photo Without Permission — What Can I Do?
If This Just Happened to You, Start Here
Having a photo of you shared online without your permission is frightening, humiliating, and urgent. Before anything else — take a breath, and know this: you have real legal rights, there are organisations specifically set up to help you, and the UK law on this has never been stronger than it is right now in 2026.
This guide tells you exactly what those rights are, who to contact, and what steps to take — in order, starting with the fastest actions first.
If the image is intimate or sexual in nature, jump straight to the Revenge Porn Helpline: 0345 6000 459, available Monday to Friday 10am–4pm. They remove this type of content from the internet free of charge and have a roughly 90% success rate.
For every other situation — non-intimate photos shared without consent, images used without permission, photos taken secretly — this guide covers your options fully.
What UK Law Says About Sharing Photos Without Permission in 2026

The legal landscape in the UK has changed dramatically in the past two years. The Online Safety Act 2023 transformed the law on image-based abuse, giving you stronger protections than ever before.
Here is what the law currently says, clearly.
Intimate Images — Criminal Offences
As of January 2024, the law created several distinct offences, now in full force in 2025 and 2026: a base offence of sharing intimate images without consent, carrying a maximum of six months' imprisonment; sharing intimate images with intent to cause alarm, distress or humiliation, carrying up to two years' imprisonment; and sharing for sexual gratification, also carrying up to two years' imprisonment.
Under the Online Safety Act 2023 and the Sexual Offences Act 2003, it is a criminal offence to share, or even threaten to share, intimate photos or films of another person without their permission.
What counts as “intimate” under the law? Photos or films are covered when they show a person engaged in, or present during, a sexual act, or show genitals, buttocks or breasts — exposed or covered only by underwear. This includes digitally altered images, such as those edited with Photoshop.
A critical update from February 2026: As of 6 February 2026, the Data (Use and Access) Act 2025 now targets the source of the harm. Section 138 of the act creates offences for both creating and requesting the creation of purported intimate images of adults without their consent. This means AI-generated deepfakes of you are now explicitly criminalised — not just the sharing of real photographs.
Sharing AI-generated intimate images — deepfakes — without consent has already been illegal since 31 January 2024; the February 2026 change extends the law to creating them and requesting their creation.
The New 48-Hour Takedown Rule
On 19 February 2026, the government announced an amendment to the Crime and Policing Bill, currently progressing through Parliament, that will require regulated service providers to take down non-consensual intimate images within 48 hours of being notified of them.
The creation or sharing of non-consensual intimate images will be designated a priority offence under the Online Safety Act, placing it in the same high-harm category as offences such as terrorism and child sexual exploitation.
This is a major shift. Once this amendment passes — which is expected shortly — platforms that fail to remove NCII within 48 hours of notification face fines of up to 10% of their worldwide annual turnover.
Non-Intimate Photos
For photos that are not intimate but were shared without your permission, the legal position depends on what was shared and how.
If sharing the photo is seen as harassment or an invasion of privacy, it could be illegal under UK laws such as the Protection from Harassment Act 1997 or the Human Rights Act 1998, which protect the right to respect for private and family life.
Copyright in a photograph usually belongs to the person who took the picture — not the person in it — so you often cannot rely on copyright yourself. But unauthorised use, especially for commercial gain, can still trigger claims for misuse of private information or breach of data protection.
If someone is using a photo of you commercially — in advertising, on a business website, to sell products — without your consent, you may have additional claims under data protection law and privacy rights, separate from and beyond copyright.
Step-by-Step: What to Do Right Now
Work through these steps in order. Speed matters — the faster the content is reported, the higher the chance of removal.
Step 1 — Document Everything Before Reporting
Before you contact anyone, take screenshots of the image in its current location. Include the URL in the screenshot so the address is visible. Note the date and time. Save these somewhere safe — not just on the platform where the image appears.
This sounds counterintuitive when your instinct is to get the image down immediately. But without documentation, you have no evidence. Platforms sometimes remove content before you can show police what was shared. Screenshots preserve the record.
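If you are comfortable with a command line, keeping a timestamped log alongside your screenshots makes the record easier to present later. This is an optional sketch — the URL and log filename below are placeholders, not real values:

```shell
# Optional: append a timestamped entry to an evidence log for every URL
# you screenshot. The URL and filename here are placeholders.
url="https://example.com/offending-post"
stamp="$(date -u +"%Y-%m-%dT%H:%M:%SZ")"   # UTC, so the time is unambiguous
printf '%s\t%s\n' "$stamp" "$url" >> evidence-log.txt
```

Store the log file and screenshots together, somewhere other than the platform where the image appears.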
Step 2 — Report Directly to the Platform
Every regulated UK platform is now required under the Online Safety Act to have a reporting mechanism and to act on reports of illegal content. Use it.
On most platforms:
- Facebook and Instagram: Go to the image → tap the three dots → Report → It’s inappropriate → Nudity or sexual activity, or choose the relevant category
- X (Twitter): Click the three dots on the post → Report → Choose the relevant violation category
- Reddit: Click Report below the post → Community rules or Reddit rules → Choose the relevant category
- TikTok: Hold down the video → Report → choose the relevant category
- Google: If the image appears in Google Search results, use Google’s Remove outdated content tool or the Reverse image search removal request form
Under the Online Safety Act, once sharing such images is designated a priority offence, regulated platforms must remove the content when they become aware of it and take steps to prevent it from appearing in the first place. If a platform does not act promptly, it is legally exposed. You can escalate to Ofcom if the platform fails to respond.
Step 3 — Contact the Revenge Porn Helpline (for intimate images)
Phone: 0345 6000 459 · Monday to Friday, 10am–4pm Online: revengepornhelpline.org.uk
The Revenge Porn Helpline is a free, specialist service that works directly with platforms to remove non-consensual intimate images, with a removal success rate of roughly 90%. They understand the reporting systems of every major platform and have direct escalation routes that the general public does not have access to.
They handle all types of non-consensual intimate images — not just “revenge porn” from ex-partners. If your photo appeared on ChatPic, a ChatPic mirror site, or any other platform, and it is intimate in nature, they can help.
Step 4 — Report to the Internet Watch Foundation (IWF)
Website: iwf.org.uk/report
The IWF is the UK’s leading organisation for removing child sexual abuse material and non-consensual intimate images from the internet. Officials have signalled that non-consensual intimate images will be treated with severity comparable to child sexual abuse and terrorism content.
The IWF can act on content hosted anywhere in the world — including overseas sites outside UK jurisdiction. They work directly with hosting companies and internet service providers to remove content and block access to it from UK networks.
If the content involves someone under 18 in any way, report to the IWF immediately and also contact the NSPCC on 0808 800 5000.
Step 5 — Report to the Police
It is a crime to show intimate images or videos of you to another person, send them on, upload them to a website, or threaten to do any of this, without your consent.
Call 101 (non-emergency police line) or visit your local police station to make a report. Bring your documented screenshots. Ask to speak with an officer who has digital evidence training — many survivors describe police failing to gather digital evidence or apply the law properly, so it helps to be specific about what evidence you have and which offence has been committed.
Specifically, tell the officer this is a matter under:
- The Online Safety Act 2023, sharing intimate images without consent
- Or, if deepfakes are involved, the Data (Use and Access) Act 2025, Section 138
You can also report online via Action Fraud at actionfraud.police.uk if the image was shared as part of a broader pattern of harassment, fraud, or extortion.
Step 6 — Send a Formal Takedown Request
If someone has posted a photo of you that you do not want shared, you can ask them to take it down. If they refuse, consider speaking to a legal adviser.
A formal written takedown request — sent by email with read receipt, or by recorded letter — creates a paper trail. It demonstrates that the person sharing the image was formally notified and chose to continue. This strengthens any subsequent legal claim.
The request should state: who you are, which image is being shared, why sharing it without your consent is unlawful, and a specific deadline for removal (48 hours is reasonable given the new legal framework).
Step 7 — Contact a Solicitor if the Steps Above Fail
Victims may be able to demand the removal of unauthorised images via takedown requests and pursue compensation for emotional or reputational harm caused.
If the platform fails to remove the content, if the person sharing it refuses to stop, or if you have suffered identifiable harm — financial, professional, or psychological — a solicitor specialising in privacy and image rights can pursue civil action. You may be eligible for legal aid depending on your circumstances. The Law Society’s Find a Solicitor tool at lawsociety.org.uk is the best starting point for finding a qualified specialist.
Specific Situations — What Applies to You
If Your Photo Was on ChatPic or a ChatPic Mirror Site
ChatPic.org is now offline, which means the image is no longer accessible through that platform. If the image was intimate and appeared on the original ChatPic platform before its 2023 closure, the hosting is now gone — but copies may exist elsewhere if other users downloaded it.
If a copy appears on a ChatPic mirror site currently operating in 2026, report it to:
- The Revenge Porn Helpline, if the image is intimate
- The IWF, if it involves anyone under 18
- The Internet Service Provider hosting the mirror site — identified using a WHOIS lookup
- Action Fraud if the mirror site is also engaged in harassment or extortion
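Identifying the hosting provider means running a WHOIS lookup on the mirror site's domain and looking for the registrar's abuse contact — that is the address to send your report to. A sketch, using illustrative sample output rather than a live lookup (the domain and email below are made up):

```shell
# In practice you would run:  whois example-mirror-site.test | grep -i abuse
# The sample WHOIS output below is illustrative only.
sample_whois='Registrar: Example Registrar Ltd
Registrar Abuse Contact Email: abuse@example-registrar.test
Registrar Abuse Contact Phone: +1.5555550100'

# Pull out the abuse contact line from the WHOIS record.
printf '%s\n' "$sample_whois" | grep -i 'abuse contact email'
```

Free web-based WHOIS lookups work just as well if you prefer not to use a terminal.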
For non-intimate photos that appeared on ChatPic and have since been downloaded and re-shared by other users, the privacy-law route and formal takedown requests apply.
If the Photo Is Being Used Commercially Without Your Permission
In a commercial context, using a person's image without consent can lead to claims for infringement of privacy rights or misuse of personal data under the Data Protection Act 2018 and the UK GDPR.
Contact the business using your image directly first. Then contact the ICO (Information Commissioner’s Office) at ico.org.uk if the use involves your personal data. The ICO can investigate and impose fines — this route does not require you to fund legal action yourself.
If the Photo Was Taken in a Private Location Without Your Knowledge
Private spaces — including homes, private offices, or medical facilities — offer greater legal protections. Taking photos without consent in these environments may constitute an unlawful intrusion and could give rise to civil action.
This may also constitute a criminal offence: the Voyeurism (Offences) Act 2019 criminalised upskirting, and the voyeurism offence in the Sexual Offences Act 2003 covers covert photography of private acts. Report to the police immediately.
If You Are Under 18
Any intimate image involving a person under 18 is illegal to create, possess, or share, regardless of consent. Report to the IWF immediately at iwf.org.uk/report, to the NSPCC on 0808 800 5000, and to the police. You can also speak to Childline on 0800 1111.
What If the Site Is Overseas?
This is where many people give up — and where they should not.
The Online Safety Act applies to services even when the companies providing them are based outside the UK, as long as they have links to the UK: a significant number of UK users, the UK as a target market, or the service being accessible from the UK with a material risk of significant harm to UK users.
Failure to comply with the Online Safety Act is subject to the OSA penalty regime — up to 10% of worldwide turnover and a potential block on the service. Ofcom has the power to direct UK internet service providers to block access to non-compliant overseas platforms.
For content on platforms clearly outside any regulatory reach, the IWF can still pursue removal through its international partner network.
The Honest Reality About Enforcement
It would be dishonest to suggest the system works perfectly. Despite stronger legal protections, charging rates remain pitifully low. This is a relatively new and complex area of crime, and police often fail to investigate it properly or collect sufficient evidence.
With only 4% of reported cases currently resulting in a charge, many investigations are dropped due to insufficient digital evidence or procedural errors.
This is the honest picture. The law has improved dramatically. Enforcement has not kept pace with the law yet. The practical implication is that your fastest and most reliable route to removal is the Revenge Porn Helpline and direct platform reporting — not waiting for criminal prosecution.
The legal route through police and courts matters for holding perpetrators accountable and for pursuing compensation. But if your immediate priority is getting the image down, the specialist organisations above are faster and more effective than the formal legal process in most cases.
Frequently Asked Questions
Is it illegal for someone to share a photo of me without my permission in the UK?
It depends on the photo. Sharing intimate images or videos without the consent of the person depicted is a criminal offence. For non-intimate photos, the legal position depends on context — harassment, commercial misuse, and privacy invasion each have different legal routes. If you are unsure, contact Citizens Advice or a solicitor.
Does the photo have to be sexual to be covered by the law?
For criminal law purposes, the strongest protections apply to intimate images. But non-intimate photos shared to harass, humiliate, or harm you are covered under harassment law and data protection law. If sharing the photo can be seen as harassment, it could be illegal under the Protection from Harassment Act 1997.
What if I originally consented to the photo being taken but not to it being shared?
The crucial factor remains consent. Sharing intimate images without permission constitutes a criminal offence regardless of whether they were initially taken consensually or if the relationship status has changed. Consenting to a photo being taken is not the same as consenting to it being shared. The law is explicit on this.
Can I get compensation if someone shared my photo without permission?
Victims may be able to demand the removal of unauthorised images via takedown requests and pursue compensation for emotional or reputational harm caused. A solicitor specialising in privacy law can advise on the merits of a civil claim based on your specific circumstances.
What is the Revenge Porn Helpline, and is it really free?
The Revenge Porn Helpline is a UK charity providing free specialist support for victims of non-consensual intimate image sharing. They work directly with platforms to remove content — succeeding in roughly 90% of cases — and support victims through the reporting process. Call 0345 6000 459, Monday to Friday 10am–4pm, or visit revengepornhelpline.org.uk.
What is a deepfake, and is it covered by UK law?
A deepfake is an AI-generated image that realistically depicts a real person in a situation that never happened — often intimate in nature. As of 6 February 2026, the Data (Use and Access) Act 2025 makes creating or requesting the creation of purported intimate images of an adult without consent a criminal offence. Creating a deepfake of you without consent is now explicitly illegal in the UK, as is sharing it.
Key Contacts — Save These Now
| Organisation | What they do | Contact |
|---|---|---|
| Revenge Porn Helpline | Free removal of intimate images | 0345 6000 459 · revengepornhelpline.org.uk |
| Internet Watch Foundation | Removes illegal content from the internet | iwf.org.uk/report |
| Action Fraud | Report image-based fraud or extortion | actionfraud.police.uk · 0300 123 2040 |
| Ofcom | Escalate if platforms fail to act | ofcom.org.uk/online-safety |
| NSPCC | If anyone under 18 is involved | 0808 800 5000 |
| Citizens Advice | Free legal guidance on your options | citizensadvice.org.uk |
| ICO | Report commercial misuse of your image | ico.org.uk |
| Police | Criminal report | 101 · or 999 in immediate danger |
This article provides general information about UK law as of March 2026. It does not constitute legal advice. If you need advice specific to your situation, contact a qualified solicitor or Citizens Advice.
