PARLIAMENTARY DEBATE
Tackling Image-based Abuse - 12 November 2024 (Commons/Westminster Hall)
That this House has considered the matter of tackling image-based abuse.
It is a pleasure to serve under your chairship, Mr Vickers. I declare an interest as a member of the Women and Equalities Committee. I am bringing this motion before the House to maintain the steady pressure from campaigners and parliamentarians on an issue that is both urgent and often neglected: image-based sexual abuse, which is a form of violence that overwhelmingly affects women and girls.
Today, I aim to shed light on where our legislation on image-based sexual abuse is falling short and to propose three reforms that this Labour Government can deliver. This will build on the fine work conducted by Members across the House—including the Minister for safeguarding and violence against women and girls, my hon. Friend the Member for Birmingham Yardley (Jess Phillips); Madam Deputy Speaker, the right hon. Member for Romsey and Southampton North (Caroline Nokes); the hon. Member for Gosport (Dame Caroline Dinenage) and current and previous members of the Women and Equalities Committee—as well as Members of the other place.
Image-based sexual abuse encompasses a wide range of violations, from digitally altered images such as deepfakes to invasive acts such as upskirting, downblousing and so-called revenge porn. In an increasingly digital world, this abuse—this violence—is an escalating crisis.
Drawing on two powerful accounts that have profoundly shaped my own perspective, I will highlight the three glaring flaws that we must confront. The first is the failure to ensure the permanent removal of abusive content, which leaves survivors chained to their trauma. The second is the weak regulatory enforcement that allows platforms to shrug off their responsibilities. The third is the lack of civil remedies for survivors, a lifeline that we know to be critical to restoring dignity, control and hope.
I will not have the space today to discuss how we can prevent online violence against women and girls by embedding it into the relationships, sex and health education curriculum, to which the hon. Member for Strangford (Jim Shannon) alluded, or how proceeds from the digital services tax and Ofcom fines could sustainably fund lifesaving support services for victims. However, those issues loom large in the debate.
I am grateful that the Minister for victims, my hon. Friend the Member for Pontypridd (Alex Davies-Jones), is present. I look forward to hearing how tackling image-based abuse aligns with this Government’s unprecedented commitment to halving violence against women and girls. I also hope to hear from the Secretary of State for Science, Innovation and Technology on these issues. In the UK, we face an escalating crisis of image-based sexual abuse. Every week, new victims emerge and women and girls lose their right to control their most intimate images.
In 2023 alone, the Revenge Porn Helpline reported nearly 19,000 cases of abuse, a staggering increase from just 1,600 cases in 2019. Deepfake-related abuse has surged by 400% since 2017, with over 99% of these vile creations targeting women and girls. The numbers are shocking, but they are more than statistics. Behind each one is a life and a human story—another innocent person whose confidence, relationships and sense of safety are shattered. Survivors often describe their experience as digital rape, a term that captures the intensely personal and profoundly scarring nature of this violation.
Just two weeks ago, the escalating crisis hit home in my constituency of Bolton North East with the case of Hugh Nelson, who was sentenced at Bolton Crown court to 18 years in prison for creating and distributing depraved sexual images using artificial intelligence. Detective Chief Inspector Jen Tattersall of Greater Manchester police described Nelson as
“an extremely dangerous man who thought he could get away with what he was doing by using modern technology.”
Yet Nelson’s sentencing is something of an exception. Too many perpetrators remain beyond the reach of justice, shielded by gaps in our legal framework. This reality raises a question: has our response truly kept pace with the escalating scale of this crisis? Are we really doing all we can to support victims and survivors?
We must also do better to protect male victims who reach out to the Revenge Porn Helpline. It is time we prioritised victims. We must not let technology develop without the necessary safeguards to protect us all from harm. I was alarmed to hear last week that online platforms do not take images down while they are reviewing their harmfulness; that practice simply exacerbates the harm that victims face. It is vital that we ensure that image-based abuse does not get lost in the excitement of this Government’s new, packed legislative agenda. It is time that the legislation recognised adult non-consensual intimate images as illegal content, in the same way that abusive images of children already are. The Online Safety Act 2023—
I do not believe that in their 14 years the previous Government did anywhere near enough to tackle the issue. I can already see the Labour Government taking decisive steps to change the answer to the question of whether we are doing enough. I welcome the Government’s manifesto commitment to ban the creation of sexually explicit deepfakes, an essential step in safeguarding women and girls from malicious technology. I am encouraged by the collaborative work under way among the Department for Science, Innovation and Technology, the Home Office and the Ministry of Justice to identify a legislative vehicle to ensure that those who create these images without consent are held accountable. I am also pleased that new changes to the Online Safety Act will make image-based abuse a priority offence.
Although those are positive steps, they represent only modest progress. As experts such as End Violence Against Women and the #NotYourPorn campaign have pointed out, sharing intimate images without consent was already prioritised under the Online Safety Act. So far, the changes under this Government have been merely administrative and incremental. Having listened to survivors of image-based abuse, I urge the Minister to agree that this is no time for incremental change.
Georgia Harrison is a courageous campaigner who shared her story with the Women and Equalities Committee. Georgia’s images were distributed without her consent, leading to years of harassment, scrutiny and anguish. Even after her abuser was convicted, Georgia continued to see her images circulate online—a haunting reminder that, as she has stated, her life will never be the same again.
Another survivor is “Jodie”, who bravely spoke to the BBC about the trauma of being deepfaked by someone she once considered her best friend. Jodie discovered that images from her private Instagram account had been overlaid on pornographic material and posted across Reddit and other forums, with users invited to rate her body. Jodie endured this abuse for five years. She recalls:
“I felt alone. The emotional toll was enormous. There were points I was crying so much I burst the blood vessels in my eyes. I couldn’t sleep and when I did, I had nightmares.”
In Jodie’s case, the perpetrator was asking others to create explicit images of her, revealing a shameful grey area in our current legislation. That is why Jodie, along with campaign partners the End Violence Against Women coalition, Glamour and #NotYourPorn, is calling for an image-based abuse law.
Speaking as a mother, I cannot imagine having my child endure such horror. I am grateful that Baroness Owen of Alderley Edge has introduced a private Member’s Bill in the other place to address this gap. She has done a great deal of work on the issue, keeping victims like Georgia and Jodie at the heart of her Bill.
Georgia and Jodie’s experiences underscore three critical flaws in the Online Safety Act. The first is the glaring failure to criminalise abusive images themselves. Georgia’s story illustrates this brutal oversight: despite her abuser’s conviction, the absence of a stay-down provision allows her images to continue circulating online, forcing her to relive the trauma with each resurfacing. To quote Professor Clare McGlynn,
“every day these images remain online is another day of extreme suffering for victims.”
Survivors deserve certainty that once their abuse is addressed, it is addressed permanently.
A second flaw in the Act is its reliance on Ofcom, whose current enforcement powers lack the agility and speed needed for an online world in which, if one website is blocked, another can appear instantly. Initiatives such as the StopNCII.org campaign have revealed how social media platforms consistently outmanoeuvre Ofcom. This effectively leaves tech giants to determine whether supporting survivors like Georgia serves their profit-driven interests. To close the enforcement gaps, I stand with the End Violence Against Women coalition, Glitch and others in calling for a national online abuse commission—a dedicated body to champion the rights of victims and survivors of online abuse.
Finally, our legislation fails survivors by denying them accessible civil remedies—such as immediate take-downs and compensation for emotional harm—outside the criminal process. For survivors such as Jodie who have endured years of abuse, the inability to seek swift relief without a lengthy, retraumatising trial is a devastating gap. Creating a statutory civil offence for image-based abuse would not only empower survivors to seek redress directly against perpetrators and platforms, but give them that all-important second chance. The Minister will know that organisations such as South West Grid for Learning and the UK Safer Internet Centre regard civil remedies as much-needed lifelines for survivors. I wholeheartedly agree.
Today, through Georgia and Jodie’s stories, we have seen the devastating cost of our inaction on the escalating, ever-evolving crisis of image-based abuse. For too long, our legislation has had three glaring deficiencies: the absence of a stay-down provision, the lack of an online abuse commission and the unavailability of civil remedies.
Returning to my earlier questions, I want to be able to tell survivors that this Government are doing everything possible to support them. I want to reassure them that our Ministers are responding in real time to the scale and urgency of the crisis. With every day we delay, more women and girls are thrust into cycles of harm without the protections that they urgently need and deserve. I look forward to hearing from the Minister exactly how we will deliver this assurance. I would also be grateful if I could discuss the matter further with the Secretary of State for Science, Innovation and Technology at the earliest opportunity.
Let us not wait another day to act. Survivors need real action, not just incremental change. We owe it to Georgia, Jodie and all those who have suffered.
I declare an interest: being very elderly, at one stage I was the Minister for Women and Equalities, and I was responsible for bringing forward the Revenge Porn Helpline. When that legislation came through, I was hopeful that that vital resource would be something temporary, and that one day it would be abolished because we did not need it any more. In actual fact, quite the opposite is true: it is busier than ever. As the hon. Member for Bolton North East said, it is catching some terrible perpetrators of the most horrific online abuse.
I was also the Minister for Digital and Culture who held the baton for a couple of years on the Online Safety Act 2023. I hope that legislation will offer more protection for the victims of this humiliating crime, which, as we know, disproportionately affects women. But technology moves so fast. I am concerned that, despite the protections in the Act and the Revenge Porn Helpline, the emergence of deepfakes in particular has opened up a new front in the war on women—I say that because 99% of pornographic deepfakes are of women. Literally tens of millions of deepfake images are being produced every year, most of them sexual, as the hon. Member for Bolton North East said.
The fact that the use of nudification apps and the creation of ultra-realistic deepfake porn for private use are still legal and, worse, becoming more popular is a war on women’s autonomy. It is a war on our dignity, and a war on our identity. The creation of these unpleasant sexual or nude deepfakes serves to push us out of those spaces and to undermine and silence us, both online and offline. We must do everything we can to stand against it.
I am sure the Minister agrees that we owe a debt to my noble Friend Baroness Owen of Alderley Edge for her work to legislate in this area. She encountered some hate of her own when she was appointed to the other place—she was too young and too female—but her very presence there indicates exactly why we need young women in both Houses to stand up against these injustices and bring them to the fore. I hope the Minister will do everything in her power to see my noble Friend’s fantastic Bill become law.
This Government are absolutely committed to tackling violence against women and girls, and to restoring trust so that victims know that the justice system sees them, hears them and takes them seriously. In our election manifesto, we promised to make tackling violence against women and girls a political priority—finally, after years of neglect—with a pledge to halve violence against women and girls over the next decade. It is an ambitious target, but I believe we can do it.
Tackling online abuse is crucial. As outlined so eloquently by my hon. Friend the Member for Bolton North East, the statistics are clear, but behind them are real people—real victims. Many of us will have experienced it ourselves, or know friends or family who have. Women have the right to feel safe in every space, online and offline. The rise in intimate image abuse is utterly devastating for victims, but it also spreads misogyny on social media, which can develop into potentially dangerous relationships offline. It is truly an abhorrent crime, which is why the Government are determined to act. It will not be easy and we are just at the start, but we will use all the tools available to us to tackle it.
Let me set out some of the work we are doing right now. First, it is vital that our criminal law is equipped to deal effectively with this behaviour. A range of criminal offences tackle intimate image abuse, whether online or offline. That includes offences of voyeurism and sharing or threatening to share intimate images without consent. However, the current law has developed in piecemeal fashion, with new offences introduced over many years to address different forms of offending. The result is a patchwork of offences with known gaps in protection for victims. For example, while it is currently an offence to share a deepfake of an intimate image without consent, it is not an offence to make one. That is why the Government’s manifesto included a commitment to ban the creation of degrading and harmful sexually explicit deepfakes. This is not porn; this is abuse. We are looking at options to swiftly deliver that commitment in this Session of Parliament. We will consider what further legislative measures may be needed to strengthen the law in this area.
While intimate image abuse rightly has serious criminal consequences, we also need to tackle the prevalence of such content online. That is why, on 12 September, we laid before the House a statutory instrument to add the new criminal offences of sharing or threatening to share intimate images to the list of priority offences under the Online Safety Act. This strengthens the duties on providers to prioritise tackling intimate image abuse under the Act by holding them responsible for stopping the spread.
Ofcom is working on the illegal harms codes of practice, which will take effect next year, and is already working with the tech companies to ensure that the Online Safety Act is implemented quickly and effectively. Firms will also need to start risk-assessing for that illegal content by the end of this year. Ofcom will have robust enforcement powers available to use against the companies that fail to fulfil their duties. It will be able to issue enforcement decisions that may include fines of up to £18 million or 10% of qualifying global revenue in the relevant year—whichever is higher. The Online Safety Act also means that when users report illegal intimate image abuse content, platforms will be required to have systems and processes in place to remove it.
It is important that the police respond robustly to such crimes. We have heard the importance of that today. In our manifesto, we committed to strengthening police training on violence against women and girls. We must ensure that all victims of VAWG have a positive experience when dealing with the police. That is essential to increasing the reporting of these crimes and delivering better outcomes for victims. We will work closely with the College of Policing and the National Police Chiefs’ Council to improve and strengthen training for officers. This is a start, but I am clear that it is not the be-all and end-all of tackling intimate image abuse. We can and must do more. If we want to see true and lasting change, we need a culture shift. I have said this before and I will keep saying it: we need everyone, especially men, to play their part in slowly but surely, bit by bit, wearing away outdated views and misogyny to ensure women are safe, wherever they are.
Finally, we need to ensure that when someone has been the victim of intimate image abuse, they get the support that they need and know that they as victims and survivors have done nothing wrong. A key part of that is the invaluable work of victim support organisations such as the intimate image abuse helpline, which is funded by Government and was set up by the hon. Member for Gosport (Dame Caroline Dinenage). Not only do these services provide high-quality support and advice to victims of intimate image abuse, but they work with law enforcement and others to improve the response to these awful crimes. Representatives from the helpline recently gave evidence to the Women and Equalities Committee on this very issue, and I am grateful to them for all that they do to support victims. Their work is more valuable and more needed than ever.
On victim support, the Ministry of Justice funds many other services to help victims cope and recover from the impact of crime. For example, we have the rape and sexual abuse support fund, which supports more than 60 specialist support organisations. As others have mentioned, we also have Refuge, which the Government fund to deliver a specific tech abuse function. It has been at the forefront of the response to tech abuse. We also provide police and crime commissioners with annual grant funding to commission local, practical, emotional and therapeutic support services for victims of all crime types, not just intimate image abuse.
The Victims and Prisoners Act 2024 will aim to improve support to victims of sexual abuse, including intimate image abuse, by placing a duty on local commissioners to collaborate when commissioning support services so that victims and survivors get the support that they actually need. That brings me back to the key point: collaboration, with everyone pulling together and playing their part. That is what we need if we are going to truly see a shift. Again, I thank my hon. Friend the Member for Bolton North East for securing the debate and I thank everyone for coming and showing support. It really is important that we have good representation in Parliament. We are absolutely committed to tackling violence against women and girls, as are this Government, and we are just at the start of it.
Question put and agreed to.
Contains Parliamentary information licensed under the Open Parliament Licence v3.0.