Global progressive policing

Tackling Image-Based Sexual Abuse: Designing new legislation and strategies to prosecute perpetrators, protect victims, and improve preventative measures

Online

19th Jun 2025


Date of Event: Thursday, June 19th 2025

Time of Event: 9:30 AM to 1:00 PM BST

Place of Event: Webinar

In 2023 alone, the Revenge Porn Helpline reported nearly 19,000 cases of abuse, an increase from just 1,600 cases in 2019. Deepfake-related abuse has surged by 400% since 2017, with over 99% of these creations targeting women and girls. Child safety experts further warn that predators are now using AI to generate sexually explicit images of children in such volumes that police are being overwhelmed in their efforts to identify and rescue real-life victims.

In its general election manifesto, Labour pledged to halve levels of violence against women and girls within a decade, including by: introducing specialist rape and sexual offences teams in every police force, using tactics normally reserved for terrorism and organised crime; placing domestic abuse experts in 999 control rooms so that victims can talk directly to a specialist; and ensuring there is a legal advocate in every police force area to advise victims from the moment of report to trial. Labour also committed to banning the creation of sexually explicit deepfakes, strengthening the use of Stalking Protection Orders, giving women the right to know the identity of online stalkers, ensuring schools address misogyny and teach young people about healthy relationships and consent, and strengthening the rights of whistleblowers in the workplace in relation to sexual harassment. Furthermore, under government plans to update the Online Safety Act, the sharing of intimate images without consent will be made a ‘priority offence’, and social media firms will have to proactively remove this material and stop it appearing on their platforms.

In September 2024, the End Violence Against Women Coalition, #NotYourPorn, GLAMOUR UK and Clare McGlynn, Professor of Law at Durham University, launched a campaign to protect women online, led by Jodie*, a survivor of deepfake sexual abuse. The campaign calls for a dedicated Image-Based Abuse law to protect women’s rights online and offline, with measures that: strengthen criminal laws on creating, taking and sharing intimate images without consent (including deepfakes); improve civil laws so that survivors can take action against perpetrators and tech companies (including having abusive content taken down); prevent image-based abuse through comprehensive relationships, sex and health education; fund specialist services that support victims and survivors of image-based abuse; and create an online abuse commission to hold tech companies accountable for image-based abuse. Planned new laws to address deepfake sexual abuse fell through because of the 2024 general election, meaning that the creation of such imagery remains legal, leaving women and girls at risk and survivors without a route to justice.

This symposium will provide an invaluable opportunity for stakeholders – including law enforcement officers, e-crime and e-safety professionals, charities, and others – to review existing legislation relating to IBSA, examine the new Labour government’s plans in this area, and discuss further ways to tackle and deter IBSA and assess what further measures could be implemented. Delegates will also explore methods to raise awareness of the harms of such abuse and develop strategies to support survivors.

Programme

  • Learn about and assess trends in and drivers of image-based sexual abuse, government plans to tackle it, challenges, and avenues for improvement
  • Evaluate the impact of the Online Safety Act 2023 on IBSA and how the Act can be effectively implemented
  • Evaluate what a dedicated law specifically aimed at tackling IBSA and protecting women’s rights online and offline should look like
  • Develop strategies to tackle the rise and spread of AI-generated sexually explicit imagery
  • Exchange views on how social media platforms, search engines and other websites or apps which host user-generated content can be made to do more to tackle IBSA
  • Address the challenges of protecting groups particularly vulnerable to IBSA, including children, minoritised women and sex workers, and develop strategies for improving multi-agency cooperation around protecting vulnerable individuals
  • Implement specialist services that provide support to victims and survivors of IBSA

Who Should Attend?

  • Domestic Abuse Support Workers
  • Third Sector Practitioners
  • Central Government Departments and Agencies
  • Police Services
  • Women’s Aid Groups
  • Headteachers, Educators and Teachers
  • National helplines and online support services
  • Stalking and Harassment Specialists
  • Domestic Violence Co-ordinators
  • Local Criminal Justice Boards
  • Victim Support Representatives
  • Counselling Services
  • Sexual Assault Support Centres and Specialists
  • Independent Domestic Violence Advisors
  • Independent Sexual Violence Advisors
  • Women’s Sector Practitioners
  • Criminal Justice Practitioners
  • Judges and Magistrates
  • Legal Professionals
  • Police and Crime Commissioners
  • Police Community Support Officers
  • Children’s Services and Families Services Officers
  • Social Workers and Social Services Officers
  • Local Safeguarding Children Boards
  • Child and Adolescent Mental Health Practitioners
  • E-Safety Co-ordinators
  • E-Crime Experts
  • Social Networking Providers
  • Internet Service Provider Executives
  • Youth Workers and Youth Offending Teams
  • Probation Officers
  • Anti-Social Behaviour Coordinators
  • Community Support Officers
  • Local Authority Councillors and Officers
  • Academics and Researchers

