
New: Deepfake Nudes and Why Google Must Act


The world may finally be waking up to the reality of deepfake porn, with news last week that young boys in Spain used AI "nudify" apps to turn ordinary social media images of their female schoolfriends into fake nudes, all without the girls' consent or knowledge.

You can read more about this case and what needs to be done in my new blog with Fiona Vera-Gray: Fake Porn, Real Victims – We must stop the easy use of AI to create nude images of women and girls.


Millions have been using these websites and apps powered by AI to generate deepfake porn/nudes – to ‘nudify’ images of clothed women and girls – with little consequence. While the Spanish police are investigating this case, criminal sanctions must not be the only focus.


We argue that we must set our sights on the search engines, such as Google, that rank these websites highly, and on the mainstream porn industry, which legitimises and normalises non-consent. The EU must also ensure that its draft directive on violence against women and girls is comprehensive, as I have argued in my research with Carlotta Rigotti.


I also contributed to a report by the US-based ABC News calling for Google to de-rank deepfake and deepnude websites.



Change is happening. Following the ABC News report, PayPal and Visa have withdrawn their services from the app used in the Spanish case. Positive first steps.

Further information:

  • Read more about my research on deepfake porn and what legal and policy changes are needed to challenge this abuse here.

  • Watch the powerful new documentary on the realities and harms of deepfake porn featuring #NotYourPorn, politician Cara Hunter and myself here.

  • Join the campaign by MyImageMyChoice to get action against website MrDeepfakePorn here.
