
    X is facilitating nonconsensual sexual AI-generated images. The law – and society – must catch up

    These images are being shared in an attempt to harass, demean or silence individuals.

    Giselle Woodley, Lecturer and Research Fellow in Communications, Edith Cowan University, Nicola Henry, Professor, Australian Research Council Future Fellow, & Deputy Director, Social Equity Research Centre, RMIT University
    The Conversation


    X (formerly Twitter) has become a site for the rapid spread of artificial intelligence-generated nonconsensual sexual images (also known as “deepfakes”).

    Using the platform’s own built-in generative AI chatbot, Grok, users can edit images they upload through simple voice or text prompts.

    Various media outlets have reported that users are using Grok to create sexualised images of identifiable individuals. These have been primarily of women, but also of children. These images are openly visible to users on X.

    Users are modifying existing photos to depict individuals as unclothed or in degrading sexual scenarios, often in direct response to their posts on the platform.

    Reports say the platform is currently generating one nonconsensual sexualised deepfake image a minute. These images are being shared in an attempt to harass, demean or silence individuals.

    A former partner of X owner Elon Musk, Ashley St Clair, said she felt “horrified and violated” after Grok was used to create fake sexualised images of her, including of when she was a child.

    Here’s where the law stands on the creation and sharing of these images – and what needs to be done.

    Image-based abuse and the law

    Creating or sharing nonconsensual, AI-generated sexualised images is a form of image-based sexual abuse.

    In Australia, sharing (or threatening to share) nonconsensual sexualised images of adults, including AI-generated images, is a criminal offence under federal law and under most state and territory laws.

    But outside of Victoria and New South Wales, it is not a criminal offence to create AI-generated, nonconsensual sexual images of adults or to use the tools to do so.

    It is a criminal offence to create, share, access, possess and solicit sexual images of children and adolescents. This includes fictional, cartoon or AI-generated images.

    The Australian government has plans underway to ban “nudify” apps, with the United Kingdom following suit. However, Grok is a general-purpose tool rather than a purpose-built nudification app. This places it outside the scope of current proposals targeting tools designed primarily for sexualisation.


    Read more: Australia set to ban 'nudify' apps. How will it work?


    Holding platforms accountable

    Tech companies should be made responsible for detecting, preventing and responding to image-based sexual abuse on their platforms.

    They can ensure safer spaces by implementing effective safeguards to prevent the creation and circulation of abusive content, responding promptly to reports of abuse, and removing harmful content quickly when made aware of it.

    X’s acceptable use policy prohibits “depicting likenesses of persons in a pornographic manner” as well as “the sexualization or exploitation of children”. The platform’s adult content policy stipulates content must be “consensually produced and distributed”.

    X has said it will suspend users who create nonconsensual AI-generated sexual images. But post-hoc enforcement alone is not sufficient.

    Platforms should prioritise safety-by-design approaches. This would include disabling system features that enable the creation of these images, rather than relying primarily on sanctions after harm has occurred.

    In Australia, platforms can face takedown notices for image-based abuse and child sexual abuse material, as well as hefty civil penalties for failure to remove the content within specified timeframes. However, it may be difficult to get platforms to comply.

    What next?

    Multiple countries have called for X to act, including by implementing mandatory safeguards and stronger platform accountability. Australia’s eSafety Commissioner Julie Inman Grant is seeking to shut down this feature.

    In Australia, AI chatbots and companions have been flagged for further regulation. They are included in the impending industry codes designed to protect users and regulate the tech industry.

    Individuals who intentionally create nonconsensual sexual deepfakes play a direct role in causing harm, and should be held accountable too.

    Several jurisdictions in Australia and internationally are moving in this direction, criminalising not only the distribution but also the creation of these images. This recognises that harm can occur even in the absence of widespread dissemination.

    Individual-level criminalisation must be accompanied by proportionate enforcement, clear intent thresholds and safeguards against overreach, particularly in cases involving minors or lack of malicious intent.

    Effective responses require a dual approach. There must be deterrence and accountability for deliberate creators of nonconsensual sexual AI-generated images. There must also be platform-level prevention that limits opportunities for abuse before harm occurs.

    Some X users are suggesting individuals should not upload images of themselves to X. This amounts to victim blaming and mirrors harmful rape culture narratives. Anyone should be able to upload their content without being at risk of having their images doctored to create pornographic material.

    Hugely concerning is how rapidly this behaviour has become widespread and normalised.

    Such actions indicate a sense of entitlement, disrespect and lack of regard for women and their bodies. The technology is being used to further humiliate certain populations, for example by sexualising images of Muslim women wearing the hijab, headscarves or tudungs.

    The scale of the Grok sexualised deepfakes incident also reveals a widespread lack of empathy, a poor understanding of consent and a disregard for it. Prevention work is also needed.

    If you or someone you know has been impacted

    If you have been impacted by nonconsensual images, there are services you can contact and resources available.

    The Australian eSafety Commissioner currently provides advice on Grok and how to report harm. X also provides advice on how to report to X and how to remove your data.

    If this article has raised issues for you, you can call 1800RESPECT on 1800 737 732 or visit the eSafety Commissioner’s website for helpful online safety resources.

    You can also contact Lifeline crisis support on 13 11 14 or text 0477 13 11 14, Suicide Call Back Services on 1300 659 467, or Kids Helpline on 1800 55 1800 (for young people aged 5–25). If you or someone you know is in immediate danger, call the police on 000.


    Giselle Woodley receives funding from the Australian Research Council for her research, from the Australian Human Rights Commission as an expert advisor on the "on your terms" consent survey, and from the Daniel Morcombe Foundation for guest speaking at events concerning children and young people's online safety.

    Nicola Henry receives funding from the Australian Government Department of Social Services. She is also a member of the Australian eSafety Commissioner’s Expert Advisory Group.

    This article is republished from The Conversation under a Creative Commons license.
