Historical images made with AI recycle colonial stereotypes and bias – new research
Generative AI is known to mirror sexist and racist stereotypes, but it also carries a colonial bias that is reinforcing outdated ideas about the past.
Olli Hellmann, Associate Professor of Political Science, University of Waikato
24 October 2025
Museum of New Zealand Te Papa Tongarewa
Generative AI has revolutionised how we make and consume images. Tools such as Midjourney, DALL-E and Sora can now conjure anything, from realistic photos to oil-like paintings – all from a short text prompt.
These images circulate through social media in ways that make their artificial origins difficult to discern. But the ease of producing and sharing AI imagery also comes with serious social risks.
Studies show that by drawing on training data scraped from online and other digital sources, generative AI models routinely mirror sexist and racist stereotypes – portraying pilots as men, for example, or criminals as people of colour.
My forthcoming research finds generative AI also carries a colonial bias.
When prompted to visualise Aotearoa New Zealand’s past, Sora privileges the European settler viewpoint: pre-colonial landscapes are rendered as empty wilderness, Captain Cook appears as a calm civiliser, and Māori are cast as timeless, peripheral figures.
As generative AI tools become increasingly influential in how we communicate, such depictions matter. They naturalise myths of benevolent colonisation and undermine Māori claims to political sovereignty, redress and cultural revitalisation.
‘Sora, what did the past look like?’
To explore how AI imagines the past, OpenAI’s text-to-image model Sora was prompted to create visual scenes from Aotearoa New Zealand’s history, from the 1700s to the 1860s.
The prompts were deliberately left open-ended – a common approach in critical AI research – to reveal the model’s default visual assumptions rather than prescribe what should appear.
Because generative AI systems operate on probabilities, predicting the most likely combination of visual elements based on their training data, the results were remarkably consistent: the same prompts produced near-identical images, again and again.
Two examples help illustrate the kinds of visual patterns that kept recurring.
Sora-generated image from the prompt ‘New Zealand in the 1700s’.
In Sora’s vision of “New Zealand in the 1700s”, a steep forested valley is bathed in golden light, with Māori figures arranged as ornamental details. There are no food plantations or pā fortifications, only wilderness awaiting European discovery.
This aesthetic draws directly on the Romantic landscape tradition of 19th-century colonial painting, such as the work of John Gully, which framed the land as pristine and unclaimed (so-called terra nullius) to justify colonisation.
Sora-generated image from the prompt ‘a Māori in the 1860s’.
When asked to portray “a Māori in the 1860s”, Sora defaults to a sepia-toned studio portrait: a dignified man in a cloak, posed against a neutral backdrop.
The resemblance to cartes de visite photographs of the late 19th century is striking. Such portraits were typically staged by European photographers, who provided props to produce an image of the “authentic native”.
It’s revealing that Sora instinctively reaches for this format, even though the 1860s were defined by armed and political resistance by Māori communities, as colonial forces sought to impose British authority and confiscate land.
Recycling old sources
Visual imagery has always played a central role in legitimising colonisation. In recent decades, however, this colonial visual regime has been steadily challenged.
As part of the Māori rights movement and a broader historical reckoning, statues have been removed, museum exhibitions revised, and representations of Māori in visual media have shifted.
Yet the old imagery has not disappeared. It survives in digital archives and online museum collections, often de-contextualised and lacking critical interpretation.
And while the precise sources of generative AI training data are unknown, it is highly likely these archives and collections form part of what systems such as Sora learn from.
Generative AI tools effectively recycle those sources, thereby reproducing the very conventions that once served the project of empire.
Imagery that portrays colonisation as peaceful and consensual can blunt the perceived urgency of Māori claims to political sovereignty and redress through institutions such as the Waitangi Tribunal, as well as calls for cultural revitalisation.
By rendering Māori of the past as passive, timeless figures, these AI-generated visions obscure the continuity of the Māori self-determination movement for tino rangatiratanga and mana motuhake.
An AI-generated social media post visualising history from a Māori perspective. Facebook
AI literacy is the key
Across the world, researchers and communities are working to decolonise AI, developing ethical frameworks that embed Indigenous data sovereignty and collective consent.
Yet visual generative AI presents distinct challenges, because it deals not only in data but also in images that shape how people see history and identity. Technical fixes can help, but they each have their limitations.
Extending datasets to include Māori-curated archives or images of resistance might diversify what the model learns – but only if done under principles of Indigenous data and visual sovereignty.
Addressing the bias in algorithms could, in theory, balance what Sora shows when prompted about colonial rule. But defining “fair” representation is a political question, not just a technical one.
Filters might block the most biased outputs, but they can also erase uncomfortable truths, such as depictions of colonial violence.
Perhaps the most promising solution lies in AI literacy. We need to understand how these systems think, what data they draw on, and how to prompt them effectively.
Approached critically and creatively – as some social media users are already doing – AI can move beyond recycling colonial tropes to become a medium for re-seeing the past through Indigenous and other perspectives.
Olli Hellmann does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
This article is republished from The Conversation under a Creative Commons license.