The Ethics of AI Representation: Why Authenticity Matters for First Nations People

Indigenous Business Australia's use of AI-generated imagery for their NAIDOC Week promotion this week has highlighted the need for us to talk more about digital ethics and authentic representation in communications, particularly in relation to Aboriginal and Torres Strait Islander peoples (hereinafter referred to interchangeably as Blackfullas, Blak people, Indigenous people, First Nations people, and mob).

While the post of three obviously AI-generated Blackfullas (which were later described as "faces based on some of our staff, a mix of Aboriginal, Torres Strait Islander, and Samoan individuals") and a caption that showed all the telltale signs of AI-generated text might seem like an isolated incident, it reflects a broader and troubling pattern in how First Nations people are communicated about and for.

This pattern demands our attention and action.

A Pattern of Whitewashing Representations of Blakness

I can't pinpoint exactly when I first noticed professional organisations needlessly using poorly constructed, AI-generated images of First Nations people. The use of AI to create what these programs imagine a Blackfulla should look like, and, related to this, what "Aboriginal art" is, has become an increasingly frequent point of discussion within my networks.

Mob have had conversations, both online and offline, about representation and the curation of our culture and identities by algorithms and programs grounded in non-Indigenous knowledge and perspectives. We've spoken about the level of disrespect it shows towards our sacred culture, ancestors, Elders and people, and how AI-generated imagery presents Blak people, Blak art and content about us as a whitewashed, watered-down, tokenistic reflection of how others see us, rather than how we are.

AI-generated imagery of First Nations people often reproduces colonial tropes embedded in the data it draws from. And, to clarify, this is not necessarily AI's fault.

By its very nature, AI, particularly open source AI, learns from and responds to the information it is given. This can be a positive thing: it can assist with tasks or draft an email in your tone of voice with an already established understanding of the context. When it comes to Blackfullas, however, it is fundamentally limited.

When using AI for communications, we must always keep this in the back of our minds: while an AI program can learn and respond accordingly, it cannot provide quality assurance on what we as First Nations people see as appropriate representation and respectful communication.

When we reflect on how these AI-generated representations fit into the broader historical and current context of how our people and stories are told, we begin to see just how much "art" continues to mirror long-standing racial and cultural narratives.

Dr Chelsea Watego, a Munanjahli and South Sea Islander academic, has written extensively about the othering of First Nations people in the mainstream. She argues that "the problem is not with how Blackfullas appear, but with how whiteness insists on seeing us."

Building on this insight, Dr Amy McQuire, a Darumbal and South Sea Islander writer and journalist, speaks emphatically about how non-Indigenous people, particularly the media, shape the depiction of the "Black Witness" in ways that erase sovereignty and reinforce settler myths about who we are allowed to be.

Professor Aileen Moreton-Robinson, a Goenpul woman of the Quandamooka people, in her book Talkin' Up to the White Woman, reminds us that "whiteness is not seen to be raced but operates as the normative subject position." This invisibility of whiteness means that systems like AI, which are built on dominant data and norms, do not simply ignore Blakness; they overwrite it.

These insights make it clear that AI doesn't just reflect bias. It automates it.

Using artificial intelligence to represent First Nations people is not simply about convenience. It perpetuates a system that has historically displaced, misrepresented, and spoken over our voices.

And if an organisation like Indigenous Business Australia, respected by many and widely seen as a peak body representing First Nations people, is using AI to depict and to speak about us, what kind of message does that send to others?

The Australian Context: A History of Misrepresentation

Dominant non-Indigenous portrayals of Indigenous Australians, from anthropologists' dated texts to contemporary media coverage, have long been described by both academics and commentators as negative and stereotypical.

Non-Indigenous voices often dominate the narrative and frame Indigenous people as problems to be solved, rather than as sovereign people with agency, authority, and cultural depth. Studies have shown that Australian media frequently leans into tired tropes when reporting on Aboriginal issues.

We've spent decades trying to move past the "noble savage" image and the deficit-focused narratives that reduce our cultures to something broken or in need of fixing. While we push for nuance in conversations about us, the dominant narrative remains homogenous and not reflective of who we, as Blackfullas, know ourselves to be.

When organisations use AI to generate our words, our faces, our stories and our identities, they are continuing a long-standing pattern. It's a pattern that speaks for us instead of with us, choosing the easy path of approximation over the harder but more meaningful path of genuine engagement.

To take it one step further, when an organisation like Indigenous Business Australia blatantly uses AI in their visual and written communications, they are implicitly legitimising this as an approach that can be taken when communicating to and about us. Their communications don't say "we respect and value Blak people and voices"; instead, they indicate that grounding communications in real people and authenticity is less important than the time saved by using AI.

We, as First Nations people, often speak about the importance of decolonising many facets of modern society. Perhaps it's time we also turn our attention to the decolonisation of communication methodologies and outputs.

As our focus this coming week (NAIDOC Week) turns to legacy, among many other important themes, both Indigenous and non-Indigenous people should reflect on the role they are actively or passively playing in how Indigenous people and cultures are being represented and respected through visual and written communications in the digital age.

AI as a Tool, Not a Replacement

AI can be a valuable tool when used thoughtfully and with care.

I used it in a practical way while writing this piece. It helped me gather research, structure my thoughts, and refine my message. But I remained the author of this work and applied my own cultural lens, judgment, and lived experience throughout.

The issue isn't the technology itself. The problem arises when people use AI without care, without community, and without cultural oversight, particularly when creating representations of marginalised communities.

When that happens, what we end up seeing are shortcuts, not intention. We see content that lacks the care and cultural responsiveness that authentic representation requires.

A Time for Reflection and Action

NAIDOC Week 2025 is a time to celebrate the history, culture, and achievements of Aboriginal and Torres Strait Islander peoples. But it should also be a time for reflection.

For non-Indigenous people and organisations, this means taking a hard look at how you are representing First Nations people in your work, and how you are using new technologies to do so.

Ask yourself: Are you using AI in ways that amplify real Indigenous voices, or are you using it to replace them? Are you creating more opportunities for First Nations people to speak, or are you leaning on technology to stand in for genuine relationships and engagement?

The answers to these questions will help shape whether technology becomes a tool for greater inclusion or just another means of exclusion.

A Call to Action

As NAIDOC Week approaches, I challenge every organisation, agency and individual to commit to real and respectful representation. This includes:

  • Featuring actual First Nations people in your campaigns, communications, and content. If your organisation has images of mob, use them. If not, there are stock photo libraries that feature First Nations people. Canva, for instance, has commissioned a range of images from First Nations photographers that are available through its Pro subscription.

  • Building genuine relationships with First Nations communities rather than relying on technological shortcuts. Take the time to learn. Attend a talk, listen to a song, read a book. Let NAIDOC Week be the beginning of a deeper understanding.

  • Ensuring strong human oversight over any AI-generated content. That includes checking for cultural accuracy and sensitivity, with the creation of materials about us ideally led by, or at least involving, First Nations people.

  • Reflecting on your processes: Why are you using AI at all in this context? Is there a real reason you're not using real images, voices, or people? Think about how your communications will land with First Nations audiences, not just how they meet internal deadlines.

  • Owning your mistakes: If you get it wrong, and many of us will at some point, take responsibility. Apologise sincerely. Then act to make sure you do better next time.

At the time of writing this, Indigenous Business Australia has removed the AI posts from Instagram and Facebook and issued an apology. They explained that they were "pushing the boundaries with some fun AI" and didn't want to "stand in the background" while others were exploring new tools on social media.

That's all well and good, but I have to wonder: is uploading an image from DALL·E and a caption from ChatGPT really pushing any boundaries, other than the boundaries of what should and shouldn't be done with AI?

The technology exists to facilitate respectful, inclusive, and authentic representations of First Nations people in communications. The question is whether people will choose to use it in ways that centre our voices or continue to let algorithms define who we are.

Authenticity is not a trend. It's a principle grounded in justice, respect, and truth.

This NAIDOC Week, let's reflect on what we want our legacy to be and let's keep it real. Literally.
