Whose artificial intelligence? Reflecting on the intersection of AI and Te Ao Māori
This article was written by Lauren Skogstad | Head of Experience | Ngāti Raukawa, who presented on this topic at Design Assembly’s Design & AI Autumn Conversations event in Pōneke last month.
Ko wai au? Kia ora, my whānau whakapapa to Ngāti Raukawa, in the Manawatū region. My father’s whānau are Wereta, and my mother is Nes, from the Netherlands.
My own journey of rediscovering what it is to be Māori started quite recently. I was overseas for nine years, and when I returned I saw a warm embrace of te reo Māori. After my daughter Scarlett was born, I became curious and eager to understand, discover and explore my heritage. What will I say to her when she asks me who she is?
I lead the research, design and content team at Springload. My background is mainly in digital research and design. It is through my work here, and the wonderful support of my colleagues, that I am now able to explore and understand what it is to be Māori.
In the race to adopt new technologies, everyone is sprinting to get their hands on AI: to see what it can do, what efficiencies it can gain, and what is possible through exploration and experiment. We know these tools could shift our roles and challenge what we do each day, so we are seeking answers about what this new world looks like.
The designer in me feels excited for this future. Imagining these new AI tools, I like to call them “dreaming engines” — tools that give us the ability to reimagine even the most complex design problem. But as I dig further into how these tools are made, and what they can do, my instinct is calling me to question what I see.
My heart jumped when I discovered that ChatGPT somehow knew Māori. Joel Maxwell, a writer for Stuff, put it well: “But here I am, being reassured in te reo Māori by a stranger with no face, no consciousness, no whakapapa, no body — but an apparent understanding of the indigenous language of Aotearoa. It claims to want to help me, but then it lies about its education. It claims to be sitting at its office desk at this very moment, which I know for a fact is untrue.”
Most technologies have a poor understanding of te reo Māori. When I saw what ChatGPT can do, I felt a sense of grief and loss for my own language. Many Māori cannot speak te reo, and now ChatGPT can. Desperately, I wanted to know who had taught the engine. Where did it get our mātauranga?
I find it amazing that these tools have the ability to imitate and generate karakia or whakataukī. As Dr Karaitiana Taiuru PhD, JP, ACG, MInstD, RSNZ puts it, “I find it remarkable that…while rudimentary, ChatGPT has the ability to create a karakia for atua.”
I understand that our mātauranga is publicly accessible. However, the issue is how it has been taken and how it is transformed. What is the intention of the people who made these tools? Is it possible this is yet another page in the great story of colonisation?
The right to understand our culture and our language was taken from us. In 1987, New Zealand passed the Māori Language Act, declaring te reo Māori, the language spoken by the country’s indigenous Māori people, an official language.
Decades of repression had put our language under serious threat: by 1960 only one in four Māori spoke it, and very few of those speakers were children.
Today there is a concerted effort underway to revitalise our language. How do tools made in America without us, like ChatGPT, help or hinder our journey?
Defining artificial intelligence
When thinking about artificial intelligence in the most basic sense, I consider how the corpus of data was made — including the sources it draws from, how the data is processed, and the output.
On top of that, we have the organisation or people who are making the tools, their intentions, and any biases they hold. A tool is a reflection of the values of the people who made it.
It is important to realise that AI tools like ChatGPT do not simply retrieve information from the internet. They also have the ability to hallucinate — to invent. In the context of exploring ideas — the dreaming engine — this could be a cool feature. But in the context of education and communication, given the prevalence of misinformation, hacking, and fake news, it could be a bug. What happens to our mātauranga in this context?
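The point that these tools generate rather than retrieve can be made concrete with a toy sketch. This is emphatically not how ChatGPT works internally — real systems use large neural networks — but even a tiny bigram model, trained on a handful of invented sentences, will fluently emit word sequences that appear in none of its sources. The corpus and all names below are illustrative only:

```python
import random

# Toy bigram "language model": record which word follows which,
# then generate by sampling. It recombines; it does not retrieve.
corpus = [
    "the model learns patterns from text",
    "the model generates new text from patterns",
    "new text can mix sources in surprising ways",
]

random.seed(0)
follows = {}
for sentence in corpus:
    words = sentence.split()
    for a, b in zip(words, words[1:]):
        follows.setdefault(a, []).append(b)

def generate(start, length=8):
    """Sample a chain of words starting from `start`."""
    out = [start]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:
            break
        out.append(random.choice(options))
    return " ".join(out)

# Fluent output that need not match any source sentence.
print(generate("the"))
```

The generated line reads plausibly, yet it may be a sentence nobody ever wrote — which is exactly the feature-or-bug tension described above.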
We are in a position to decide how we respond to a given AI tool based on who controls it, who made it, why they made it, how it works, who gains from it, and the principles and values that guided their practice.
We can also decide what we use these technologies for, and what practices we can put in place to ensure that they are built in a way that doesn’t perpetuate colonisation.
AI has the capacity to reinforce historical injustice and to cement forms of racial and gendered inequality. For these reasons an alternate set of values, paradigms, and priorities is urgently needed.
The whakapapa of data
As eloquently put by Whose Knowledge, “Structured data, the pieces of information that are easily read, understood, and processed by machines, is at the core of how the internet works… Yet, these databases present and organise elements in ways that are informed and structured by and around specific regulations, traditions, and epistemologies. As a result, in an attempt to categorise the world, they prescribe certain frames and worldviews.”
Issues of power and privilege are inherent in the ways knowledge is understood and the ways the internet is designed and experienced.
We know that the internet we have today is not multilingual enough to reflect the full depth and breadth of humanity, and most online knowledge today is created and accessible only through colonial languages.
If the information, models and structures for these tools are built on colonisation, what does that mean for how artificial intelligence could be inclusive?
Is it possible that because these models are built within a society that prioritises a certain culture, we could erase diversity?
Kia whakatōmuri te haere whakamua
In Māoridom we have a saying, “Kia whakatōmuri te haere whakamua: I walk backwards into the future with my eyes fixed on my past”.
Despite the dark nature of our past, Māori have long been strong protectors of our language. Michael Running Wolf, a computer scientist working on indigenous language technology, expresses how he is inspired by our strength to defend our culture. Sadly, in some Native American cultures the mother tongue is gone — or, as he would put it, it has gone to sleep (10).
He looks at how Māori defended te reo via the idea of a language nest — the creation of a safe space where the language is spoken, with immersion camps and education. I am grateful for this protection. I am grateful they thought of our mokopuna. The protection of our mātauranga is now critical for this next chapter (10).
Towards indigenous AI – Te Hiku
Te Hiku are a small Māori non-profit media organisation. While building trust and accountability, they have developed a corpus of data spanning 30 years, rich in everyday interviews and conversational speech. It is now the largest archive of Māori-language audio in the iwi radio network.
Te Hiku are driven to revitalise indigenous languages to benefit communities. They have used machine learning to build language, speech recognition, speech synthesis, and real-time pronunciation models, ushering in a new generation of speech tools to empower Māori to ensure their language has a place in the digital world.
What is powerful and inspiring about what they have done is the beautiful intention behind it. Built into the values of their business, they connect with communities to build their tools. They have a Kaitiakitanga Licence that ensures they maintain accountability, and protect and take responsibility for the data.
“We’ve lost our sovereignty over our land. We want to make sure that our cultural knowledge, language and communities are not damaged in the same way as it was prior. It is critical for us to not further harm the language due to colonisation.” – Accelerating the revitalisation of te reo Māori with AI
Recently, Te Hiku were approached by an American technology company with an offer to purchase the data. Te Hiku rejected this and published a statement explaining that they wanted to retain indigenous knowledge and help revitalise the Māori language.
What I appreciate about Te Hiku is that I can see how the data was made, who made it, why they made it, how they made it, and the principles and values they stand by.
Towards indigenous AI – the five tests
Māori scholar Sir Hirini Moko Mead has developed a set of five tests that reframe the purpose and aim of AI tools according to Māori knowledge systems (8). His thought is that AI might benefit from indigenous wisdom. Moving away from the often-narrow western focus on utility and efficiency towards concepts of harmony with others, deeper ecological understanding, and close kinship networks, the five tests are:
- Tapu: Tapu provides a concrete antidote to universalisation. AI technologies must be designed with particular people and places in mind. It is both arrogant and inadequate to assume that one model can serve a global audience.
- Mauri: The Mauri test is essentially a test of the risks to the life of the subjects.
- Take-utu-ea: Take-utu-ea is fundamentally about restoring balance, about making things right through an exchange of some kind.
- Precedent: This test looks for events in our traditions that might help us understand the issue and help frame a response. Kia whakatōmuri te haere whakamua: I walk backwards into the future with my eyes fixed on my past.
- Test 5.1 is whanaungatanga — connecting people.
- Test 5.2 is manaakitanga — your purpose, paying respect to others.
- Test 5.3 is mana — technology should not damage the mana of someone in how it is made or how it is used.
- Test 5.4 is noa — how do we shift technology into the realm of noa, making it everyday instead of novel?
- Test 5.5 is tika — “whether something is ethically, culturally, spiritually, and medically right.”
The framing of test four is particularly inspiring in terms of how we can reframe the way we operate in society today. Luke Munn draws on Sir Hirini Moko Mead’s ideas in his explanation of pūrākau:
One pūrākau states that there were three baskets of knowledge in the heavens that contain all of humanity’s knowledge. Tāne was sent to retrieve these baskets (kete), battling his older brother Whiro and overcoming obstacles to ascend through layers of heaven and retrieve the prized possessions. The kete-aronui contained knowledge that could help humans; the kete-tuauri housed the knowledge of ritual, memory and prayer; and the kete-tuatea held knowledge of evil which was harmful to humans. Karaitiana Taiuru (2018) argues that data is today’s knowledge basket, a container housing a rich treasure of information regarding all of life.
As in the narrative, this information is powerful, granting particular advantages to those who possess it. Data is a resource, a treasure for the twenty-first century, but like other resources throughout history, it is often dominated, controlled, or co-opted by colonisers. So, just like the story, this data should not be left to others but should be grasped, or at least contested.
These tests and principles help guide us into the future we are about to embark on. They give us insight into a new way of thinking that will help us protect our culture and the futures of our mokopuna.
A way forward
It is clear that AI needs a new set of laws, regulations and considerations so that what we build is safe and good for our people. How will we be able to trust the sources of information we see?
Recently, an open letter entitled “Pause Giant AI Experiments” was published by the Future of Life Institute. It asks that development of the most powerful AI systems, particularly generative AI, be paused for six months, giving society time to catch up with these technologies. As the letter states, “powerful AI systems should be developed only once we are confident that their effects will be positive and their risks will be manageable.”
When it comes to using these tools in our day-to-day lives, I ask you to consider what you are using them for, and to think about the ethical and cultural implications. Using them to speed up a quick proposal or a set of functional requirements may be fine, as long as you read the output, but be careful with anything that has a cultural component. Designing products for our people needs to be done by Māori hands.
I would ask companies to start considering how they might develop their own policy on how they use AI and what they use it for.
When thinking about making designs, understanding people’s needs, and solving problems for clients, consider what value you bring to the table. The value is not in the flashy tool, it is in the connections and relationships we hold and how we support each other to solve problems.
If this AI knows everything, what does it really know? We only give it a voice and a power when we use it.
The question is how we can release that power and build an indigenous perspective into what we do.
Sources
- Speaking my indigenous language with new AI
Joel Maxwell, 19 March 2023
- Chatbots need Indigenous Beta testers
Dr Karaitiana Taiuru PhD, JP, ACG, MInstD, RSNZ, 17 February 2023
- Decolonizing the Internet: What is it about
Whose Knowledge?, no date
- Getty Images Statement
Getty Images, 17 January 2023 (image: Artur Debat)
- Decolonizing the Internet 2018 Summary Report
Whose Knowledge?, 23 October 2018
- Accelerating the revitalisation of te reo Māori with AI
AI for Good, 26 July 2022
- Te Reo Māori Speech Recognition
- The five tests: designing and evaluating AI according to indigenous Māori principles
Luke Munn, 16 February 2023
- Kaupapa Māori concept modelling for the creation of Māori IT Artefacts
Kevin Shedlock (Ngāpuhi, Ngāti Porou, Te Whakatōhea) & Petera Hudson (Te Whakatōhea)
- ChatGPT threatens language diversity. More needs to be done to protect our differences in the age of AI https://www.rnz.co.nz/news/on-the-inside/484062/chatgpt-threatens-language-diversity-more-needs-to-be-done-to-protect-our-differences-in-the-age-of-ai
Collin Bjork, The Conversation, 11 February 2023
- An Indigenous Perspective on Generative AI
Technology press, 29 January 2023
- ‘I follow the trail of blood’
Joanna Kidman, 20 February 2021
- OpenAI Used Kenyan Workers on Less Than $2 Per Hour to Make ChatGPT Less Toxic
Time, 18 January 2023
- Elon Musk, other tech leaders call for pause in ‘out of control’ AI race
Finn Hogan, CNN, 30 March 2023
- Pause Giant AI Experiments: An Open Letter
Future of Life Institute
About the author:
Lauren Skogstad | Ngāti Raukawa | Head of Experience at Springload
Lauren leads the strategy, research, design and content practice and methodology at Springload. Lauren has a passion for creating meaningful and culturally authentic experiences that enrich people’s everyday lives. She believes in the power of strong partnerships, collaborative design and people-based problem solving. Lauren has 16+ years’ experience across Wellington, London and Melbourne. She’s worked with ANZ, Adairs, Jetstar, RMIT, Holden and Wesley College to create beautiful digital experiences. For Springload, she’s worked on projects for the Ministry of Social Development, Tertiary Education Commission, New Zealand Customs Service, Engineering New Zealand, ACC, and the Department of Internal Affairs.