"

AI and Indigenous Perspectives

Indigenous scholars and communities have been at the forefront of raising critical questions about AI that extend beyond technical considerations to fundamental questions about knowledge, relationships, and responsibility.

Indigenous data sovereignty principles challenge assumptions about who can access, analyze, and benefit from information. When AI models are trained on Indigenous knowledge without consent or attribution, they perpetuate colonial extraction practices in digital form. When institutions adopt AI tools that treat all knowledge as equivalent and extractable, they risk undermining Indigenous epistemologies that emphasize relational and contextual ways of knowing.

Practical implications for teaching and learning leaders:

  • Review the research paper First Nations and Artificial Intelligence for an overview of specific risks.
  • Consult with Indigenous faculty, staff, and students before implementing AI tools that might interact with Indigenous knowledge or content.
  • Consider Indigenous pedagogies when using AI to develop course material.
  • Include Indigenous perspectives in AI governance structures, not as an afterthought but as foundational to decision-making.
  • Consider how AI policies align with or conflict with Indigenous protocols around knowledge sharing and attribution.
  • Support professional development that helps faculty understand Indigenous approaches to AI ethics, such as the Indigenous Perspectives on AI course mentioned in our resources section.


Together, these practices invite leaders to examine how Indigenous knowledge frameworks can challenge Western AI assumptions and reshape how intelligence is defined, designed, and governed.

License

AI Playbook for Teaching and Learning Leaders: A Community Guide Copyright © 2025 by Erin Aspenlieder and Sara Fulmer is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.