"

Leading Institutional Governance

What, Why and How

We have artificially separated the development of Guidelines from the development and implementation of an AI governance structure. In truth, your institution could decide that developing and implementing the governance structure ought to come first, and that Guidelines will be developed by the relevant committees, working groups or individuals.

Effective AI governance creates space for the difficult conversations your institution needs to have: what kind of learning do we value? How do we balance efficiency with educational depth? Who should make decisions about tools that impact both faculty autonomy and student experience? These conversations take time and require inclusive structures that can’t be rushed.

The governance challenge is particularly acute because AI intersects with nearly every aspect of institutional life. Unlike implementing a new learning management system (LMS) or updating accessibility policies, AI governance cuts across academic affairs, IT, legal, privacy, research, and student services. This breadth means that traditional governance structures may be insufficient.

Setting up committees with appropriate governance, terms of reference and membership necessarily takes time. For the purposes of developing Guidelines, we'd recommend instead creating an "ad hoc group" that can draft Guidelines quickly while the more deliberate work of governance is determined.

Across Canada, there are several models of institutional governance. Here we’ll outline some common models, and include examples from both colleges and universities where we can. We hope you’ll add additional examples or models that we can all learn from.

The section below was initially drafted using Claude 3.7, with editing for tone and structure, verification of claims and links, and the addition of further known examples.

Distributed Faculty Leadership Model

Some institutions have chosen to embed AI governance within existing faculty-led structures, treating AI decisions as fundamentally academic rather than administrative. This approach recognizes that faculty have primary responsibility for curriculum and pedagogy, and that AI governance should flow through academic senates, curriculum committees, and department-level decision-making.

The strength of this model lies in its alignment with traditions of academic freedom and faculty governance. Decisions about AI tools and policies emerge from disciplinary expertise and pedagogical judgment rather than top-down mandate. Faculty buy-in may be higher because the governance process respects existing academic authority.

The challenges are coordination and consistency. Faculty-led governance can produce thoughtful but fragmented approaches, with different departments reaching different conclusions about similar issues. It also tends to move slowly, which may frustrate urgent implementation needs.

Executive Leadership Model and/or Special Advisor Model

Some institutions have chosen to place AI governance responsibility directly with senior leadership by creating a dedicated executive-level position focused on AI strategy and oversight, or by creating roles focused on guiding and leading AI efforts across the institution.

Listen to Western University's Chief AI Officer, Mark Daley, talk about his role in a podcast episode of AI Dialogues.

This model signals strong institutional commitment by placing AI leadership at the highest level and/or by allocating resources for a specific AI-focused role. One advantage of this model is clear leadership accountability: responsibility is concentrated in a single position, which can also support faster decision-making.

Examples: Western University’s Chief AI Officer, Queen’s Special Advisor on Generative AI, McMaster’s Special Advisor on Generative AI, University of Windsor’s Institutional Advisor on AI

Advisory Committee or Task Force Model

The most common approach we’ve observed is establishing dedicated advisory committees or task forces focused on AI that report to senior leadership. The strength of this model lies in its inclusive approach that ensures AI governance encompasses multiple perspectives and institutional priorities. By creating domain-specific subcommittees, it allows for targeted expertise to be applied to different areas of implementation.

Listen to a podcast episode from AI Dialogues with U of T’s Susan McCann talking about the U of T Task Force.

However, advisory committees may move more slowly than executive-led models due to consultation expectations, and their effectiveness depends heavily on committee composition and engagement. There’s also a risk of creating recommendations that aren’t actionable if the committee lacks direct connection to implementation resources.

Example: McMaster University established an Artificial Intelligence Advisory Committee reporting to three co-sponsors (the Provost, Vice-President Research, and Vice-President Finance) with domain-specific expert panels addressing different aspects of AI implementation. Similarly, UBC created a Generative AI Steering Committee bringing together academic and functional leaders from across the university.

Standing Committee Integration Model

Rather than creating entirely new governance structures, some institutions have integrated AI governance responsibilities into existing standing committees. This approach leverages established governance pathways and ensures AI considerations are evaluated alongside related technological and pedagogical concerns. It typically requires fewer additional resources than creating new structures and may face less institutional resistance.

The downside is that AI considerations might receive insufficient attention among competing priorities, and existing committee membership may lack specialized AI expertise. There's also a risk of perpetuating siloed approaches if the standing committee has a narrow focus.

Example: The University of Waterloo addresses AI governance through its “Standing Committee on New Technologies, Pedagogies, and Academic Integrity,” convened by the Associate Vice-President, Academic. This committee’s mandate includes monitoring emerging tools, recommending pedagogical adaptations, and proposing policy modifications.

Research Center-Led Model

Some institutions are using their specialized AI research centers to lead or support governance efforts. The advantage of this model is that it leverages specialized technical expertise already existing within the institution and creates natural pathways between academic research and institutional policies.

However, it may overemphasize technical aspects at the expense of broader institutional considerations, and research centers may lack administrative experience or authority for policy implementation. There's also a risk that the focus might skew toward research applications rather than teaching and learning.

Example: Seneca Polytechnic’s Centre for Innovation in AI Technology (CIAIT) supports small and medium-sized organizations in adopting AI technologies while also informing institutional practices. While not solely responsible for institutional governance, specialized centers like CIAIT provide crucial expertise that informs policy development.

Professional Development-Driven Model

Some institutions, particularly colleges, have focused on building AI governance through professional development initiatives. This approach builds capacity and shared understanding across the institution, creating a community of practice that establishes common principles for AI use. It can often be implemented with modest resources compared to structural changes and encourages bottom-up engagement. The challenge is that professional development initiatives may lack formal authority to enforce policies, are heavily dependent on voluntary participation, and can lead to uneven implementation across institutional units.

Example: Humber College’s Innovative Learning team created an AI Institute as a collaborative professional learning experience, focusing on understanding AI in higher education while developing institutional guidelines. Humber has also convened stakeholders to develop support for responsible AI use grounded in evidence-based practices.

Choosing Your Governance Approach: Key Questions

Rather than prescribing a single “best” model, consider these questions as you evaluate what might work at your institution:

What decision-making traditions does your institution value? Institutions with strong faculty governance cultures may resist top-down AI mandates, while institutions facing urgent external pressures may need more centralized approaches.

Where does AI expertise currently reside? If your institution has strong computer science or educational technology units, research center-led models may make sense. If expertise is distributed, advisory committee approaches may be more appropriate.

What resources can you realistically sustain? Executive positions require one-time or ongoing budget commitments. Advisory committees require staff support and member time. Community-building approaches require facilitation skills and patience.

How urgent are your AI decisions? If your institution faces immediate compliance requirements or significant AI adoption pressure, faster-moving governance models may be necessary. If you have time for deliberative processes, faculty-led or community-building approaches may produce more sustainable results.

What authority do you actually need? Some AI decisions (like tool procurement) require administrative authority. Others (like pedagogical guidelines) may be more effective when emerging from academic governance. Match your governance structure to the decisions you actually need to make.

Common Elements in Successful AI Governance Approaches

Regardless of the model you adopt, we've observed several common elements in successful AI governance approaches. As a teaching and learning leader, you are in a position to both recommend and advance one of these governance models. Identify what you think would work best in your institutional context and advocate for its creation – or create it yourself!

License


AI Playbook for Teaching and Learning Leaders: A Community Guide Copyright © 2025 by Erin Aspenlieder and Sara Fulmer is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.