Boards are under pressure to keep up with artificial intelligence (AI) without losing control of data or oversight. The good news: Most already use digital tools and are curious about AI’s potential to make board work easier.
The AGB OnBoard 2025 Survey of Governance Professionals shows that boards can explore AI safely by starting small, using trusted platforms that simplify tasks such as summarizing minutes, preparing briefs, and searching archives.
The Big Picture: Curiosity Outpaces Structure
The survey results show clear momentum toward AI in governance, but also clear caution. Eighty-six percent of institutions use AI in some capacity, yet only 5 percent of boards have active AI pilots. Another 36 percent aren’t discussing AI at all.
As Jason Ferguson, AGB OnBoard account executive, explained, “AI is already reshaping higher education. Boards are expected to lead responsibly—but guidance is limited.” The data throw that tension into sharp relief: Trustees are intrigued by AI’s potential, but guardrails are thin.
Claire Norris, senior vice president for advancement, research, and strategy at the University of Louisiana System, summed up this moment neatly. “Most boards are just in the learning phase,” she said. “Healthy curiosity with some caution is right where we need to be right now.”
Who Responded to the Survey?
The findings reflect input from more than 120 governance professionals at colleges and universities of all sizes, from small liberal arts colleges to large public systems.
Seventy-eight percent of respondents support governing or trustee boards, 19 percent work with foundation boards, and nearly all serve as board professionals managing meeting prep, compliance, and trustee communication.
These are the people who keep the wheels turning between meetings, and they’re seeing the pressure to adapt to AI firsthand.
The Pre-AI Baseline: Digital Readiness
Before we can talk about AI, we must talk about comfort with technology in general. The survey found that staff show high digital fluency, while trustees’ comfort with new technology trails behind. That gap slows adoption of any new tool, AI included.
Norris was candid about the divide. “Our staff lives in digital tools,” she said. “Our board? Not so much.” Her team of 20 supports a system that serves 84,000 students, along with its 16 board members. Efficiency isn’t optional, but new technology only works when trustees are confident using it. That’s why Norris frames AI as a change management process, not just a tech rollout.
The State of AI in the Boardroom
While most institutions use AI somewhere on campus, that activity often happens outside the board’s direct oversight. Roughly one-third of boards are starting to talk about AI strategically. Another one-fourth are exploring basic administrative uses, and a small number are piloting tools.
For those not yet engaged, it’s not because they oppose innovation. It’s because they want to proceed responsibly. As Norris warned, “Curiosity without guardrails introduces an incredible amount of risk.” Her concern isn’t about AI itself but about timing. “We have to balance carefulness with the impatience of early adopters,” she said. “If we delay too long, we risk introducing AI through the back door, without policy.”
What’s Holding Boards Back
The biggest barriers are consistent:
- Security and data privacy
- Trustee understanding and comfort
- Compliance and regulation
Security ranked as the top concern for half of all respondents, far outpacing fears about job loss or complexity. Yet, about 69 percent of trustees said they already use AI informally—for note-taking, summarizing, or drafting. The gap between private experimentation and official policy is growing fast.
That’s why Norris advocates action rooted in governance. “We don’t want to create policy in a vacuum,” she said. “First, understand how people are already using AI, then layer the right guardrails on top.”
Where Boards See Real Value
The most common AI use cases in governance are pragmatic, not flashy:
- Summarizing meeting minutes
- Preparing pre-read briefs
- Searching policy archives
- Tracking board action items
Each task centers on reducing administrative friction so boards can focus on decision-making, not paperwork.
Norris connected the dots to her own experience. “Our board packets can be up to 800 pages,” she said. “Having AI highlight key points is critical. It doesn’t replace judgment. It helps us make better judgments.”
This focus on “assistive” AI aligns with the platform’s evolution. Ferguson noted that these same survey insights “led directly to the AGB OnBoard AI Suite—Book AI, Agenda AI, and Minutes AI—tools built around the workflows boards said mattered most.”
Four Building Blocks for Responsible AI Readiness
The research surfaced a four-step framework for boards to explore AI safely and confidently:
- Establish a governance-level AI policy. Start with principles: transparency, accountability, and scope.
- Collaborate early with IT and legal teams. Get security and compliance right from the start.
- Use trusted, compliant platforms. Look for SOC 2–certified, encrypted systems that already protect board data.
- Pilot low-risk, high-friction workflows. Begin with minutes, searches, or summaries, not sensitive decisions or personnel matters.
Norris emphasized starting with policy but pairing it with trustee education. “Policy highlights oversight, but education and security are equally critical,” she said. “One without the other leads to hesitation.”
Learning from Peers, Not Hype
When boards seek AI guidance, they turn to AGB, peer institutions, and internal legal and IT leaders. Norris emphasized that peer learning is essential. “We rely heavily on AGB for policy direction,” she said. “Seeing how other systems are approaching readiness helps us realize we’re not alone. We just need a starting framework.”
That collective learning culture runs deep in higher education, and it’s fueling a responsible, steady approach to AI adoption.
Three Elements of Ongoing AI Readiness
The AGB OnBoard team distilled the findings into three continuous priorities for board leaders:
- Secure what matters—Protect privacy, compliance, and ethics.
- Evaluate responsibly—Identify where AI can add true value.
- Educate continuously—Build trustee literacy and a shared understanding as technology evolves.
Norris urged boards to treat these priorities as a cycle, not a checklist: “Once we think we have it figured out, we need to start over again,” she said. “Security, evaluation, education—that cycle never stops.”
Next Steps: Turning Curiosity into Confidence
The 2025 survey shows boards are curious, cautious, and ready to learn. The path forward isn’t about rushing to adopt the newest AI. It’s about building literacy, structure, and confidence step by step.
Boards can start today by:
- Adding an AI discussion to the next board development session.
- Reviewing current data governance and privacy policies.
- Creating an AI primer for trustees.
- Piloting AI for low-risk administrative tasks.
- Setting up feedback loops to refine policies as experience grows.
As Norris put it, “The genie’s out of the bottle. People are using it. The time to start is now.”
Curiosity got higher ed boards into this conversation. Confidence, structure, and shared learning will move them forward.