On rolling out agentic AI: I can share the success formula we’re following, which I think is relevant pretty much anywhere. It’s not easy, but it’s important, and it starts with commitment. Sometimes you have a leader at the top, and I’ll use a higher ed example: Michael Crow, president of Arizona State University, says let’s make this happen, people get in line, and it gets done. But that’s not very common. More often it has to grow from a line-of-business leader in corporate America, or from within higher ed, and you need to make sure there’s buy-in at that level. Then it has to be linked to a cause. With agentic AI, because you’re using it to make what you’re doing more productive, you’ll find the right way to do it as the cause evolves over time. But you need to really think about your business. This should be an evolution of what you do. Don’t just do the things you do today faster or with more information; you should be doing something different or broader with it, because it gives you that opportunity. That’s where the execution phase comes in.
On AI governance: It’s critical, of course. When I see AI governance placed under the data management office, I shudder, because it’s really about the human experience. I’m not saying to put it under HR either, but it needs to report up at a level where you understand the impacts on the human side of things. And humans can be your customers, employees, stakeholders, or partners. It’s not about saving jobs; it’s about the human element and doing it right. All of those pieces fall into that, so you want the governance to reflect it. That’s the model we’re following, and we’re seeing success with it.
On attitudes to AI in higher ed: I like the maturity model that’s in place, and with a few tweaks it could fit well in higher education. Another component is needed around culture, though, because the people piece is mostly about workforce development, which is critical. In my experience, cultures in corporate America change more flexibly, and in a more positive way, than in higher ed, where there’s an attitude of preserving how things have always been taught and how people want to do things. When I started teaching my class, I asked how many students were using some form of gen AI or a copilot provided on campus, and I think two people raised their hands. But people were using DeepSeek underground, because they’ve got other professors saying you’re not allowed to use it. I say please use it. You need to learn to teach differently. That’s what instructors need to think about, and not just in higher ed. Anybody who’s growing, training, and leading staff needs to think about it that way. Use it as a gift and not as a barrier.




