AI is the environment.
Structure is the advantage.

How Ivey faculty are redesigning case-based business education for an AI-enabled world.

At first glance, it looks like a typical Ivey class: HBA students working through a marketing case. But this time, they are not doing it the traditional way.

Each exercise allows only three prompts. There are time limits – the clock is ticking.

Created by Guneet Kaur Nagpal, assistant professor of marketing at Ivey and one of Western University’s Generative AI Teaching Fellows, the exercise – called plAIbook – serves two purposes: exposing students to the power of AI while also teaching them to question it.

Kaur Nagpal wants her students to see firsthand that the technology can produce a polished marketing plan in seconds. It can sound confident – but that confidence doesn’t necessarily mean the strategy is sound.

“One aspect of AI is that it sounds very smart – very confident. It may use positive words – confident words – and students may not be able to critically question the assertiveness,” she said. “By giving them the verification alerts and prompt guidance – the structure – we are allowing the students to question what is being said.”

Kaur Nagpal summarizes the approach this way:

"AI may increasingly become the environment students work in, but structure remains the advantage educators provide."

What is happening in Kaur Nagpal’s classroom reflects a broader shift underway at Ivey.

Across the School, faculty members are no longer debating whether students will use AI – they know students already are. Instead, the focus has shifted to how the technology can be integrated into business education in ways that deepen learning rather than undermine it.

From rethinking case discussions to building AI-powered simulations and AI tutoring platforms, faculty are experimenting with ways to prepare students for a world where AI will shape many of the decisions they make as leaders.

Many of these tools are still being tested in the classroom, but they are already beginning to reshape how Ivey students learn.

In a Canadian first for business education, Ivey launched Cloudforce’s nebulaONE®, providing faculty, staff, and students with secure, community-wide access to a leading enterprise generative artificial intelligence platform.

Avoiding the AI shortcut

For Mazi Raz, MBA ’05, PhD ’14, an assistant professor of strategy, the shift became noticeable when students began arriving at class with AI-generated summaries of the case. On the surface, they appeared prepared, speaking confidently early in discussions. But when pressed to defend their reasoning, the limits of that preparation became clear.

For a school built around the case method – where students are expected to wrestle with ambiguous problems before class – the implications are significant.

In ongoing peer-reviewed research, Raz and Ivey Dean Julian Birkinshaw describe the phenomenon as “algorithmic short circuiting” – when generative AI bypasses the productive struggle required for deep learning. That struggle, Raz said, is essential for building both conceptual understanding and practical skills.

“We're not just talking about knowing something, but also having a skill of doing something,” he said. “There’s a risk that AI bypasses the whole reading process. You get a very decent answer, but you may not have built the skill.”

Raz describes the learning process in three stages: engaging with the material independently, refining ideas in a social classroom setting, and reflecting afterward – a progression that echoes well-established theories of experiential learning.

That final reflection stage becomes even more important in an AI-enabled classroom, where students must critically evaluate not only their own thinking, but also the outputs generated by the tools they use. Raz said this places greater responsibility on faculty to be deliberate about how they design the classroom experience.

Raz said case teaching at Ivey is – and should remain – fundamentally social.

"Students learn by debating ideas, challenging each other, and defending their reasoning in the classroom."

“That interaction is a critical part of the learning process,” he said.

Raz also uses his classes to demystify AI itself – discussing how large language models are trained, the labour involved in producing them, and their environmental impact.

But he is not opposed to AI in the classroom.

Sometimes, he introduces it as what he calls the “79th student” in a class of 78. He displays an AI-generated response to a case question and asks the class to critique it.

Where are the assumptions?
What context might be missing?
When would this reasoning fail?

The goal, Raz said, is not to prevent students from using AI, but to teach them how to question it.

Rebuilding the case for an AI world

While Raz is rethinking the philosophy of case teaching, other faculty members are experimenting with how the structure of cases themselves might evolve in an AI-enabled classroom.

Kyle Maclean, HBA ’12, PhD ’17, and Tiffany Bayley, assistant professors of management science, are exploring how generative AI can reshape the way students use information in a case.

In a recent pilot in their HBA1 Decision Making with Analytics course, they developed a large language model-based case simulation that changes how students interact with information.

Traditionally, students receive a case with a full set of exhibits and financial information. The analysis begins with everything on the table.

In this version, students start with only a small amount of information and must decide what additional data they need and request it. They then interact with a simulated AI analyst that responds to their questions and provides data on request.

Instead of simply analyzing information, the students must first decide what information is worth analyzing.

“Their whole role was to figure out what data they needed to address the question,” said Maclean.

Students can request additional information, question the simulated analyst, and explore different scenarios. Some of the information proves useful; some does not.

Some lines of inquiry lead nowhere. That, too, is part of the learning: the simulation is designed to expose students to these false starts and help them understand why certain questions are more effective than others.

“This mimics what happens in real life,” said Bayley.

"We rarely have perfect information. Managers need to decide what data they need and what questions to ask."

The exercise is designed to make students more deliberate about their questions – a skill that becomes more important as AI makes information easier to generate.

It also helps students build what Maclean described as the “muscle” of asking better questions – developing precision and judgment in how they frame problems.

Maclean said part of the goal in his classroom is to demystify AI.

"When people see what AI can do, there’s a bit of a sense of magic around it," he said.

Maclean has also experimented with embedding an AI-generated daily news tool into Canvas, Ivey’s online learning platform for course materials and assignments. Each day, the tool scans the news for articles connected to what students are learning, helping them see how ideas from the classroom apply to real-world decisions.

The goal is not to replace the teaching, but to evolve it.

Maclean and Bayley are continuing to refine the simulation, exploring how future iterations could further challenge students to think critically about both the questions they ask and the information they receive.

A digital Aristotle

While some faculty are redesigning classroom structure, Joshua Foster, an assistant professor of business, economics, and public policy, is extending learning beyond it.

Foster developed Sidekick, an AI tutoring platform that helps students learn by asking questions and giving feedback in real time. He has used the platform in both HBA and MBA elective courses, testing how it performs across different student groups and learning contexts.

His inspiration comes from research by educational psychologist Benjamin Bloom showing that one-on-one tutoring can dramatically improve performance.

"I think AI is the way that we give every student Aristotle," he said, referring to the philosopher who famously tutored Alexander the Great.

Unlike open-ended tools like ChatGPT, Sidekick is faculty-controlled and limited to faculty-provided materials. Students cannot simply request answers. Instead, the system prompts them to explain their reasoning before moving forward.

In its first rollout, 80 per cent of MBA students opted in. The platform delivered 250 tutoring sessions, with students spending an average of 20 minutes per session. Nearly half the sessions took place between 6 p.m. and 2 a.m., indicating students were using the tool when traditional support was unavailable.

The initial analysis focused on MBA students, where the tool was used to support exam preparation.

Early analysis suggests a relationship between Sidekick use and exam performance, although Foster cautions that the results are not from a controlled experiment. Among students who used the platform, each tutoring session was associated with an average increase of 0.55 points on the final exam. Every 100 minutes of use corresponded with an average increase of 2.77 points.

Foster warns the results do not prove cause and effect, since participation was voluntary, but even with those caveats, the early results are encouraging. He plans to conduct follow-up focus groups with students and explore how the platform could be adapted as a broader resource for faculty.

More broadly, Foster said AI will reshape education, but not eliminate the need for judgment.

He also sees tools like Sidekick as a way to help prepare students for a changing job market, where the ability to think critically alongside AI will be essential.

"Students are going to have to be good discriminators on what the model should be doing and what it should not be doing," said Foster.

Teaching AI fluency at scale

If Sidekick supports learning between classes, Kaur Nagpal’s GenAI simulation transforms learning within them.

This term, 800 HBA students are using the simulation.

Students work through 11 structured exercises aligned with a marketing framework, moving from company, category, competitor, and customer analysis to a final recommendation. They cannot skip ahead.

Five built-in nudges reinforce critical thinking:

  • Structured learning paths;
  • Prompt scaffolding with limited attempts;
  • Prompt support modules;
  • Time constraints; and
  • Verification alerts encouraging students to double-check their work.

Rather than allowing unlimited interaction with AI, the system introduces constraints that encourage students to think more carefully about how they use it.

Learning to work with AI

For HBA student Lily Abboud, the exercise revealed something she had not previously considered when working with generative AI.

“Working with AI helped us to go through a trial-and-error process and fix our work as we went through it,” she said. “I realized how much changing your prompt can change the result you get.”

Kaur Nagpal developed the simulation in response to what researchers call “metacognitive laziness” – the erosion of critical thinking when mental effort is outsourced to AI. For experienced faculty, AI’s confidence is easier to question. For students encountering these tools for the first time, it may not be.

“As instructors, we have years of experience questioning ideas,” Kaur Nagpal said. “Students may not yet have that instinct. So the structure of the simulation helps build that habit.”

The goal is not to remove AI from the process, she said, but to teach students how to work with it while maintaining ownership of their decisions.

For Cameron Veisman, another HBA student in the class, the structure forced a more deliberate approach to using AI.

“Most students probably use AI for case prep,” he said. “But this forced you to think twice about how you’re using it and revise your prompts instead of just taking whatever response you get.”

The rollout also includes a research component. Different sections experience variations of the simulation to examine how its structure influences learning and student thinking.

The goal is to better understand how students interact with AI tools – and how design choices can encourage deeper thinking instead of shortcuts.

Ivey Assistant Professor Guneet Kaur Nagpal works with HBA student Lily Abboud during a generative AI simulation designed to help students question – not just use – artificial intelligence.

Students are increasingly using generative AI tools to support their learning – raising new questions about how to develop critical thinking and judgment in the classroom.

Reimagining experiential learning

Classroom experimentation is complemented by co-curricular initiatives such as the recent MBA AI Workshop and Hackathon, led by Associate Professor Fredrik Odegaard, where students experimented with AI tools and explored their ethical and strategic implications.

Together, these initiatives reflect Ivey’s broader commitment to integrating AI across teaching, research, and operations. The efforts align with Ivey’s Bold Ambition strategy, particularly its pillar to reimagine experiential business learning for the world.

When Ivey refreshed its vision, the role AI would play was still emerging. That connection is now becoming clearer.

“It’s clear to me now that AI is a huge impetus for reimagining experiential business learning for the world,” said Dean Julian Birkinshaw at a recent school-wide AI launch.

For Birkinshaw, the challenge is not simply adopting new tools, but ensuring Ivey helps lead the evolution of case-based learning in an AI-enabled world. He said it is vital that Ivey rethinks how case-based learning works – not just for the School, but for business education more broadly.

That leadership extends beyond students. It includes faculty research on how AI shapes markets and organizations, operational experimentation across the School, and partnerships with AI experts through the AI Fellows initiative.

Overall, Birkinshaw said Ivey aims to become an AI-enabled organization that models the leadership it seeks to develop.

Why judgment still matters

Students graduating today will enter workplaces where AI drafts reports, analyzes data, and proposes strategies.

Their advantage will lie in understanding when to trust the technology – and when to challenge it.

Raz said that responsibility extends beyond the classroom. Business schools, he said, also have a role to play in helping organizations learn how to use AI thoughtfully and responsibly.

“The presence of AI in education gives Ivey another level of responsibility,” he said.

"How do we partner with organizations to help them integrate AI in a meaningful way? It makes Ivey an advocate for using AI in ways that support growth and development – not replace human judgment."