These schools don’t fear artificial intelligence. They put it to work
A year and a half ago, in the wake of ChatGPT’s enormously popular release, instructors at every educational level rushed to understand how the coming onslaught of artificial intelligence might transform their field, even disrupt it. “Folks in higher education were absolutely scrambling,” says Andrew Maynard, professor in the School for the Future of Innovation in Society at Arizona State University, in Phoenix (No. 40 on Ignition Schools ’24). “It was almost like organized panic.”
But before long, at innovation-focused programs across the academic spectrum, panic gave way to experimentation and even optimism. With no playbook to follow, forward-thinking educators from across the country improvised—not just by replacing assignments that ChatGPT could solve with ones it could not but also by updating their curricula to teach practical uses for the technology in the classroom.
Meet Your New Teaching Assistant
It quickly dawned on ASU’s Maynard that we were entering a new era of work life in which workers in nearly every industry must possess radically new skills to survive—including effectively deploying AI and adapting alongside the technology. So he took to—where else?—ChatGPT and in a matter of hours developed the syllabus for an online course on the art and science of instructing large language models, known as prompt engineering.
At first, students were ChatGPT skeptics. “They were saying, ‘This technology doesn’t work, this technology hallucinates, it’s biased, I don’t see what I can use it for,’” he recalls. “I was thinking, What have I done?” But by the end of the course, students “went away having a clear idea of how to apply this tool within their specific setting, no matter where they were coming from, and with just a few exceptions.” Maynard deems his (and ChatGPT’s) course a success; he also saw a growth opportunity. “We are very actively looking at how you can scale and accelerate and expand learning environments using generative AI,” he says with the animated ring of a founder.
AI Leaves the Classroom
Researchers at Georgia Institute of Technology (No. 27 on Ignition Schools ’24) pioneered the development of automated tutoring in 2016 with the release of Jill Watson, an AI-powered teaching assistant trained on data from trusted, professor-approved sources such as lecture notes, slides, and textbooks. The original model, which ran on IBM’s Watson, was impressive enough, but the latest, ChatGPT-powered version marks a huge leap forward. According to Ashok Goel, the computer science professor who directs Georgia Tech’s Design Intelligence Laboratory and invented Jill Watson, it now “can answer questions based on any document you want.”
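The core idea behind an assistant like this, answering from supplied course materials rather than from a model’s open-ended knowledge, can be sketched in a few lines. To be clear, this is an illustrative toy, not Georgia Tech’s actual pipeline: the passages and scoring below are hypothetical, and a real system would use embeddings and a language model where this sketch uses plain word overlap.

```python
def retrieve_passage(question: str, passages: list[str]) -> str:
    """Return the course passage sharing the most words with the question.

    A grounded teaching assistant would answer from this retrieved
    passage, keeping its responses tied to professor-approved material.
    """
    q_words = set(question.lower().split())
    # Score each passage by how many question words it contains.
    return max(passages, key=lambda p: len(q_words & set(p.lower().split())))


# Hypothetical course notes standing in for lecture slides or a syllabus.
notes = [
    "Office hours are held on Tuesdays at 3 p.m. in the lab.",
    "The midterm covers search, logic, and probabilistic reasoning.",
]

best = retrieve_passage("When are office hours held?", notes)
```

Production systems replace the word-overlap score with semantic similarity, but the grounding principle is the same: retrieve first, then answer only from what was retrieved.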
Goel and his colleagues are also experimenting with AI as a community-building tool in a project dubbed SAMI, which parses students’ biographical information to pair classmates who might get along. Learning, after all, is a social process as much as it is an educational one. “We’ve found empirical evidence that SAMI helps improve learners’ sense of social belonging,” Goel says, “and that is critical for emotional and cognitive health.”
Rubber Duck Debugging
Engineered with the right prompts, an AI chatbot can be tailored to serve as a coach or tutor, disinclined simply to complete students’ homework for them.
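In practice, that tailoring often comes down to a system prompt that constrains the model before any student question arrives. The sketch below is a minimal, hypothetical example of the idea; the prompt wording and function names are this article’s illustration, not Harvard’s actual implementation.

```python
# Hypothetical system prompt: instructs the model to coach, not solve.
TUTOR_SYSTEM_PROMPT = (
    "You are a friendly programming tutor. Guide the student toward the "
    "answer with questions and hints, one step at a time. Never write "
    "complete solutions or finished homework code for them."
)


def build_tutor_messages(student_question: str) -> list[dict]:
    """Assemble a chat request whose system message constrains the model
    to act as a tutor for the student's question."""
    return [
        {"role": "system", "content": TUTOR_SYSTEM_PROMPT},
        {"role": "user", "content": student_question},
    ]


messages = build_tutor_messages("Why does my loop never terminate?")
```

The resulting message list would be sent to whatever chat-completion API the course uses; the pedagogy lives almost entirely in that first system message.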
Last year, David Malan, who teaches Intro to Computer Science to undergraduates at Harvard University (No. 4 on Ignition Schools ’24), began experimenting with an AI chatbot that offers coding advice and help with the underlying theory. He dubbed it the duck debugger, or ddb for short—a reference to rubber duck debugging, a method of fixing logic errors by trying to explain them in detail to an actual rubber ducky (or other inanimate object). Students are limited in how frequently they can query ddb, both to prevent a dependency and to keep Harvard’s cloud computing costs in check.
While it’s often assumed that online courses are the most promising avenue for AI’s use in educational settings—that students with limited or no in-person instruction will benefit most from such on-demand feedback and guidance—Malan says that even on-campus students with access to office hours appreciate “the duck,” as it’s affectionately become known. An important factor is the 24/7 access: one-on-one chats at any hour to work through hard problems, a level of availability that not even the most dedicated human instructor can match. Chatbots, after all, don’t require office hours.
This story is part of Fast Company and Inc.’s Ignition Schools 2024 awards, the 50 colleges and universities making an outsize impact on business and society through entrepreneurship and innovation. Read about the methodology behind our selection process.