
Cornell Professor Drops Written Exams Because of AI

April 9, 2026 · 3 min read · 622 words
AI in Education · Generative AI · AI Ethics · AI
NBC News interview with Cornell biomedical engineering professor Chris Schaffer discussing oral exams
Image: Screenshot from YouTube.

Key insights

  • AI didn't kill the problem set. It changed what the exam actually tests. The shift is from evaluating answers to evaluating understanding.
  • Six oral sessions per semester means more structured one-on-one contact between students and staff than most courses offer in total.
  • Students who were nervous going in ended up preferring the oral format by the end of the semester. Being genuinely evaluated feels better than submitting work no one can verify.
Published April 8, 2026
NBC News Daily
Hosts: Morgan Radford, Vicky Nguyen
Guest: Chris Schaffer, Cornell University

This is an AI-generated summary. The source video may include demos, visuals and additional context.


In Brief

A Cornell University engineering professor has switched from written homework to oral exams — not because oral exams are trendy, but because AI can now solve his problem sets nearly perfectly. Chris Schaffer, who teaches at Cornell's Meinig School of Biomedical Engineering, made the call in August 2025 after running his assignments through generative AI and watching it ace them. The solution: students still do the work at home, but the grade now depends on whether they can explain it out loud to a human.

The moment everything changed

Schaffer has been teaching engineering courses in the traditional way that many science and math classes run: students receive a problem set (a collection of challenging homework problems they work on over several days) and submit their solutions. The submitted work is then graded, and that grade counts.

Since generative AI (the kind of AI that can write, calculate, and reason, like ChatGPT) became widely available, Schaffer had been quietly testing it each year against his own assignments. In August 2025 the results shifted decisively: generative AI produced near-perfect solutions. Not merely good enough to escape suspicion, but genuinely near-perfect.

At that point, Schaffer realized written homework no longer showed what students actually knew. If a student submits a correct solution, there is simply no way to tell whether the student understood the problem or just handed it to an AI. He needed a format in which AI could not answer for the student.

How oral exams work in practice

Schaffer found a middle ground. Students still receive and submit written problem sets, and that part of the course is unchanged. But the written submission is no longer graded. Instead, after submitting, each student schedules a short session with a member of the teaching staff.

In that session, the student is asked to explain choices they made in their own solution, answer conceptual questions about the topics covered, or solve a similar problem right there on the spot. The exam itself runs 20 to 30 minutes, and the grade is based entirely on how the student performs in that conversation.

The logic is straightforward: you can paste a problem into ChatGPT, but you cannot paste the follow-up question from a professor who is sitting across from you and already looking at your answer.

A side effect nobody expected

When Schaffer looked at final exam results after the switch, student performance was essentially the same as the previous year. The new format didn't hurt learning. But it did produce something extra.

With six problem sets spread across the semester, the course now guarantees six one-on-one conversations between each student and a member of the teaching staff — each one 20 to 30 minutes long, focused entirely on that student's understanding. In most university courses, a student can go the entire term without a single structured conversation with an instructor about how they are actually doing.

Oral exams didn't just keep the AI out. They accidentally created a bonus nobody planned for: regular, personal check-ins with every student.

Students came around

Schaffer expected resistance. Oral exams sound scary, especially for engineering students used to working alone. He expected nerves, and he got them — for the first session. But by the end of the semester, both students and teaching staff preferred the new format over the old one.

That's worth thinking about. Oral exams are harder to fake. You have to actually understand the material, talk about it, and think under a bit of pressure. Yet students preferred it. Maybe the old format felt hollow in ways they noticed but never said out loud.

Glossary

Problem set: A collection of homework problems that students work on at home over several days, then submit. Common in science and engineering courses.

Generative AI: AI that can produce new content (text, code, calculations, solutions) in response to a prompt. Examples include ChatGPT and similar tools.

Oral exam: An exam where the student answers questions by speaking face-to-face with a teacher or examiner, rather than writing on paper.

