Upper School (US) students caught illicitly using AI now face an extended stay in the Quiet Room, or, as the Disciplinary Committee (DC) calls it, an in-school suspension. These students are not allowed to attend class unless their teachers specifically require their presence, and they complete their classwork asynchronously.
As generative AI becomes more widely available, US faculty are deciding how to balance transparency with protecting student privacy. Recently, students have received in-school suspensions for AI use.
A sophomore, who requested anonymity to protect his privacy, received an in-school suspension after his English teacher noticed copied-and-pasted text in his document.
“It was just for a mini deadline, which was like two body paragraphs,” he said. “At the time, I had one-and-a-half body paragraphs, and I just wanted to get up to that two, so I used AI to fill the blank, basically, and I was planning on deleting it later.”
The student served a three-day in-school suspension.
“Honestly, I think for a punishment, it’s not really a punishment,” he said. “You essentially have a study hall for three days straight, where you’re able to meet with teachers as well, and if teachers ask for you to be present in a class, you’re required to go to class again.”
Though the anonymous student understands that unauthorized AI use is a punishable offense, he is unsure whether in-school suspensions are the correct solution, he said.
“I think academic probation is more than enough, where it gives you a warning, but it makes it so that your next offense is punished much more severely. An in-school suspension just puts you behind on your subjects for the most part and just bores you out.”
Instead, more transparency on AI use would be beneficial, he said.
“The DC just says you’ll be punished and stuff. They don’t really specify what the punishment is. And I think if they do specify that previously, it could probably help reduce the number of cases that they get on AI.”
DC member Caroline Dudzinski ’26 has seen a plethora of AI cases brought to the committee.
“I haven’t really seen anything else this year, which is disheartening but understandable because AI is so prevalent nowadays.”
Infractions relating to AI academic dishonesty fall into a gray area, Caroline said.
“The message is ‘Don’t use it,’ but especially when writing papers, the rules can get confusing. I think that it would help if the school made it very clear, like posted in a public place: ‘These are the rules. Don’t do it.’”
The DC treats AI academic dishonesty as plagiarism, DC Head Alda Farlow said.
“I’m seeing the same amount of academic dishonesty, just with a different tool. Plagiarism leads to disciplinary committee hearings. The issue isn’t using AI: It’s plagiarizing.”
Drawing from her own high school experience, Ms. Farlow said she would support sharing more information about the consequences of AI misuse.
“While students’ privacy wasn’t fully protected, the rest of us learned from their mistakes. A transparent system holds the community accountable for students.”
Although there is currently no written policy on AI usage, the school is working on a general policy about assistance, US Director Jessica Keimowitz said.
“If we only talk about AI, it’s actually too narrow. We need to think about, ‘How do we help students create their own work, where is it okay to ask for help, where is it not okay to ask for help? And then how does AI fit into that?’”
US English Teacher Dave Scrivner isn’t focused on “catching cheaters,” he said. “We just want students to understand the opportunities they lose and miss out on.”
In recent years, the English Department has developed the “One Doc policy” (see Vol. 53, Issue 4: “Students discuss AI amid changing policies”), which encourages students to submit their own work, Dr. Scrivner said.
“The main thing that the English Department is trying to do now is convey to students how important moments of challenge are in their development. No challenge, no growth. And we don’t want to graduate students from this school who missed really important steps because as the work gets harder, those foundational steps become more important and their absence more glaring.”