
High School Teachers Are Going Old-School to Outsmart AI — & Students See It Coming


At a recent conference for education writers, I was chatting with a colleague — also the parent of a teenager — about what AI looks like in real classrooms. Her son’s high school English teacher, she told me, had reached a breaking point. Too many ChatGPT-authored essays. Too many suspicions. Not enough time to investigate them all. So for the final, she’d handed out blue books — those old-school essay booklets we associate with college midterms — and had the students write out their essays by hand, in class.

She’s not the only one. Since ChatGPT debuted more than two years ago, those pale blue exam booklets — relics of college classrooms past — have been staging a quiet comeback. According to a recent article in the Wall Street Journal, demand for blue books has risen significantly at a number of U.S. colleges and universities. But what started on college campuses is quickly trickling down: high school teachers are reaching for them, too, in a bid to keep students accountable when typed work no longer feels trustworthy.

The question now isn’t whether AI is changing education; it’s how far the pendulum will swing in response. Will the rise in AI-assisted shortcuts push teachers to abandon research papers and typed essays entirely? Could we see a widespread return to in-class tests, oral exams, and handwritten work — not just in college lecture halls, but in middle and high school classrooms nationwide? And if so, what might students gain — or lose — in the process?

The extent of the problem

Cheating, of course, isn’t new. For as long as school has existed, students have found ways to game the system — downloading essays, copy-pasting from Wikipedia, using translation tools, or just handing in a friend’s work as their own.

But AI is different: faster, slicker, harder to detect. And often, it produces work that looks just good enough to pass. When a high schooler drops an essay prompt into ChatGPT, they’re technically doing what was asked of them — producing an essay. But in the process, they’re skipping the critical thinking and cognitive struggle the assignment was meant to teach.

Randy Stearns, a high school history teacher at Trevor Day School in New York City, says that when it comes to classifying “cheating,” the line isn’t always easy to draw. “It is against policy to turn in work that’s not your own,” Stearns told SheKnows. But if a student uses AI to “Frankenstein together” an essay — rewording AI-generated text, reorganizing the structure, putting it into their own voice — it gets trickier because, as he points out, “That’s not really that different than using the old Google search.”

Deciding how much help is too much often comes down to gut instinct and time-intensive analysis. “Every single one of those is a case-by-case basis,” Stearns added. “It takes five times as long to grade something as it used to.”

Even for educators experienced in spotting AI-generated writing, confirming it can be a slog. “I check it through three or four different apps,” says college English professor Hannah Greico. “Then I talk with the student. Not an accusation — a discussion. Then we make a plan for them to resubmit, with any help they need.”

Often, that conversation is enough to deter students from using AI again — at least until stress spikes during finals week. And then, she says, the process is repeated.

At what point does AI use cross the line into cheating?

Not all uses of AI are considered academic dishonesty, and that gray area is part of what makes it so hard for teachers to regulate. In a study by the Digital Education Council, 24 percent of students reported using AI to “create a first draft” of their paper, while 69 percent used it to “search for information” — which isn’t really any different from using Google. Forty-two percent said they used it to check grammar, which isn’t much different from spellcheck. Thirty-three percent said they used it to summarize documents, and 28 percent to paraphrase a document — both tasks that instructors probably prefer kids do on their own, but perhaps not so different from working with a tutor.

“In general, AI use crosses the line when kids start cognitively offloading learning tasks they should be doing to develop a skill they need,” says Rebecca Winthrop, director of the Center for Universal Education at The Brookings Institution and co-author of The Disengaged Teen. “Writing an essay is one of the things that they should do on their own.” However, she noted that for some parts of the writing process — such as research and copy editing — AI can assist students without thinking or writing for them.

Are in-class tests the solution?

Students are picking up on the shift in classroom expectations, too. “I do think that AI will bring back a lot more in-person assignments,” said Chloe, 18, a member of the SheKnows Teen Council. “I feel like it’s kind of damaged the way that we do our work. I feel like it’s a huge question nowadays whenever you submit something, like every teacher’s kind of like ‘just so you know, I’m going to scan it through AI’ and ‘I’m going to put it through the generators.’ So, I think that a lot of teachers are going to shift to more in-person essays and quizzes just to make it easier on their end.”

Not every teacher has caught up to the scale of AI use — but students know the shift is coming. “I think that teachers don’t really realize how much you can use AI for yet,” said Clive, 16, also a Teen Council member. “So I get a lot of essays and work to do at home that you can easily use AI on and I think they’re going to realize in the next couple of years that like how much you can use it, and how hard it is to detect, and I think that’s when there will be a shift toward more standardized testing in class or things they can moderate and make sure you’re not cheating on.”

The SheKnows Teen Council’s own survey supports what students are saying: 84 percent of teens reported using AI for homework, compared to just 52 percent for in-class assignments, and only 4 percent for in-person quizzes or exams. In other words, the more a task happens in real time, the less likely students are to use AI. And most teens know what that means for the future: 72 percent said they believe AI will make in-person essays and tests more important.

“In-class participation [and] in-class writing will be graded a lot more heavily and will be a much more main part of the grade and a much more main part of how you do in the class, rather than take-home assessments or take-home essays,” predicted Teen Council member Juliet, 17.

Even in STEM classes, teachers are starting to recalibrate how and where students do their work. “I see AI as more of an issue with homework assignments,” said one high school chemistry and environmental science teacher, who preferred to remain anonymous. “When students are in-class I have more oversight, but with homework, especially digital, it is easy to have AI analyze and complete the entire assignment pretty quickly.”

To spot potential misuse, this teacher has started checking timestamps. “In some cases, this past year, certain students were completing it too well to be done that quickly,” he said.

That experience has shifted his approach. “I do see myself using a little more of a balance between digital and paper,” he added. “With so much of a push to go paperless/digital since COVID, I think it’s a good idea to strike a balance and ensure more original thought and work in the classroom.”

The limitations of the blue book solution

The push toward in-class writing — whether on paper or on locked-down devices — is a direct response to the messiness of detecting AI-assisted work. Teachers want to see what students can do when they’re unplugged, under timed conditions, and working in real time.

That doesn’t always mean pen and paper. The College Board, which administers the SAT and AP exams, now uses an app actually named Bluebook to allow students to test in a secure digital environment. It’s a nod to the old-school essay booklet — but upgraded for the digital age.

Still, low-tech testing doesn’t work for every student. “If the point of an exam is to help kids consolidate knowledge, then we should give kids the time they need to demonstrate that they consolidate that knowledge, and some kids need longer,” says Winthrop. Students with disabilities may receive accommodations like extended time, but just sitting for a test can still trigger intense anxiety.

One possible solution is offering test retakes — though that, once again, adds more work to a teacher’s already overflowing plate.

How teachers are adapting

For Greico, who teaches college composition, simply switching to blue books isn’t enough. Instead, she’s redesigned her entire writing process to make it more resistant to AI shortcuts — and to keep students accountable for every step.

“I do a step-by-step outline, thesis, and drafting process. I require tracked editing on drafts, to observe their process. In the end, the amount of work required to use ChatGPT is more than doing it on their own! And yes, it’s more work for me. But it’s worth it, in my opinion,” she says.

That emphasis on process — rather than just the final product — is becoming a key strategy, even at the middle school level. Sandy Liebson, who taught 6th-grade English in Washington, DC, says real-time feedback during planning and drafting helps discourage AI use. “Students do quite a bit of planning on graphic organizers, taking notes in various ways, and using other structures to gather ideas and plan … This seems to help because it focuses on the kids’ ideas, allows for lots of feedback in real-time, and hopefully makes assignments accessible and generally pretty friendly to jump into,” she says.

Other teachers are exploring assessment styles that are harder for AI to replicate — like oral exams. “Which are good for some students,” said Stearns, “but terrible for other students. So I’m hesitant on that.”

Still, he sees value in variety. “One of my solutions is to combine a mix of things that are AI-proof that are oral, that are in class, things like that,” he continued.

Maura Ridder, principal of a STEM-focused middle school, tells SheKnows that teachers across departments are adapting in real time — especially in subjects where AI use has proven hard to manage. One of her social studies teachers now leans heavily on oral assessments, including both group discussions and individual exams. And her solution for English classes? “Having them write in blue books now, like back to old school.”

But in classes that revolve around tech, the solutions are trickier. Because students were submitting AI-generated code in computer science, Ridder says, “There’s no more, ‘Here’s your project. You work on it, and then you turn it in.’” Now, teachers check in with each student daily, requiring them to explain each piece of code as they go.

Without significantly smaller class sizes, it’s hard to imagine that kind of one-on-one verification happening at scale, especially in already-overburdened high school classrooms.

Grace O’Connor, a special education teacher at West End Secondary School in New York City, says she and her colleagues have also seen a rise in AI use, particularly in English language arts. “We’ve been moving away from having kids type assignments because we are finding that they are using AI,” she says. “So, we’ve been having them hand write more and more. Which they’ve been annoyed about — but for us, it’s more important to teach the basic skills.”

Still, teachers are hesitant to reject AI outright — since for some students, it can be a game-changer. “It can take a student’s research and outline it, which is so helpful for those with executive function needs. It can take an essay draft from a student with dyslexia and address spelling and grammar issues so it reads smoothly and coherently. It can help second language learners understand grammar rules,” Greico says.

“There is no escaping ChatGPT,” she added. “It’s here to stay, and there are tons of ways it can be used as a tool.”

Educators agree: students need to learn how to use AI responsibly. But they also need opportunities to show what they know — without a machine doing the work for them. The challenge now isn’t just AI-proofing the classroom. It’s asking teachers to teach two realities at once: how to use a tool that’s not going anywhere, and how to think without it.