Stranger in the Classroom
For good or ill, artificial intelligence is now a big factor in your kids’ education. How can we make sure it doesn’t cheat them out of true learning?

So I’m asking Lyndon Johnson, 36th U.S. President, about the wisdom of escalating the Vietnam War versus pulling our troops out. He’s giving me some interesting answers, but Richard Panicucci, assistant superintendent of curriculum and instruction at Bergen County Technical Schools (BCTS), seems to think I’m lobbing softballs. Panicucci is facilitating my conversation—and, of course, this LBJ answers to two other initials as well: AI. The “Johnson” this visiting writer is questioning (the real one died in 1973) is a creation of artificial intelligence.
Panicucci’s students in “The Polarization of America,” the freshman elective he teaches, use once-classified transcripts of the late President’s conversations with his advisors to inform the tougher questions they ask—and they get answers. But can you really embody a complex historical figure with a machine? And if so, is there anything in education that a machine can’t do—or shouldn’t?
The question being asked in Bergen’s schools, and schools around the country, is no longer whether AI belongs in the classroom, but what to do with it now that it’s there. It’s already a presence in most school districts, to a greater or lesser extent. In March, Governor Phil Murphy announced the founding of a task force to explore the implications of AI for New Jersey’s schools. The state also awarded 10 grants to fund AI education in K-12 school districts, one of them to the Bergen Pascack Valley Regional High School district. While there’s no official statewide code for the use of AI in education, many districts, including Cresskill and Pascack Valley, are developing their own standards or have them in place.
“You either address AI head-on,” says Peter Hughes, superintendent of the Cresskill public school district, “and say, ‘OK, this is the changing world that we’re in,’ or you hide from it and hope it goes away. That’s not going to happen”—whether educators view AI as a powerful educational tool, a threat to the educational process, or both.
AI’S AT SCHOOL
As at BCTS, students at Pascack Valley Regional High School in Hillsdale are already conversing with AI, though not necessarily in English. The technology has been incorporated into the school’s foreign language instruction (and in the teaching of English as a second language) as “a live practice buddy,” in the words of Noemi Rodriguez, Pascack Valley’s supervisor of world languages and English as a second language. (By “live” she means “in the moment,” as opposed to living and breathing.) Students can have a conversation with an AI chatbot in the classroom, speaking in the language they’re working to master, either orally (while wearing noise-canceling headphones) or by texting. Rodriguez calls this “an easy way for kids to communicate with a really, really smart language practice partner and maybe get some feedback.” Students tend to like it, she says, “because they feel like it’s not a high-stakes assessment situation, and so they can talk freely and make mistakes and practice.” And teachers like it because it offers students more individual practice time than they could reasonably provide.
While not every school in Bergen is incorporating AI in classroom activities, many are already using it to help make their teachers more efficient. AI efficiency tools such as MagicSchool AI, ChatGPT and Curod are helping teachers devise lesson plans, design and grade exams, and create scoring rubrics, leaving them more time to interact with students. Hughes notes, for example, that teachers can use AI “to create a lesson plan and align it with New Jersey Core Curriculum Content standards”—a job many teachers find particularly onerous and time-consuming.
A LEARNING ENHANCER
As AI continues to make inroads into the curriculum, it’s likely to have many uses, the best of which, says Andrew Matteo, superintendent of schools in Ramsey, will be as “a thought partner for students.” He cites a class debate for which students might prepare by devising their arguments and then asking AI what the counterclaims to those arguments could be. Along similar lines, Hughes says that “the best possible scenarios are when you have the product that you’ve created as a student”—say, an English or history paper—“and you ask AI, ‘What is it in this project that I didn’t think about?’ So that it can almost push you to the next level of learning.”
Panicucci says that AI should “increase the amount of thinking that goes on in the classroom.” His students have to think analytically, as a reporter would, to decide which questions would likely yield the most pertinent responses and which would probably be a waste of time. “Teaching history this way,” he says, “really puts the ball in the students’ court.”
Another way AI can benefit students without imperiling their creativity is in the realm of tutoring. It can be “this really smart tutor that’s accessible 24 hours a day,” says Rodriguez. “Yes, if students have questions, they can ask a teacher,” she notes—but teachers aren’t available 24/7.
AI could also make “teaching to the test”— training students to be standardized test takers rather than creative thinkers—a thing of the past. Hughes calls standardized testing “a really antiquated way of assessing children’s learning.” “It’s too limiting,” he says. “AI may actually give you much more insight into the full growth of a child” by offering an individualized assessment and taking into consideration students’ differing ways of learning and thinking.
POTENTIALS FOR ABUSE
Whatever is and isn’t happening in the classroom at the moment, students—particularly those in high school—are already using AI to complete homework assignments, and as long as homework exists, that’s not likely to change. “Students see AI as ubiquitous now,” says Matteo. “It’s the water they swim in.” Sometimes that dip into AI is teacher-sanctioned: Jennifer Clemen, a social studies teacher at Janis E. Dismus Middle School in Englewood, lets her students use AI to research historical figures like Alexander Hamilton and Thomas Jefferson. But then she’ll ask them to reflect on the facts AI furnished them: “I’ll say, ‘Tell me how you would have reacted at the time: Would you have been more of a Jefferson or a Hamilton?’”
Many teachers, says Clemen, lament the loss of older research methods that required significantly more critical thought than asking ChatGPT to summarize Alexander Hamilton’s greatest achievements. “The tools that students have available to them are amazing, and it makes their lives easier,” she says, “but it also concerns educators because we remember doing those deep dives into microfiche and the card catalogues.” The point is more than a nostalgic one. Students who grow up thinking that all information is an easy click away risk being too easily persuaded. They’ll also miss out on the intellectual challenges that traditional research once presented, along with the skills those challenges honed, including patience and persistence, critical evaluation and a deep engagement with the material.
Even more concerning is the use of AI to avoid thinking altogether. The plagiarism-detection app Turnitin determined that in 2023 students turned in some 22 million papers that were partly or mostly written by artificial intelligence. The technology is so ubiquitous that some students don’t understand why they shouldn’t use it. Luisa Gray, a high school English teacher at a Bergen County private school (who asked that we not reveal her real name or the name of the school), said her students have questioned why they need to learn to write essays in the first place, “since in the future we’ll just be using AI to write anything we need to.” Her response to them is that writing, among other things, teaches critical thinking—a skill they’ll likely need no matter how widespread the use of AI is in the future. But that hasn’t stopped them from handing in papers that have obviously been crafted with the help of, or written entirely by, a chatbot.
When school districts sit down to formulate an AI policy, their first concern tends to be AI-driven cheating and how to minimize it. Not surprisingly, technology designed to detect plagiarism by way of AI abounds, in apps like Turnitin, TextGuard, GPTZero and QuillBot. (Though some teachers, like Clemen, say they don’t need to rely on an app to identify the work of a chatbot: “We know our students,” she says, “and I understand what ChatGPT sounds like.”)
In fact, thanks to AI, homework itself could go the way of the slide rule. One way to ensure that students don’t use AI to cheat, says Matteo, is to “have kids do the work—something where you might have sent them home and said, ‘Come back with an essay tomorrow’— right in class.” If that work requires online research, teachers can use software that allows them to monitor students online and in real time, to ensure that those students aren’t asking AI to do the work for them.
If AI merely threatens to curtail original thought, it is already wreaking havoc on grammar proficiency. A recent survey of language arts teachers in Bergen County found that 68 percent believe students’ grasp of grammar has declined over the past two years, likely because of AI and AI-related tools like spell-checkers.
AI can also be a threat to students’ privacy. The AI systems used in education (and elsewhere) typically collect data on their users, leaving them vulnerable to hacks and other misuse. And in the course of amassing data on students, those systems can introduce bias: They may erroneously conclude, for example, that some students have less potential than others and respond to those students with less intellectually challenging material.
“Another challenge for teachers is to try to teach students how to manage digital literacy,” notes Clemen. Those lessons would likely include obvious issues like plagiarism, but also how to spot so-called AI hallucinations—false information being presented as if it were factual—as well as privacy and other issues we may not even be able to foresee in these early days. “We can’t even predict what’s going to be happening in the next five to 10 years,” Clemen says. Whether AI will open up vast fields of educational opportunity or prove an educational minefield (or both) remains to be seen. The only certainty seems to be that, in schools across Bergen County, it’s here to stay.
And when anything this consequential becomes a permanent feature of the landscape, it means a historic change for us all. Just ask any deceased President.