Walk into any classroom today and the future glows brightly on someone’s screen. Artificial intelligence (AI), once confined to science fiction, now writes essays, solves calculus problems, and even composes music. For schools, that presents both a nightmare and an opportunity, and it raises a single question: do you ban it, or do you teach students how to use it?
At SCH, the answer to that question is rather bold. This year, the school launched a one-of-a-kind class aimed at teaching students not just how to prompt AIs like ChatGPT, but also how to build their own models. The move reflects both a recognition that AI is reshaping many professions and a gamble that teaching high schoolers to create algorithms is the best way to prepare them for a future no one fully understands.
“We don’t want students to just use tools passively,” said Kirker, the SCH teacher shaping this new course. “We want them to look under the hood. That way they understand what’s driving these systems, not just what comes out on the screen.” In most schools, AI means typing into ChatGPT or Gemini to get work done… a teacher’s greatest fear. SCH’s Center for Entrepreneurial Leadership (CEL) team, however, treats AI less like a scary cheating weapon and more like a powerful tool that demands understanding. Students don’t just prompt an assistant; they build image classifiers in Teachable Machine, dissect the Python code behind them, and even train sentiment analysis models with real-world customer review data. Later in the course, they debate AI’s ethical impact in ESPN-style “Around the Horn” showdowns and present capstone projects tackling industries like healthcare and finance. “This isn’t a how-to-cheat class,” Kirker emphasized, “this is about why this technology is powerful and how we can get its best use for any application.”
Zoom out from SCH, and the larger story becomes clear: no one fully knows how to integrate AI into education yet. The New York Times recently reported on Miami-Dade County Public Schools, one of America’s largest school districts, where administrators are experimenting with AI chatbots designed to help students with lessons and answer questions about coursework. Miami has rolled out AI training for teachers and is piloting programs with over 100,000 students.
The contrast with SCH is sharp. Miami’s approach is massive, centralized, and focused on scale: using AI to support as many students as possible. SCH’s approach is smaller, experimental, and focused on creativity: breaking down AI tools to ensure students grasp how and why they work. Neither is clearly right or wrong; together, the two approaches reveal just how unsettled the role of AI in education still is.
AI is already changing classrooms dramatically. Students can write essays in half the time, solve math problems beyond the reach of ordinary calculators, and pull information from across the internet with a single prompt. It’s wild. But with the power of AI in anyone’s hands, people are beginning to ask deeper questions. Teachers worry about plagiarism and question the ethics of AI, while students wonder: what’s the point of learning something that AI can generate in seconds?
It may seem enticing that AI can write a “perfect” essay in less than a minute, but teachers emphasize that students who take that shortcut sacrifice their creativity for convenience. Starting last year, Gelhorn, an honors English teacher and faculty advisor for The Campus Lantern, began adapting her classroom to AI. She said, “What prompted me to make a change in my approach to student writing was when I had students type an essay, and a lot of them came back eerily similar to one another, the quotes that students were using, the claims that they were making, the way they were incorporating their quotations. And when so many students have the same idea, it’s hard to trust.” She went on to emphasize that “even with sophisticated Chrome extensions that actually allow me to replay the student typing it just became too unclear. And so that was the turning point for me, where I just decided I didn’t really want to be like a police officer to my students and spend my time trying to figure out whether or not they were using AI. And so I decided to just have them write by hand, so that they got all their ideas out there, and so the work was done in their original voice.”
McDowell, the history department chair, sees her job as drawing a line between using AI as a tool and using it as a crutch. Like many other teachers, she believes that “if you just feed reading into AI and get the answers, you’re not improving your own comprehension. You’re not actually thinking.” That is why the History Department, along with the English Department, doesn’t allow AI usage during graded writing assignments like essays. However, some teachers do allow students to use AI to generate ideas when appropriate. As McDowell put it, “If a student has reached the end of their own ideas, they can ask AI a question to help push their thinking further.” She compared the use of AI to training: “Nobody can run a marathon without practice, and no student can build intellectual strength if they rely on AI for everything.”
Kirker echoes this uncertainty and the fear that AI is taking fundamental skills from students, so his course stresses that AIs function as pattern finders, designed to generate content based on what they’ve been trained on. By training models themselves, even on a small scale, students encounter the flaws and biases built into the system. They see firsthand that AI is shaped by human choices. “If students only ever use AI as consumers, they’re limited to what someone else decided AI should be,” Kirker said. “If they learn how to create and utilize, they see the true possibilities, and the limitations of different models.”
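Kirker’s “pattern finder” point can be made concrete with a toy example. The sketch below is not course material and the training reviews are invented for illustration; it shows a bare-bones sentiment classifier, far simpler than the course’s Python projects, that “learns” only by counting words in labeled reviews:

```python
from collections import Counter

# Invented training reviews (the course reportedly uses real
# customer-review data; these four lines stand in for it).
training_data = [
    ("great product works perfectly", "positive"),
    ("absolutely love it highly recommend", "positive"),
    ("terrible quality broke immediately", "negative"),
    ("waste of money very disappointed", "negative"),
]

# "Training" here is nothing more than counting which words
# appear under each label.
word_counts = {"positive": Counter(), "negative": Counter()}
for text, label in training_data:
    word_counts[label].update(text.split())

def classify(text):
    """Score a review by summing per-label word counts and pick the higher score."""
    scores = {
        label: sum(counts[word] for word in text.split())
        for label, counts in word_counts.items()
    }
    return max(scores, key=scores.get)

print(classify("love this great quality"))   # prints "positive"
print(classify("disappointed it broke"))     # prints "negative"
```

Even a model this crude makes the lesson visible: it knows only the words it was trained on, and its judgments inherit whatever biases the training examples contain.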
That distinction matters. Many people treat AI tools as seamless, all-knowing assistants. In reality, they’re messy, biased, and often simply wrong, as many teachers are discovering when they read student work. But by building models themselves, students learn to approach the technology with both curiosity and skepticism.
Among Inter-Ac League schools, SCH stands alone in offering an AI course. That makes the program both pioneering and small-scale. According to the Introduction to Artificial Intelligence Course Outline, students are working with basic models to “create working image classifiers using Teachable Machine and develop text sentiment analysis systems using Python programming,” so they’re not building large language models like ChatGPT. But the foundation is there. Kirker and the CEL staff are creating an environment “where students understand it’s like having a collaborative thought partner with a doctorate and don’t outsource their creativity for some algorithm.”
Back in Miami, the schools are thinking along similar lines. In The New York Times article, Daniel Mateo, assistant superintendent of innovation at Miami-Dade said, “A.I. is just another tool in the arsenal of education.” As with other instructional tools, he said, “We have to make sure that we use it ethically, that we use it responsibly, that we have certain guardrails in place—and that all happens through our vetting process.”
Kirker agrees that AI must be implemented ethically because “We’ve never seen a technology like this, where it can be both the best educational tool in the history of mankind and the worst learning crutch ever, all on the same interface.”
Step back, and the story here is not just about one school or one public school district. It’s about a cultural shift in how we think about intelligence, creativity, and learning. As Kirker said, “It has never been easier to offload work. But offloading work is the same principle as you know, if I’m working at a job and I’m supposed to write memos and I just give it to ChatGPT. How long before your manager thinks, why am I paying $100,000 to this person when I can get the same information for just $20 a month?”
In the new age of AI, the students who thrive will not be the ones who lean on AI to do their work; they’ll be the ones who understand it, challenge it, and use it as a partner. And at SCH, adaptation is inevitable… the only question is how.