Will our educational system keep pace with AI? A student’s perspective on AI and learning

By a High School Junior and AI Club President | Boston Parents Paper

Back in sixth grade, I wrote a short story using Microsoft Word, spellcheck, and a lot of trial and error. Fast forward to junior year, and my friends are using large language models to write essays, generate code, and brainstorm debate rebuttals. The pace at which AI tools have entered our classrooms is kind of wild—and honestly, school systems are still catching their breath.

As president of my high school’s AI club, I’ve seen both sides of the coin. Students love AI for its convenience and speed. Teachers, on the other hand, are still figuring out what to do. Is it cheating? Is it a tool? Or is it both, depending on how you use it?

what schools teach vs. what students use

Let’s start with the basics. Schools are still catching up to tools that students are already using daily. According to a 2024 report by Common Sense Media, 58% of high school students have tried ChatGPT or a similar AI tool at least once. Yet, only 15% say their teachers have talked to them about how to use it ethically.

There’s a kind of underground culture of AI use in classrooms. Students use tools like ChatGPT, Grammarly, and QuillBot to rewrite, summarize, and brainstorm. Sometimes it’s to help when they’re stuck. Other times it’s just to get the assignment done faster. But it’s rarely discussed openly. That creates a weird situation where students feel like they’re doing something wrong, even when they’re just using AI to check their grammar.

AI as teacher’s aide or grading nightmare?

Teachers are facing a dilemma. Do they ban AI tools altogether or integrate them into lessons? Some teachers have embraced it as a brainstorming or study tool. Others still issue warnings about plagiarism. I get it—this stuff is new, and no one wants to encourage shortcuts. But pretending it doesn’t exist doesn’t help either.

According to the EdWeek article “AI in K-12 Education: Where It Stands and Where It’s Going,” schools that train teachers in ethical AI use see fewer incidents of cheating and higher student engagement. That suggests the tools themselves aren’t the problem. It’s the lack of shared norms around using them.

case study: our AI club’s classroom workshop

This spring, our AI club led a classroom session on how to use ChatGPT responsibly. We showed how it can help you break down confusing texts, generate practice questions, or simulate historical debates. But we also showed how using it to write entire essays without citation is, yeah, not okay.

We surveyed students before and after the workshop. Before, 72% said they weren’t sure if using AI was “allowed.” After, 91% said they’d be more likely to ask their teacher how they could use it appropriately. That wasn’t about fear. It was about creating clarity.

standardized testing and AI blind spots

Here’s another place where schools are lagging: standardized testing. AI tools can now help students analyze SAT questions, generate AP-style essays, and simulate problem sets. But testing formats haven’t adapted. That means the way we measure student performance doesn’t always match how students actually study or learn.

A November 2024 Forbes article reported that the College Board is integrating ethical AI scenarios into its digital SAT prep materials. That’s a start. But the conversation hasn’t reached most classrooms yet, especially for middle schoolers or ninth graders who are simply told not to touch AI at all.

teachers need time, not just tools

One of the biggest barriers is time. Teachers already juggle so much—lesson planning, grading, IEP meetings—and now they’re supposed to become AI experts too? That’s not realistic unless districts support them with training time and clear frameworks.

Education Week found in a recent survey that 64% of teachers would feel more comfortable using AI in class if their school or district provided more guidance. That includes understanding data privacy, AI bias, and how tools like Sora or ChatGPT can be used to enhance—not replace—critical thinking.

what students actually want from AI at school

We don’t need AI to do our work. We need it to help us understand the work. When used right, it’s like having a smart study buddy who doesn’t sleep. In our club, we’ve used AI to model how a bill becomes a law, simulate economic systems, and help classmates with disabilities access content in new ways.

Students want school policies that reflect the tools they’re already using. That way, no one has to hide anything. We can talk about it openly and learn to use it well.

It helps to start small. Ask your teacher what their AI policy is. Suggest a collaborative approach. Offer to help run a workshop or show examples of appropriate use. And if you want to help even more? Start a club. You don’t need to be an expert. You just need to be curious.

If you’re a student interested in exploring the role of AI at school, this is one of the best ways to get involved and lead change from the inside.

How to start a high school AI club: 6 easy steps for success