An AI sign at the World Artificial Intelligence Conference in Shanghai on July 27, 2025. (Ying Tang/NurPhoto via Getty Images)

by Boris Nusinzon and Andre Garcia

Generative AI has evolved at a staggering pace over the past few years: ever since the release of ChatGPT in 2022, its use has soared. Statista projects that over a billion people worldwide will use AI by 2031 and that the global AI market will be worth over $800 billion by 2030; it also found that 51% of American adults have used AI to look up the answer to a question, and estimates that 36% of American adults will use AI for online searches by 2028. Clearly, generative AI is not going anywhere, and it has inched its way into every corner of society. At Northern High and across the country, AI has a myriad of applications and uses. These include brainstorming ideas; planning; video, image, art, music, and writing generation or assistance; coding; summaries in search engines; and even therapy, counseling, friendship, and romance. But focusing on what affects students most, one of the sectors where AI has had an enormous impact is education, from grade school all the way up to college.

The national statistics on student AI use paint an intriguing picture. College Board found in October 2025 that 84% of high school students reported using Generative AI for schoolwork, and that 57% of parents agree it is better to use generative AI for schoolwork than not. Over 85% of school administrators “view students’ learning to use AI tools as part of their high school education as very or somewhat valuable,” but “principals, AP coordinators, and teachers still have concerns about AI affecting students’ academic integrity, students’ essential learning skills, and teachers’ technical and professional support needs.”

AI has many uses for students, as demonstrated in a 2025 Microsoft Special Report on AI in Education. 37% of students surveyed used it to help them get started on and brainstorm assignments; 33% to summarize information; 33% to get answers or information more quickly; 32% to get initial feedback on their work; 30% to help them learn or study in a way that works best for them; 28% to improve their writing skills; 25% to make their presentations and projects more visually appealing; 22% to develop the skills they need for their future; 22% to do their assignments for them; 21% to enhance their creativity; 21% to free up their time and focus on learning and connection; and 19% to help them better take care of themselves and their wellbeing.

In light of these statistics, there are differing perspectives as to whether AI usage in schools is beneficial, harmful, or somewhere in between. Many feel that generative AI helps them to better come up with ideas or to speed up their workflow. Indeed, the aforementioned Microsoft report (though it should be noted that Microsoft has a significant stake in AI's success) explains how "In a recent study from Australia, university students who used an AI-powered chatbot saw a nearly 10% improvement on their exam grades over peers who weren't using the tech. Predictably, AI use peaked during the weekend before the final exam; after the test, 72% of users stated they would be very disappointed if they couldn't use it again." Additionally, "In a study conducted by Microsoft Research and Cambridge University Press & Assessment, students preferred an AI reading assistant over traditional note-taking, citing its ability to answer questions, simplify complex material, and provide immediate feedback."

Education Week offered up a further positive perspective, arguing that generative AI will continue to grow in popularity and be utilized by students, and so its use cases should be “contained,” but carefully encouraged. Kevin Bushweller explained in a September 2025 article how “Educators should also teach high school students—and maybe middle schoolers, too—how AI can be used to solve some of the most complex problems facing society today.” He made an argument for schools carefully regulating AI use, teaching students how to use it properly to encourage critical thinking and brainstorming—essentially, as a tool to enhance teaching.

Of course, the other viewpoint is that generative AI is harmful for students overall. Potential concerns are myriad—some floated by students in the aforementioned Microsoft study were the possibility of AI use being considered plagiarism or cheating, the potential for overdependence, misinformation due to AI hallucinations, ethical concerns, and privacy and security concerns. Educators, meanwhile, were worried about many similar problems, in addition to issues unique to their role, such as insufficient training to fully understand AI, a lack of clear policies and regulation, and inadequate guidance for students on AI usage.

A different Education Week article further explains some of the downsides of AI use, detailing how "One of the negative consequences AI is having on students is that it is hurting their ability to develop meaningful relationships with teachers… Half of the students agree that using AI in class makes them feel less connected to their teachers. A decrease in peer-to-peer connections as a result of AI use is also a concern for teachers (47%) and parents (50%)…" Indeed, 70% of teachers "worry that AI weakens critical thinking and research skills." Even so, teachers themselves have increasingly been using AI, with mixed results, as they lack sufficient training on how best to utilize it in education. In particular, less than half of teachers have "participated in any training or professional development on AI provided by their schools or districts," and students are often uncertain about how to acceptably use AI.

An image of a human hand halting a robotic one holding a pencil. | iStock / Getty Images

The most concerning findings come from research which has investigated AI’s effect on the brain itself. An MIT Media Lab study found that AI actually harms one’s critical thinking skills; Time reported how “The study divided 54 subjects—18 to 39 year-olds from the Boston area—into three groups, and asked them to write several SAT essays using OpenAI’s ChatGPT, Google’s search engine, and nothing at all, respectively… of the three groups, ChatGPT users had the lowest brain engagement and ‘consistently underperformed at neural, linguistic, and behavioral levels.’ Over the course of several months, ChatGPT users got lazier with each subsequent essay, often resorting to copy-and-paste by the end of the study.” The brain-only group had the highest neural engagement, and those who utilized AI to fully write their essays saw poor recall of what was written after the fact.

So, what does all of this mean for Northern? How has increasing AI use affected students at this school? A school survey that collected 203 responses from students across varying grade levels found an average approval rating of 2.71 out of 5 for generative AI use in daily life; 59% of students use AI at least once a week for purposes other than schoolwork, and students most commonly use AI for Google/Bing search summaries (20%), brainstorming (17%), and image and art generation (13%). Respondents were split 50/50 between those who did and did not use AI to significantly help them with schoolwork, with an average approval rating of 2.14 out of 5 for doing so. Students were significantly less positive about using AI to cheat on a product grade or test, however, rating it 1.53 out of 5, with 78% of respondents claiming they had never cheated using AI.

To better interpret these results, three students (whose names are kept anonymous for privacy) and two teachers were interviewed. The first interview was with a senior who frequently uses AI. He explained how "…sometimes I use AI to just gather some thoughts, especially for last year I used it a lot for AP World, and it helped me gather my thoughts really easily with assignments like the big packets that we got, so I could just filter in information and learn a lot easier." This student also noted that "I use it for… exams… that I need to take for my job… and then I also use it for my workouts, I also create AI-generated workouts that also help me with the gym and stuff like that."

He also added that it has changed his life negatively due to a "desire to use it so I don't have to do any other work… it's easy to just look up something and just use AI to do it." He noted that he occasionally uses it to get notes down easily or check his work, but typically does the work himself. The senior continued by saying that AI usage has "contributed to my GPA and boosted… my GPA because I use it to study and… to take my own practice tests." He said it improved his grades and critical thinking in general, though he does not find "cheating… good in any moral way." In his view, AI "should be used as a tool, and the basis on cheating with AI use should be… if you're using it to literally write out or… do your whole assignment for you, that is cheating." Holistically, he disapproved of using AI to do the whole of the writing for you, but believed it acceptable to use it for assistance and editing.

A sophomore who was interviewed had a contrasting opinion. "I never use it because I don't think it's right. I think you should, like, whenever someone does something—especially me—I find it's better to actually learn if I'm doing the research myself instead of having something do it for me. It's to help keep me from being lazy…. also, it's not always correct, so I don't really see that changing in the future, because I don't see it being more accurate." She added that she might approve of AI use "when people are, like, struggling really bad and can't understand anything, AI might be able to summarize it well if it does it correctly."

The sophomore continued by noting that AI is “going to cost a lot of people jobs, and the only way AI would actually be beneficial is if it was 100% perfect and it could replace everyone’s jobs, as in no one would have to work a day in their life.” However, she commented that this was not going to occur, and concluded by noting that she has good grades due to not using artificial intelligence—something she did not see changing in the future.

The third and final student interviewed—another sophomore—explained that he doesn’t use AI because it “destroys the planet, it uses water. I think it also diminishes hard work that people put in for art or making of art, and it makes it like… I don’t know, it kind of just removes a lot of, like, the creativeness that needs, like… kind of just removes the human aspect of a lot of things. I would see it, in my opinion, change in the future if there’s regulations put on it, if it doesn’t destroy stuff or… need a lot of usage of our energy.” He noted that AI has led to lower motivation and created a sort of “laziness” in society, with it being unacceptable to use AI on assignments due to it leading to not actually learning anything. The sophomore noted that it is acceptable to use it for “help studying,” and he expressed (in contrast to the previous student) that using AI would actually improve his grade, though he has no plans of doing so.

The first of two teachers interviewed was Mr. Sinclair, who teaches both 9th and 12th grade English. He expressed how he has personally utilized generative AI to generate images for specific uses, and has also found it helpful to “dig through texts for examples of things.” He finds it to be a “useful tool if we use it for the grunt work, not to do the thinking for us.” When asked about the impact AI has had upon the classroom and the learning of students, he explained that “…people are using it as an easy out to substitute for the thinking that they need to do, and that’s unfortunate because it feels like we’re getting it backwards. Tools like this are supposed to make it easier to do the grunt work so that we can do the thinking, not the opposite.” He expressed how everyone is always “looking for an easy way out,” and so the average person opts to use AI to simply get an answer to a question. He detailed how AI is much more useful when used critically: If a student uses it to “help them identify examples of something they’re already looking for, and then packaging it and putting in their own words,” he finds that to be okay. His problem was primarily with students putting in an input, getting an output, and not taking any time to “process” or analyze the information in-between, simply accepting whatever answer AI provided. 

When it came to statistics on cheating, he noted that students in higher-level courses (particularly those with a higher level of accountability, like AP exams) would be less likely to utilize AI, given that they would eventually find themselves in a testing environment where it is unavailable. That said, Mr. Sinclair commented that he definitely notices when students use AI to cheat, and typically confronts them by asking about the level of vocabulary used in their writing. Part of his approach to student cheating has also become preventative, shifting to making "the abstract thought the important thought of the assignment" in order to forestall cheating. Students who are found cheating get a zero and are invited to do honest, non-plagiarized work, as the "thinking's the important part." He reiterated how those using AI are "skipping the processing part of the cycle," and he has seen certain skills of students "start to atrophy" as a result. He finished by commenting that this is likely the "calculator moment" for the humanities, which may need to "move into the correct way to parse the problem and set up the problem to get the correct response." He noted that it may take a long time for this to happen, however, given that education struggles to make determinations on what to do with new technology.

Mr. Williams provided similar but distinct viewpoints in his interview. He began by explaining that generative AI has its benefits outside of the classroom—he had a student last school year who used it to study by quizzing themselves with it. He saw how it can "save time and energy within the classroom," but "the ways in which I see it used sometimes negates students' abilities to practice skills and strategies that I want them to work on, and also the larger kind of general themes and developments." Students who use AI in the classroom have seen a decrease in their ability to "synthesize an argument," and while he is "okay with students using it to be able to search out additional information," he sees this as becoming dangerous quickly. This is due to AI's tendency to hallucinate; he brought up two instances of AI search summaries providing blatantly "wrong" information. Williams expressed his concern that "…it's doing more and more to generate a response that will work for you. So like, they'll even come up with quotes and primary sources that aren't even real… and they'll usually have a caveat, like, 'This is based on the popular notion of the time.'"

In the past, Williams has spoken to students about their AI usage, but has not had to “refer anyone for… academic dishonesty.” He has done his best to ensure that “the assignments are meaningful, and I am transparent with the students that… what we do in class… [is] helping you to get those skills or that knowledge base that will help you later on down the road.” He emphasized that cheating only hurts the student in question (something that Sinclair had touched on as well), and he tries to “make sure that students see that I’m not just giving work to be busywork.” When it comes to teaching, AI has assisted him with lesson planning, informational text formatting, and accounting for student responses. 

He closed the interview by mentioning the advent of “charter schools where AI is doing all the… teaching.” In these schools, they have “facilitators,” and the idea has made Williams nervous that AI could “just devalue education.” While he understands the technology’s positives, he fears that the focus on using AI to “try to cut redundancy” could harm students. More specifically, he thinks that teachers have simply not had enough training with AI, and applied a similar analogy to calculators that Sinclair did. It is “another tool that can be used to make life simpler,” but you also cannot “completely abandon” crucial skills that AI has the capability to perform. 

As generative AI continues on its rapid pace of development, its full impact on the future of education is uncertain. One thing is clear, however: Education—and the world—will not be the same.
