In today’s world, with the rapid advancement of technology, artificial intelligence is clearly the next step in our evolution. Yet many educators and institutions have been quick to criticize and demonize this technology instead of embracing its potential.
The era of ChatGPT in higher education has ushered in significant challenges. Professors increasingly rely on AI detection tools such as Turnitin to flag students’ work as AI-generated, often without justifiable cause, leading to unjust academic penalties. It is deeply concerning that an unreliable algorithm can undermine the integrity of a student’s academic journey and forever alter their life.
When Turnitin released its AI writing detector, the company claimed it had a false positive rate of less than 1%.
However, more studies are showing that the false positive rate is considerably higher and less consistent than reported. In its own test, the Washington Post found a false positive rate of 50%. Granted, the sample sizes of the two studies were different.
But still, shouldn’t educational institutions hold themselves to a higher standard before risking a student’s academic future based on the potential use of AI?
In another piece, from Vanderbilt University, researchers wrote: “To date, Turnitin gives no detailed information as to how it determines if a piece of writing is AI-generated or not. The most they have said is that their tool looks for patterns common in AI writing, but they do not explain or define what those patterns are.”
Despite the growing evidence of these flaws, educators persist in trusting these unreliable tools, often without considering the devastating consequences for students, eroding the classroom environment and the trust between students and professors.
I’ll admit that until my third year in college, I had never opened ChatGPT, out of fear for my academic standing. But I finally did, because, let’s be honest, understanding legal cases isn’t simple, so I have AI turn the readings into a podcast I can listen to on the bus to school. And writing 10-plus-page political science papers with no outline and no idea where to start isn’t easy either, especially when the education system itself feels like an uphill battle these days.
Despite how useful these tools can be, professors across campus continue to treat AI as the enemy. For me and many other students, however, it’s an invaluable resource. But first, let’s clarify what AI really is.
AI comes in many shapes and forms: ChatGPT, GrammarlyGO, Gemini, Google NotebookLM, Siri, Alexa and even facial recognition, to name a few.
AI is literally everywhere, yet people act like it’s a brand-new thing. It’s time to acknowledge that AI is here to stay, and instead of resisting it, we should be finding ways to integrate it into our learning environments responsibly and effectively.
But with outdated educators and institutions who continue to vilify it, students are left to navigate an educational system that refuses to adapt to the tools that can actually enhance learning and efficiency.
Just this semester alone, I have encountered numerous types of professors: one who is open about letting their students use AI for their benefit, one who believes that polished writing is AI writing and one who doesn’t want it on their midterm.
But there are also professors on this campus who aren’t afraid of AI tools like ChatGPT and want to understand them.
Take, for example, Paul Hurh, an associate professor in the English department who has been at the University of Arizona since 2008. Last year, in one of his classes, he designed an assignment that asked students to analyze a text using AI, and then to critique the strengths and weaknesses of that AI analysis using their own knowledge of the text.
“My goal with this assignment was not to integrate ChatGPT but to get students to criticize programs like ChatGPT,” Hurh said.
This aligns with Hurh’s belief that, as students, we are expected to analyze information and apply critical thinking in both our classrooms and written essays.
“My worry is that programs like ChatGPT will take away the critical thinking aspects and come up with these ideas that students then use to write their papers. But as a professor, those ideas create a barrier,” Hurh said. He emphasized the underlying question: if the original idea is taken away from the student, are they really thinking critically?
Unlike other professors, Hurh isn’t quick to dismiss ChatGPT; he’s opened a number of conversations with his classes about its use, trying to better understand it to adapt his teaching, whether that be to combat AI or use AI. Only time will tell.
“My fear is that it’s the students who don’t believe their writing is good or their grammar is bad or that AI simply does it better, those are going to be the ones who reach for ChatGPT quicker and they are also going to be the ones who miss out because there is a short cut,” Hurh said.
And he’s right. In what will be my three years at this institution, I have found that today’s education standards ask students to be perfect from the jump, without allowing room for growth, mistakes or the process of learning. The pressure to meet high expectations right away, on the first assignment, no matter how big or small, can be overwhelming, especially when some students, like myself, are also working jobs. Yes, I mean plural. This perfection-from-the-jump mentality ignores the reality that learning is a process focused on progress, and everyone progresses at their own pace.
This standard also stifles students’ creativity and critical thinking. In my experience, the English classes geared toward writing majors are the ones where professors grade more on completion than on a percentage. This is not to say they don’t give feedback or expect better results on the next assignment. But removing that one stressor gives us the freedom to take risks, and it’s honestly why those are my favorite classes.
Some of my other classes have prioritized producing a perfect final product over appreciating the process of academic growth and the exploration of new concepts. In this environment, it is hard not to feel discouraged from taking risks or trying new approaches because the fear of failure is so strong.
And for someone like me who needs those A grades to get and keep scholarships, how can I not ask AI for help?
AI is not the enemy of learning; it’s a tool that can help students combat the pressures of modern education. AI can help alleviate some of that pressure by providing students with a supportive resource to refine their work, explore new ideas and develop their skills. Rather than viewing AI as a shortcut, educators should recognize it as a valuable tool to help students engage more deeply with the learning process. And until that changes, students will be left to fend for themselves in an environment that often prioritizes results over the learning journey.
Sarah Arellano is a junior at the University of Arizona studying Journalism and Law. When she is not reporting, she likes to read all sorts of books.