ChatGPT on Writing
Artificial Intelligence's Role in Education
I have to give my school credit for the creative way they caught Richard. The teacher reviewed his edit history and discovered the spontaneous appearance of whole chunks of text. She immediately suspected that Richard had copied and pasted the text in, so she asked him for the source document in which he had written it. He didn’t have one.
Richard was drafting a news story for his journalism class and decided to write about the basketball season. Conveniently, the basketball coach recorded every game in an email shared with the entire school. Richard read all of them for inspiration, had ChatGPT summarize the key insights, and used that summary as the bulk of his engaging narrative.
During the hearing, Richard explained his belief in AI’s increasingly important role in the future; he saw proficiency with ChatGPT as a skill worth cultivating. To him, AI was a tool like PowerPoint in the ’90s: an essential ability that would benefit him all the more for developing it early.
I could not help agreeing with many of Richard’s takes on AI. In education, many teachers have considered the technology and embraced it as a helper in the classroom. They have used ChatGPT to improve curricula, generate examples, and in some cases even grade students’ assignments. In fact, an Impact Research/Walton Foundation survey found that “88 percent gave the AI program a good review, saying it has had a positive impact on instruction” (EdWeek). These viewpoints clearly support Richard’s opinion of AI’s growing prominence.
Richard voiced the honest, logical thinking of many high school students. Especially now that most have realized how inaccurate AI checkers are, why sacrifice your free time when this magical device can convey your thinking more fluently? Then I suddenly realized how problematic abuse of this technology could become.
Many believe that AI is a crutch that hides one’s authentic work behind perfectly articulate sentences and prevents young adults from actually learning the content. They also doubt the technology’s ability to construct complete logical arguments, because text generators do not think critically the way humans do. Journalists such as Rodolfo Delgado have reached the same conclusion: “[ChatGPT] lacked the touch of humanity that was inherently mine …” (Forbes). Today, phrasing thoughts into coherent sentences is essential for communication, but in the future, would writing and expressing emotions no longer matter? A wave of dystopian possibilities flooded my head.
To satisfy my curiosity, I had to know the appropriate uses of ChatGPT, but my judgment required a deeper understanding of how well ChatGPT actually writes. So I spent my entire Thanksgiving break drafting a paper analyzing AI’s effects on work and comparing it to a ChatGPT-generated essay on the same subject.
In short, I can confirm Delgado’s claim that generative AI’s analytical writing is suboptimal. The robotic, formulaic paper overemphasizes summary, makes minimal reference to studies or statistics, overuses subheadings that destroy continuity, and repeats information it has already stated.
After this experiment, I developed a framework for deciding whether an assignment should incorporate AI’s help. For homework focused on content and facts, ChatGPT is genuinely helpful for summarizing complex events, concepts, or philosophies. We can use ChatGPT like a search engine. But unlike Google, which can return an endless stream of unsatisfactory results, ChatGPT provides a concise response that answers the question directly. This can save us hours otherwise spent scouring the web for specific topics such as the Cultural Revolution’s effects on Chinese feminist movements.
However, we should rarely use AI when assignments move beyond simple summaries and explanations and require us to make connections and think critically. When asked to generate thoughtful answers to more analytical prompts, ChatGPT tends to output complex sentences full of abstract nouns in an assertive tone, yet when we try to decipher their meaning, we realize the machine is simply stating the obvious. In fact, to avoid bias, OpenAI deliberately designed its chatbot to express few opinions; it will not argue anything substantial. In these cases, using the technology is a waste of time.
Educators should not dissuade students from using ChatGPT, because there is definitely a place for such technologies in today’s education. Banning AI in learning is like scrapping nuclear power because of the atomic bomb. Treating ChatGPT as taboo would only heighten students’ curiosity to explore it themselves (as I have), and many would never consider the ethical implications. We must teach students the proper applications.
I have benefited too much from ChatGPT’s intuitive interface and coherent explanations to dismiss it as a gimmick that breeds harmful reliance. I remember when teachers despised grammar checkers such as Grammarly just a few years ago. If that tool could integrate into the curriculum so well that my school now offers students free subscriptions, I believe ChatGPT can, too.
References:
https://www.axios.com/2024/03/06/ai-tools-teachers-chatgpt-writable