Are schools becoming too complacent with AI?

Throughout almost all of my K-12 education, I was taught again and again how to properly do research online and write about that research. In fact, one of the core skills I was told I would need going into college was the ability to find information for myself. That remained true all the way through my junior and senior years of high school, when AI became the big craze. Apparently, students no longer needed to do their own research and write their own work; AI tools such as ChatGPT could do it for them! Using these tools, however, still violated my high school’s academic honesty rules, and getting caught would lead to steep consequences. I figured the same would be true in college, where the stakes seemed even higher, yet as I prepared over the summer to leave for Brandeis, the school’s vague talk about its AI policy caught me by surprise.

I now realize that Brandeis has a much looser set of rules and regulations when it comes to AI software such as ChatGPT. The Brandeis Center for Teaching and Learning’s website offers guidelines for “ChatGPT and AI,” yet rather than giving direct instructions to us, the students, it offers suggestions to professors and teaching assistants. For example, it proposes letting students use AI in the “drafting process” while cautioning that the output may be “plausible-sounding but incorrect or nonsensical.” When I click the hyperlink embedded in that phrase to better understand what that means, I am met with a “404 Not Found” error, which is very reassuring. Meanwhile, in my Brandeis-required University Writing Seminar (UWS) class, the professor actually encouraged us to use AI to help with research: generating lists of sources, automatically formatting them for citation and even summarizing them for us.

This sudden shift in attitude toward AI and research has left me conflicted. On one hand, the easy access to sources that AI supposedly gives researchers seems invaluable; on the other, there is no denying that generative AI as a whole is deeply rooted in the theft and plagiarism of real people’s work. Training these AI systems requires massive amounts of data, and tech companies such as OpenAI have been accused of scraping millions of essays and articles from copyrighted sources such as the New York Times. As a younger student, I was repeatedly warned that any sort of plagiarism in college would mean “the end of your academic career.” That warning still deters me personally from using AI, knowing it can simply rewrite others’ work and pass it off as its own unique content, which I might then unknowingly use. Overall, while AI writing tools have plenty of pros and cons, choosing whether to use them at Brandeis really comes down to your own comfort level and moral compass.
