AI Use in the Classroom

From LCC’s student newspaper, a variety of views on AI – including my own.

This article is part of The Lookout’s LCC x AI series, a multi-part series on conversations surrounding AI and its impacts at Lansing Community College. 

By Nicole Wadkins
Staff Reporter

Generative artificial intelligence (AI) has quickly become a commonly used technology on campus. In recent semesters, its presence has become even more visible due to a requirement that each class include a policy on AI use in its course syllabus. Depending on the professor, those policies range from banning AI entirely, to allowing it on certain assignments, to encouraging its use.

Unlike traditional artificial intelligence models, which focus on analyzing data and recognizing patterns, generative AI creates content based on the data it has been trained on. Generated content can include written responses, images, videos and more. As these tools become easier to access, they are starting to reshape how students are approaching their coursework.

Among students, opinions about the use of generative AI in education are mixed. Some see it as a useful tool for fact-checking and brainstorming ideas, while others are concerned about its ethical and environmental impacts.

LCC student Mason Stroman stated that he uses Google Gemini to check “stats for sports, or facts I read from history books. Usually I just type in the question I’m looking for,” he said in an email. “Sometimes things I read don’t seem believable, so I like to double check them on Google Gemini.”

Then there are other students who avoid using generative AI almost entirely.

LCC student Megan Wilson stated that she does not use ChatGPT or Gemini for personal and political reasons. “I believe that it is unethical in both the traditional sense of ‘I should be able to think on my own,’ as well as the environmental impact from water usage spikes in processing plants for cooling,” she said in an email. “The only form of AI I use is on the very rare occasion I cannot find a direct answer on Google or can’t get a tutor for a math problem that I am really struggling on. In that case, I will use Gauth AI to check my work.”

These mixed reactions echo the pattern seen in an earlier article, “What LCC students are saying about AI.” In that article, many students said AI can be useful, while others worried that becoming dependent on it could weaken students’ ability to learn.

Faculty members expressed the same range of views about generative AI, though opinions often depend on the class—especially in those where analysis and writing are critical to learning.

Assistant Professor of Political Science Michael Giles expressed that he does not want his students to use generative AI in his class. Although he recognizes that the technology may bring advancements in science, he also notes problems with its use in the classroom. “AI use in the classroom is pretty bad for students and the reason is because learning is inherently difficult,” Giles said. “In fact, [learning is] one of the best things you can do, but we also know that the best things are also the most difficult things.”

Giles explained that students who use generative AI on their assignments no longer put in any effort; the software gives students quick answers without deeper thinking. “When that happens, it cheapens knowledge,” Giles said. “It kind of teaches us that no knowledge is worth the effort, because we can have it without effort.”

Lead Faculty for Art History Rebecca Bieberly stated her students are also not allowed to use AI. “I also don’t support students using AI personally, at least within the classroom,” she said. “Art history is, at least in part, an interpretive field, and it’s a field in which it really requires the individual to do the looking, the analysis, the contextualization, the research. And equally as important, the writing.”

She expressed that students actively engage with images and ideas in the course, and that using AI would disrupt that learning process. “It’s unfortunate because it takes away from their opportunity to be able to deepen their own skills with writing and thinking,” Bieberly said. “It makes our classes worth less for the student because they’re not actually getting what they paid to get out of our classes, which is to learn to write better and think better and be a more connected global citizen.”

But there are also faculty who don’t rule out generative AI.

Professor in Computer Information Technologies Adam Richardson stated that students need to have a strong foundation in their field, and that includes experience in using AI tools since they are becoming more common in the workplace. “I want our students to succeed wherever their path takes them, and integrating AI content within pathways allows students to explore ethical concerns, strengths and weaknesses, and human-in-the-loop workflows that will help them stand out among their peers,” Richardson said in an email.

While he supports integrating AI into some coursework, he also emphasizes that students still need to develop their own voice instead of relying on AI to speak for them. He noted that AI-generated content has become common on platforms like LinkedIn, which makes those posts repetitive. “We want to give our students the confidence to share their authentic communications so we can all benefit from their real experiences,” Richardson said in an email.

English Program Faculty Bex Miller also supports limited use of generative AI, but cautions students not to rely on it. “I would want my students to use AI generatively, to brainstorm ideas and ask questions, to find source material possibilities and outline their thoughts,” she said in an email. However, she also noted that “students tend to use AI to ‘say it better’ than they could, but the reality is that students can only learn to say it better if they participate in the writing process.”

Miller added that she has noticed that students who rely heavily on AI often show overconfidence in the technology. “I notice what I would call ‘patchy’ organization in most AI-submitted work,” she said. “Students sometimes trust the responses from Google AI without verifying, skipping an essential part of the research process.” Miller also noted that she sees a “decrease in student confidence and critical thinking abilities with increased AI use.”

While there are mixed opinions about generative AI across campus, there is one thing we can agree on: generative AI is something that students and faculty can’t ignore. The technology is here and won’t be going away anytime soon. How we allow it to impact student learning is a continually evolving conversation.