Colleges say AI can be used positively in the classroom

Artificial intelligence can help students learn several kinds of skills, if used ethically.

Local colleges say using generative AI like ChatGPT in the classroom — with limitations — can teach students skills like critical thinking and judgment.

University policies obtained by the Dayton Daily News generally prohibit students from presenting the work of generative AI as their own. Students who use these technologies must cite their sources, and those who violate the rules can be punished in the same way as students caught plagiarizing or cheating on an exam.

Professors have been encouraged to include language about AI use in their syllabi, but administrators at the local colleges and universities have said it's up to individual professors whether to allow AI in the classroom.

In one example, Edison State Community College asked professors to include one of three AI policies in their syllabi this past semester. One policy banned AI, one encouraged the use of AI but required that it be cited, and one allowed some use of AI but said all final work must be the student's own.

Christina Amato, dean of Sinclair Community College's eLearning Division, said one conversation coming out of AI in the classroom involves "soft skills" like critical thinking and problem solving. AI can generate an answer, but it takes a human to judge whether that answer is the right one.

“It is a little ironic and interesting to me that AI is advancing those conversations around soft skills such as audience context, the human element of solving problems and critical thinking,” Amato said.

Amato said she has found that most students use AI appropriately, especially when teachers have talked with them about acceptable use.

“What we’re finding is that AI usage (in) classrooms and instances in which we would find it appropriate are providing some generally teachable moments for students more than kind of a gotcha and you get an F for plagiarism,” she said.

Amato said the goal is more about starting a conversation with students on appropriate use of AI, because in some cases the student didn't understand the boundaries and limitations of what was acceptable.

These technologies have offered new teaching methods for professors, too.

Wright State University allows some use of AI in classrooms, but requires it be cited when used. It also cannot be used “to substantially complete any assignment or exam,” according to the policy.

Tanvi Banerjee, a professor in Wright State's Department of Computer Science and Engineering, teaches a graduate-level machine learning class. In that class, she has a "semi-permissive" policy on AI: her students can use AI tools and compare the results with what they would write on their own from scratch.

She said she sees AI as a “smarter Google.”

“It has capabilities that can be tuned to make it behave better than Google, but at the end of the day, it’s still a tool,” Banerjee said. “It’s not built in a way that it can be used right away.”

Sinclair Community College communications department chair David Bodary, who teaches public speaking, has shown his classes how to use AI to arrange a speech or brainstorm topics.

“What I’m trying to get them to understand is that they have an ethical responsibility for the accuracy of the information, the integrity of the information and for their process,” Bodary said.

He said he's still trying to get students to understand that they must think critically about the speech. While an AI tool could show a student how to open a speech, for example, the student needs to decide the best way to present their topic.

The University of Dayton has also been looking into the ways AI can be used and misused.

“We are considering not only how we prepare our students to engage in a world using AI for their careers, but we also are looking at the complex issues regarding data governance, personal privacy and security,” UD officials said in a statement.
