Thoughts on integrating GenAI in education at bachelor’s level

Sharing some thoughts, based on a discussion with colleagues, on how generative AI should (and should not) be integrated into teaching at the bachelor’s degree level.

First, in my opinion, teachers should have the final authority to decide if and how AI will be used in their courses. The university can (and should) give guidelines, but it should not force teachers to integrate GenAI into their courses, let alone mandate specific ways of doing so.

On the other hand, it’s part of teachers’ professional development to learn about these tools and to think about how they should (and should not) be used in education.

One issue here is that, if I’ve understood correctly, some teachers in Finland accept the use of DeepL. That is, they tell students it’s fine to write in Finnish and then use that tool to translate the text into AI-generated English. This is for courses taught in English.

I don’t approve of that, and I do not allow AI-generated translations in my courses. Students need to learn to express themselves in English, and if they take an English-language course, they need to write their own English.

I also don’t buy the idea that “I did the thinking, AI just presents my ideas”. I don’t believe you can separate thinking from writing to an adequate extent. Writing is thinking.

Furthermore, I disagree with Ethan Mollick’s claim that AI-generated text cannot be identified. I think he focuses on the wrong premise, namely that “accuracy won’t be 100%”. That premise is true (it’s true for all prediction), but I’ve seen in countless practical cases that students mostly submit standard copy-pasted ChatGPT text, which is easy for a trained eye to detect.

When I detect such text, I approach the student privately and ask whether they used AI. Many of them admit it! I then give them a chance to redo the work. Some don’t admit it; if they give a suspicious explanation of how their text was created, or if I still think the text is far too close to typical AI-generated output, I propose a meeting where we’ll discuss the subject matter in real time. A couple of times, the story has changed at this point and the student has suddenly admitted to using AI, though they never claim they violated the course rules on purpose (even though I clearly state every time that AI-generated text is not allowed, at all). At that point, I let them save face, give them a proverbial lecture on why they shouldn’t let AI do their thinking, and ask them to redo the work.

Overall, when it comes to writing, I’m actually very anti-AI at the moment. I’m very pro-AI when it comes to data analysis and learning. In data analysis, though, one first needs to know the method oneself, because AI makes mistakes, and if you don’t understand the method, you won’t spot them. I’ve run machine learning, statistical testing, and qualitative analysis using AI and have seen mistakes in each, yet I’ve seen very good results, too.

AI is very dangerous for bachelor’s-level students because they are still in the process of developing basic skills. So unless they use AI in a way that supports their skill development, they are using it wrongly, in my opinion. Advanced users like me can use it more extensively, because we have the basic skills that help us identify and fix issues.

Currently, it seems to me that any use of GenAI should come only after we can be sure that students already possess the basic skills they want to use GenAI for. For example, if a student already knows how to write good English, using it for writing is more acceptable (though not in a way where the output is copied directly). If they already know the basics of statistical testing, they can use GenAI for statistical testing. And so on. Otherwise, they just use it to compensate for their lack of skills, and they’ll never learn those skills.

Another use case is learning skills like writing or data analysis with GenAI. But that requires a specific approach of questioning, reflecting, doing the work yourself, and so on. This process is more laborious than simply copying GenAI’s outputs, which is why most students wouldn’t do it. They also don’t necessarily know how to probe or question GenAI in a way that supports their learning.

To this end, special tools or systems for pedagogical purposes can be developed. Cipherbot (https://cipherbot.qcri.org) is one example of such a system. It allows a teacher to create a course and upload specific learning materials. Students can then ask Cipherbot questions about the materials, and Cipherbot recommends additional questions that support their learning. This way, learning is personalized and students can proceed at their own pace.