(262f) Implementation of an MEB Novice Chatbot to Develop Critical Thinking Skills in Chemical Engineering
In our Material and Energy Balances (MEB) course–the first technical chemical engineering course our sophomores and transferring juniors take–we have begun to integrate a purpose-trained AI chatbot into the course to target one of the course's learning objectives: for students to be able to critique solutions and determine the qualities of stronger proposals. We have previously targeted this objective through case-based activities, group discussions, and peer review, and we saw an opportunity for AI to further support these activities. In Fall 2024, we developed our MEB Novice chatbot–a customized retrieval-augmented generation (RAG) chatbot built on ChatGPT–by providing the AI with all of the course materials and initializing the chatbot to give confident but generally errant answers reflecting eight misconceptions that the instructors have found students typically hold in MEB. This is the opposite of how many other instructors have employed chatbots in their teaching (e.g., as automated tutors or experts), and, as far as the authors were aware at the time, a novel application of an intentionally errant AI chatbot in engineering education.
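To illustrate the general approach, the sketch below shows one hypothetical way to build a misconception-seeded RAG assistant with the OpenAI chat API and a naive keyword retriever over course-material snippets. This is not the authors' implementation (which was built on ChatGPT directly); the misconception text, snippet list, and function names are all illustrative assumptions.

```python
# Hypothetical sketch: a "confidently wrong" RAG chatbot.
# MISCONCEPTIONS, COURSE_SNIPPETS, and answer_as_novice are illustrative names,
# not the authors' actual implementation.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Example of a seeded misconception (one of the eight described in the abstract
# would be phrased by the instructors; this one is a placeholder).
MISCONCEPTIONS = [
    "Mass of each chemical species is conserved even when reactions occur.",
]

# Course materials chunked into short snippets for retrieval.
COURSE_SNIPPETS = [
    "General balance: accumulation = in - out + generation - consumption.",
    "For a nonreactive steady-state process, total mass flow in equals total mass flow out.",
]

def retrieve(question: str, k: int = 2) -> list[str]:
    """Naive keyword-overlap retrieval; a production system would use embeddings."""
    words = set(question.lower().split())
    scored = sorted(
        COURSE_SNIPPETS,
        key=lambda s: len(words & set(s.lower().split())),
        reverse=True,
    )
    return scored[:k]

def answer_as_novice(question: str) -> str:
    """Answer confidently, letting the seeded misconceptions shape the reasoning."""
    context = "\n".join(retrieve(question))
    system = (
        "You are a confident but novice material and energy balances student. "
        "Answer using the course context below, and let these misconceptions "
        "shape your reasoning without ever flagging them as errors:\n"
        + "\n".join(f"- {m}" for m in MISCONCEPTIONS)
        + f"\n\nCourse context:\n{context}"
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": system},
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content
```

In practice, the "adequate level of correctness" described below would be tuned by comparing the chatbot's answers on a concept inventory against student pre- and post-course responses, rather than by adjusting the prompt alone.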
Within the class, students interact with the MEB Novice chatbot as part of their homework, where they critique the solutions the chatbot proposes: writing up whether they agree with each solution and explaining their reasoning. Students can also interact with the MEB Novice chatbot free of charge throughout the course, outside of these assigned activities. We see these interactions as rehearsals for conversations our students will have with future collaborators in engineering.
In our presentation, we will share background on the development of the MEB Novice chatbot and its testing using a mini-concept inventory to tune the chatbot to a target level of correctness. We use student pre-course responses to validate our concept inventory–to confirm that students held these misconceptions at the beginning of the course–and we compare chatbot responses with student pre- and post-course responses to gauge whether the chatbot met our target level of accuracy. We will also present how students interacted with the chatbot, in both assigned and unassigned interactions (students asked the chatbot over 1,000 questions in Fall 2024), along with survey data on student experiences with the tool. Finally, we will share initial survey data on the critical thinking skills the integration of the MEB Novice chatbot is intended to develop, and our lessons learned from using AI in our classroom. By sharing our experiences, we hope to encourage our colleagues to try integrating AI-powered techniques within their own classrooms, and to provide an example of how they might use these tools and evaluate their students' use of them.