The Problem With How Most Students Use AI
Most students use ChatGPT the wrong way. They paste in an essay question, copy the output, and submit it. That's not studying — that's outsourcing your education to a language model. And beyond the academic integrity issues, it doesn't work. You won't remember it, you won't understand it, and it will catch up with you in the exam.
I use ChatGPT differently. I use it as a tutor — a patient, always-available tutor who never makes me feel stupid for asking the same question five times.
The Socratic Method Prompt
The most powerful thing I've discovered is asking ChatGPT to teach me using the Socratic method. Instead of asking "explain the citric acid cycle," I ask: "I'm trying to understand the citric acid cycle. Don't explain it to me directly — ask me questions to help me figure it out myself. Start with what I should already know and build from there."
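If you'd rather script this against a chat API than retype the prompt each time, the same idea can be packaged as a reusable message builder. This is only an illustrative sketch: the function name, the `level` parameter, and the exact wording are my own additions, not part of the original prompt.

```python
def socratic_messages(topic, level="first-year university"):
    """Build a chat message list that asks the model to tutor by
    questioning rather than explaining (hypothetical helper)."""
    system = (
        "You are a patient tutor using the Socratic method. "
        "Never explain the concept directly. Ask one question at a time, "
        f"starting from what a {level} student should already know, "
        "and build step by step toward the target concept."
    )
    user = f"I'm trying to understand {topic}. Begin with your first question."
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]

# Example: prepare the message list for the citric acid cycle session
msgs = socratic_messages("the citric acid cycle")
```

The system/user message split follows the common chat-API convention of putting standing instructions in the system role, so the tutoring behaviour survives across the whole conversation.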
This forces active recall. You're not passively reading an explanation — you're constructing the knowledge yourself, with the AI guiding you toward the right answers. The research on active recall is unambiguous: it produces far better long-term retention than passive reading.
The Feynman Technique, Automated
The Feynman Technique is a learning method where you explain a concept in simple language as if teaching it to someone with no background. The idea is that if you can't explain it simply, you don't understand it.
I use ChatGPT to implement this at scale. My workflow: I write a plain-English explanation of a concept I've just studied, then ask ChatGPT to identify gaps, misconceptions, or oversimplifications in my explanation. It's remarkably good at this. It will tell me exactly where my understanding breaks down.
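The gap-finding step of that workflow can also be templated. Again, a hypothetical sketch under my own wording, not the author's exact prompt:

```python
def feynman_review_prompt(concept, my_explanation):
    """Wrap a plain-English explanation in a request for critique
    (hypothetical helper; the phrasing is illustrative)."""
    return (
        f"I am studying {concept}. Here is my plain-English explanation, "
        "written as if teaching someone with no background:\n\n"
        f"{my_explanation}\n\n"
        "List any gaps, misconceptions, or oversimplifications in my "
        "explanation, and note which parts are accurate. Do not rewrite "
        "it for me; point at what breaks down so I can fix it myself."
    )
```

Asking the model explicitly not to rewrite the explanation matters: the learning value is in repairing your own wording, not in reading a polished replacement.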
Generating Practice Questions
Another use case: generating exam-style questions. I give ChatGPT my lecture notes or a topic area and ask it to generate ten multiple-choice questions (MCQs) at first-year university level, including explanations for why each wrong answer is wrong. This is genuinely useful: the explanations for the wrong answers often teach you more than the correct answers do.
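The question-generation request can be sketched the same way. The defaults below (ten questions, four options, first-year level) mirror the workflow described above, but the helper itself is hypothetical:

```python
def mcq_prompt(notes, n=10, level="first-year university"):
    """Build a prompt requesting exam-style multiple-choice questions
    with distractor explanations (hypothetical helper)."""
    return (
        f"Using the notes below, write {n} multiple-choice questions at "
        f"{level} level. For each question give four options, mark the "
        "correct answer, and explain why each wrong answer is wrong.\n\n"
        f"Notes:\n{notes}"
    )

# Example: turn a lecture topic into a practice quiz request
prompt = mcq_prompt("Glycolysis: steps, enzymes, net ATP yield")
```

Requiring an explanation for every wrong option is the key line; without it, most models return bare answer keys.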
The Ethical Line
I want to be clear about where I draw the line. I use AI to understand material, generate practice questions, and test my own knowledge. I don't use it to write assignments, generate essays, or produce any work that I submit as my own. That's not just an ethical position — it's a practical one. The skills I'm building now will matter when I'm in a lab or a clinical setting, and no AI will be doing my thinking for me then.
The Bottom Line
AI is a tool. Like any tool, its value depends entirely on how you use it. Used well, ChatGPT can make you a significantly better student. Used badly, it can give you the illusion of learning while you fall further behind. The difference is whether you're using it to think harder or to avoid thinking altogether.
Written by
Stephen Kelechi Imo
Biomedical Science Student · Coventry University
First-year Biomedical Science student at Coventry University, writing about AI tools, student life, and the science of staying productive. Originally from Nigeria, now navigating UK university life — one lab session at a time.