If you’re a student using ChatGPT or other AI for your research paper, please read this carefully – it could save your bacon.
If you’re a teacher struggling to convince your students of the perils of AI-driven research, this post might help.
A few weeks ago I was writing about cultural differences in morality. I spent hours trying to find a simple study that investigated cultural differences in answers to the trolley problem. In my desperation I turned to ChatGPT. Knowing that it makes stuff up when it doesn’t know, I was very clear in my instructions:
- “Can you give me a real study that actually happened that shows differences in moral decision-making between eastern adn (sic) western cultures?”
The image below is what it gave me. It’s so convincing. The problem is…this study doesn’t exist.
ChatGPT has the same philosophy as my older brother – “if you don’t know something, just say it with enough authority and people will believe you.” I searched and searched for this study, but it doesn’t exist. The closest I could find was this one….
If I hadn’t bothered to fact-check ChatGPT, I might have written about something that never happened. This solidified for me the limitations of AI for research. If I have to double-check everything in Google Scholar anyway, why not just start there? Most of my students learn that Google Scholar is faster and better in the long run. It does take time, but when they find the study they’ve been hunting for, they get to experience the most rewarding thing in research – the joy of academic discovery!
I’m pleased to say Google Scholar finally trumped ChatGPT and I was able to find a simple yet fascinating study that compared British and Chinese participants’ responses to the trolley problem. It showed Chinese participants preferred to do nothing, maybe because of the importance of fate in Chinese culture (Link). A simple example of a sociocultural factor in moral development – being raised with different cultural values will influence moral decision-making.
Can students use AI for research?
Before my students start a research project I give clear instructions on appropriate and inappropriate uses of AI. My guidance is to use it like Google – it can tell you where to find information, but it should not be a source of information. I also remind them that as trainee psychologists, they’re learning how to do the research that discovers the knowledge that goes into ChatGPT for other people; they should not be the ones taking information from ChatGPT. If they, as the next generation of psychologists, don’t learn psychology, AI can’t possibly learn anything new. In this way, I try to get them to understand the deeper reason why they can’t use AI for information in my class.
Can students use AI for writing?
The IB has harsh penalties for academic dishonesty, including plagiarism – the act of presenting the words, ideas, or images of another as your own. Most students don’t realise that using AI to do their writing for them is plagiarism. If I were a student and had copied ChatGPT’s fabricated study above into my IA or EE, and my teacher or examiner fact-checked the source and found it didn’t exist, it could jeopardise my entire IB diploma. Similarly, writing something, copying and pasting it into ChatGPT and asking it to rewrite it is plagiarism – it’s presenting another’s words as if they’re your own. Using it for simple spelling and grammar checks is fine; rewriting whole sections is not. It’s no different to getting a friend or a tutor to rewrite your essay for you.
At least, this is my interpretation of academic dishonesty rules in the modern world of AI. I’m always open to being proven wrong.
Travis Dixon is an IB Psychology teacher, author, workshop leader, examiner and IA moderator.