3

I have noticed that students are submitting essays written with ChatGPT. On some occasions they have used it to correct the writing of the text; on other occasions they have used it to answer the essay question directly. My idea is not to prohibit it, but to teach them to work with it as a tool.

Do you have any resources in this regard? For example, I recently found a teacher on Twitter who included in his instructions how students should use ChatGPT. Some of the caveats required students to cite whether they had used ChatGPT in their essays and what kind of instructions (prompts) they had used, warning them that using it without citing it can be considered plagiarism. Unfortunately, I didn't save the tweet.

Tito Sanz
  • It depends on what level of use you are allowing the students. For example, would it be allowed for another student to provide the same service as ChatGPT (correcting or providing paragraphs) if their name is listed in the acknowledgements? – dubious Apr 12 '23 at 12:15
  • @dubious If that student is helping to proofread the text, there would be no problem. In this sense, the ideas and how they relate to the content of the subject matter are more important than the writing of the text itself. – Tito Sanz Apr 12 '23 at 13:26
  • At my university the program administrators are currently working on a combined guideline and policy document. I do not think I am allowed to share it yet, but I suspect similar efforts will be ongoing in many places of higher education. – xLeitix Apr 12 '23 at 15:10
  • Some classes are about learning a subject matter, such as art history. In these classes, it's most important that students learn the material and can think about it critically, so I can see using ChatGPT as a tool. But in other classes, especially writing classes, the subject is really about practicing a skill, one that will truly benefit students if they can exercise it on command once they're out of school. In these situations, using ChatGPT means not getting the practice, which means not getting the main benefit of taking the course. – Kevin Apr 12 '23 at 17:01
  • Warning: ChatGPT makes things up. One journalist found out that, according to ChatGPT, he was dead, complete with a link to his obituary (the link didn't work). Someone else found that he was supposedly guilty of sexual harassment, at a university he had never visited. Someone asked for a list of ten books related to some subject and got three real books, three books with the wrong author, and four books with made-up but reasonable-sounding titles and authors. – gnasher729 Apr 12 '23 at 17:40
  • @gnasher729, ... so maybe Tucker Carlson was an early, failed version of AI stuff? :) – paul garrett Apr 12 '23 at 18:23
  • @Kevin that's the point. Honestly, in my classes, the concept and learning how to link different ideas are more important than the writing skill itself. – Tito Sanz Apr 15 '23 at 15:40

2 Answers

4

There's a discussion around LLMs/ChatGPT for programming courses. There are valuable nuggets there that you can adapt for essays.

You are on the right track, or rather, I'm in agreement with your approach of teaching them to work with it as a tool.
In drafting a policy for conversational AI/LLMs, I see engaging with them more as learning scaffolding.
Students must disclose their use and how the tools were used. Students must take ownership and must show their creativity.

What we can and should do is get students onto the path of purposeful engagement with these assistive tools, leveraging them for critical appraisal and thinking.

Incidentally, there are tools that claim to detect text/essays written by conversational AI. OpenAI, the 'creator' of ChatGPT, has one: AI Text Classifier

semmyk-research
  • As for the last sentence, note the statement from the linked website: "As of July 20, 2023, the AI classifier is no longer available due to its low rate of accuracy. We are working to incorporate feedback and are currently researching more effective provenance techniques for text, and have made a commitment to develop and deploy mechanisms that enable users to understand if audio or visual content is AI-generated." – lighthouse keeper Aug 28 '23 at 15:11
-4

I'm a lowly PhD student who hasn't yet formally taught a class, but here's my initial reaction: ChatGPT can be used for coding/data-cleaning purposes (when necessary), and it can also be fine for spell checking. However, where I would draw the line is using it to write the paper itself. The paper should be the student's work, not ChatGPT's.

I don't know how to check whether ChatGPT has been used, but that's my view on it. As someone who writes code for statistics purposes, I can see that GPT could be useful for a variety of problems (not that I've used it). But it should not (nor should we expect it to) replace human analysts, and that philosophy matters in the classroom too. In other words, it can be used as an excellent supplement to one's work, never as a substitute. So I think your idea is fine, so long as you can check whether it was used and how. The main thing to be concerned about, I think, is how to know whether someone actually did use it but did not disclose it. I'm sure people who have worked with it would have a better idea than I do.

Jared Greathouse
  • Oh dear, do not use ChatGPT for data cleaning or anything where you are relying on the output. This is terrifying. – Bryan Krause Apr 12 '23 at 14:43
  • I should have worded it better: I wasn't ADVISING anyone to do that. I was just saying that it shouldn't be banned. If I were grading an exam and someone said they used GPT for cleaning, I'd say, "Okay, I guess, go ahead, but if it messed something up (as is likely), that's on YOU, not the robot." – Jared Greathouse Apr 12 '23 at 15:13
  • If not banned, it should at least be disclosed so that everyone else can dismiss the work appropriately. I saw someone touting ChatGPT's efficacy in data cleaning as "85% accurate!" They thought this was a good thing. 85%. This is a toy, not a tool. – Bryan Krause Apr 12 '23 at 15:20
    @BryanKrause, maybe not "dismiss", but "distrust"... – paul garrett Apr 12 '23 at 18:53
  • @paulgarrett Yes, that's a better word. – Bryan Krause Apr 12 '23 at 19:02
  • When asked why plagiarism is bad, among the reasons to be honest about authorship and not to plagiarise, I always include this one: to what degree can I trust this text? – Captain Emacs Apr 13 '23 at 11:31