Revealed: Thousands of UK university students caught cheating using AI
-
Maybe we need a new way to approach school. I don't agree with turning education into a competition where the difficulty is curved towards the most competitive students, creating a system so hard that students feel they have to edge each other out any way they can.
I guess what I don’t understand is what changed? Is everything homework now? When I was in school, even college, a significant percentage of learning was in class work, pop quizzes, and weekly closed book tests. How are these kids using LLMs so much for class if a large portion of the work is still in the classroom? Or is that just not the case anymore? It’s not like ChatGPT can handwrite an essay in pencil or give an in person presentation (yet).
-
Have you seen the size of these classrooms? It's not uncommon for lecture halls to seat 200+ students. You're thinking that each student is going to present? Are they all going to create a presentation for each piece of info they learn? 200 presentations a day every day? Or are they each going to present one thing? What does a student do during the other 199 presentations? When does the teacher (the expert in the subject) provide any value in this learning experience?
There's too much to learn to have people only learning by presenting.
Have you seen the cost of tuition? Hire more professors and smaller classes.
Anyways, undergrad isn’t even that important in the grand scheme of things. Let people cheat and let that show when they apply for entry level jobs or higher education. If they can be successful after cheating in undergrad, then does it even matter?
When you get to grad school and beyond is what really matters. Speaking from a US perspective.
-
Actually caught, or caught with "AI detection" software?
Actually caught. That's why it's the tip of the iceberg: all the cases that were not caught.
-
Have you seen the cost of tuition? Hire more professors and smaller classes.
Anyways, undergrad isn’t even that important in the grand scheme of things. Let people cheat and let that show when they apply for entry level jobs or higher education. If they can be successful after cheating in undergrad, then does it even matter?
When you get to grad school and beyond is what really matters. Speaking from a US perspective.
"Let them cheat"
I mean, yeah, that's one way to go. You could say "the students who cheat are only cheating themselves" as well. And you'd be half right about that.
There are two reasons I most often see behind articles from professors waving the warning flags. First, these students aren't just cheating themselves. There are only so many spots available for post-grad work or for jobs that require a degree. People who are actually putting the time into learning the material are being drowned in a sea of people who have gotten just as far without doing so.
The second reason, I think, is more important. Many of these professors have dedicated their lives to teaching their subject to the next generation. They want to help others learn, and that is being compromised by a massively disruptive technology. The article linked here provides evidence of that, and it deserves more than a casual "teach better! the tech isn't going away".
-
Guardian investigation finds almost 7,000 proven cases of cheating – and experts say these are the tip of the iceberg
Thousands of university students in the UK have been caught misusing ChatGPT and other artificial intelligence tools in recent years, while traditional forms of plagiarism show a marked decline, a Guardian investigation can reveal.
A survey of academic integrity violations found almost 7,000 proven cases of cheating using AI tools in 2023-24, equivalent to 5.1 for every 1,000 students. That was up from 1.6 cases per 1,000 in 2022-23.
Figures up to May suggest that number will increase again this year to about 7.5 proven cases per 1,000 students – but recorded cases represent only the tip of the iceberg, according to experts.
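(For scale, here is a quick back-of-envelope sketch of what those per-1,000 rates imply in absolute numbers. The rates and the ~7,000 figure come from the article; the assumption that the same surveyed student population applies to each year is mine, so treat the outputs as rough estimates.)
```python
# Rough sketch: turn the article's "cases per 1,000 students" rates into
# approximate absolute case counts, assuming a constant surveyed population.

cases_2023_24 = 7_000   # "almost 7,000 proven cases" in 2023-24 (from the article)
rate_2023_24 = 5.1      # proven cases per 1,000 students, 2023-24
rate_2022_23 = 1.6      # proven cases per 1,000 students, 2022-23
rate_2024_25 = 7.5      # projected proven cases per 1,000 students

# Implied student population covered by the survey
students = cases_2023_24 / rate_2023_24 * 1_000
print(f"implied surveyed population: ~{students:,.0f}")                      # ~1,372,549
print(f"implied 2022-23 cases:       ~{rate_2022_23 * students / 1_000:,.0f}")  # ~2,196
print(f"projected 2024-25 cases:     ~{rate_2024_25 * students / 1_000:,.0f}")  # ~10,294
```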
The data highlights a rapidly evolving challenge for universities: trying to adapt assessment methods to the advent of technologies such as ChatGPT and other AI-powered writing tools.
we're doomed
-
Surprise motherfuckers. Maybe don't give grant money to LLM snakeoil fuckers, and maybe don't allow mass for-profit copyright violations.
So is it snake oil, or dangerously effective (to the point it enables evil)?
-
So is it snake oil, or dangerously effective (to the point it enables evil)?
It is snake oil in the sense that it is being sold as "AI", which it isn't. It is dangerous because LLMs can be used for targeted manipulation of millions if not billions of people.
-
It is snake oil in the sense that it is being sold as "AI", which it isn't. It is dangerous because LLMs can be used for targeted manipulation of millions if not billions of people.
Yeah, I do worry about that. We haven't seen much in the way of propaganda bots or even LLM scams, but the potential is there.
Hopefully, people will learn to be skeptical the way they did with photoshopped photos, and not the way they didn't with where their data is going.
-
Actually caught. That's why it's tip of the iceberg, all the cases that were not caught.
The article does not state that. It does, however, mention that AI detection tools were used, and that they failed to detect AI writing 90-something % of the time. It seems extremely likely they used AI detection software.
-
The article does not state that. It does, however, mention that AI detection tools were used, and that they failed to detect AI writing 90-something % of the time. It seems extremely likely they used AI detection software.
I'm saying this as someone who has worked for multiple institutions, has raised hundreds of conduct cases, and has more on the horizon.
The article says proven cases, which means the academic conduct case was not just raised but upheld. AI detection may have been used (there is a distinct lack of consensus between institutions on that) but it would not be the only piece of evidence. Much like the use of Turnitin for plagiarism detection, it is an indication for further investigation, but a case would not be raised based solely on a high Turnitin score.
There are variations in process between institutions, and they are changing their processes year on year in direct response to AI cheating. But a case being upheld would mean there was direct evidence (a prompt left in the text), the student admitted it ("I didn't know I wasn't allowed to", "yes, but I only...", etc.), and/or there was a viva and, based on discussion with the student, it was clear that they did not know the material.
It is worth mentioning that in a viva it is normally abundantly clear whether a given student did or didn't write the material. When it is not clear, then (based on the institutions I have experience with) universities are very cautious and will give the student the benefit of the doubt (hence the tip of the iceberg).
-
I guess what I don’t understand is what changed? Is everything homework now? When I was in school, even college, a significant percentage of learning was in class work, pop quizzes, and weekly closed book tests. How are these kids using LLMs so much for class if a large portion of the work is still in the classroom? Or is that just not the case anymore? It’s not like ChatGPT can handwrite an essay in pencil or give an in person presentation (yet).
University was always guided self-learning, at least in the UK. The lecturers are not teachers. They provide and explain material, but they're not there to hand-hold you through it.
University education is very different to what goes on at younger ages. It has to be when a class is 300 rather than 30 people.
-
University was always guided self-learning, at least in the UK. The lecturers are not teachers. They provide and explain material, but they're not there to hand-hold you through it.
University education is very different to what goes on at younger ages. It has to be when a class is 300 rather than 30 people.
WTF? 300? There were barely 350 people in my graduating class of high school and that isn’t a small class for where I am from. The largest class size at my college was maybe 60. No wonder people use LLMs. Like, that’s just called an auditorium at that point, how could you even ask a question? Self-guided isn’t supposed to mean “solo”.
-
WTF? 300? There were barely 350 people in my graduating class of high school and that isn’t a small class for where I am from. The largest class size at my college was maybe 60. No wonder people use LLMs. Like, that’s just called an auditorium at that point, how could you even ask a question? Self-guided isn’t supposed to mean “solo”.
You can ask questions in auditorium classes.
The 300+ student courses typically were high volume courses like intro or freshman courses.
Second year cuts down significantly in class size, but also depends on the subject.
3rd and 4th year courses, in my experience, were 30-50 students
-
You can ask questions in auditorium classes.
The 300+ student courses typically were high volume courses like intro or freshman courses.
Second year cuts down significantly in class size, but also depends on the subject.
3rd and 4th year courses, in my experience, were 30-50 students
You can ask questions in auditorium classes.
I am going to be honest; I don’t believe you. I genuinely don’t believe that in a class with more people than minutes in the session a person could legitimately have time to interact with the professor.
The 60-person class I referred to was a required freshman science class with a lecture portion and a smaller lab portion. Being able to ask questions in the lab was the only reason 60 people was okay in the lecture, and even then the professor said he felt it was too many.
-
You can ask questions in auditorium classes.
I am going to be honest; I don’t believe you. I genuinely don’t believe that in a class with more people than minutes in the session a person could legitimately have time to interact with the professor.
The 60-person class I referred to was a required freshman science class with a lecture portion and a smaller lab portion. Being able to ask questions in the lab was the only reason 60 people was okay in the lecture, and even then the professor said he felt it was too many.
That’s fine if you don’t, but you can ask questions.
They even have these clickers that allow the professor to ask “snap questions” with multiple-choice answers so they can check understanding.
-
That’s fine if you don’t, but you can ask questions.
They even have these clickers that allow the professor to ask “snap questions” with multiple-choice answers so they can check understanding.
I can’t believe people go into debt for that experience. I would be livid.
-
Should be expelled and banned for life.
Everyone who's not religious cheats though.
-
Everyone who's not religious cheats though.
I'm the most atheistic person you're gonna meet, and no, I don't cheat.
-
I'm the most atheistic person you're gonna meet, and no, I don't cheat.
Maybe not with AI but in a different way. Also exceptions obviously exist.
Anyways enough of boosting your stats.
-
Maybe not with AI but in a different way. Also exceptions obviously exist.
Anyways enough of boosting your stats.
Also not without ML. Especially in university, you don't learn out of obligation anymore; you learn to actually acquire new skills. As soon as you let someone else or an LLM do that for you, you're very clearly showing that you don't actually have any interest in learning, only in succeeding. Which is very fucking worthless, and dangerous.