agnos.is Forums


Revealed: Thousands of UK university students caught cheating using AI

World News · world
65 Posts, 34 Posters
  • R [email protected]

    ChatGPT output isn't crap anymore. I teach introductory physics at a university and require fully written-out homework, showing the math steps, for problems that I've written myself. I wrote my own homework years ago when Chegg blew up and all the major textbook problems ended up on Chegg.

    Just two years ago, ChatGPT wasn't so great at intro physics and math. It's pretty good now, and shows all the necessary steps to get the correct answer.

    I do not grade my homework on correctness. Students only need to show me an honest attempt at each problem for full credit. But it's way quicker for students to simply upload my homework PDF to ChatGPT and copy down the output than to give it their own attempt.

    Of course, doing this results in poor exam performance. Anecdotally, my exams from the recent fall semester were the lowest they've ever been. I put two problems on my final that came directly from my homework, one of them being the problem that made me realize roughly 75% of my class was ChatGPT'ing all the homework (ChatGPT isn't super great at reading angles from figures), and it's like these students had never even seen a problem like it before.

    I'm not completely against the use of AI for my homework. It could be like a tutor that students ask questions when stuck. But unfortunately that takes more effort than simply typing "solve problems 1 through 5, showing all steps, from this document" into ChatGPT.

    [email protected]
    #10

    Personally, I think we have homework the wrong way around. Instead of teaching the subject in class and then assigning practice for home, we should learn the subject at home and do the practice in class.

    I always found it easier to read up on something and get an idea of a concept by myself. But when trying to solve the problems I would run into questions, and no one was there to ask. If the problems were solved in class, I could ask fellow students or the teacher.

    Plus, if the kids want to learn the concept from ChatGPT or Wikipedia, that's fine by me, as long as they learn it somehow.

    Of course this does not apply to all concepts and subjects, but as a general rule I think it works.

    • taiatari@lemmynsfw.comT [email protected]

      [email protected]
      #11

      This is mostly the purpose of my homework. I assign homework daily. I don't expect students to get the correct answers, but to attempt them and then come to class with questions. My lectures are typically short so that I can dedicate class time to solving problems and homework assignments.

      I always open my class with "does anyone have any questions on the homework?" Before ChatGPT, students would ask me to go through all the homework, since much of it is difficult. Last semester, though, with so many students using ChatGPT, they rarely asked me about the homework... I would often follow up with "Really? No questions at all?"

      • flamekebab@piefed.socialF [email protected]

        Seems like an awful lot of debt to go into for something that's really not that valuable. If the certificate is the goal, then a master's or PhD will end up being what's needed, and faking your way through undergrad won't do much good.

        [email protected]
        #12

        This is a story about the UK, not the US, though I imagine the situation is similar.

        • E [email protected]

          [email protected]
          #13

          I don't understand what point you're trying to make. I know it's about the UK..?

          • flamekebab@piefed.socialF [email protected]

            [email protected]
            #14

            Who's going into debt to be at university in the UK?

            • microwave@lemmy.worldM [email protected]

              Guardian investigation finds almost 7,000 proven cases of cheating – and experts say these are the tip of the iceberg

              Thousands of university students in the UK have been caught misusing ChatGPT and other artificial intelligence tools in recent years, while traditional forms of plagiarism show a marked decline, a Guardian investigation can reveal.

              A survey of academic integrity violations found almost 7,000 proven cases of cheating using AI tools in 2023-24, equivalent to 5.1 for every 1,000 students. That was up from 1.6 cases per 1,000 in 2022-23.

              Figures up to May suggest that number will increase again this year to about 7.5 proven cases per 1,000 students – but recorded cases represent only the tip of the iceberg, according to experts.

              The data highlights a rapidly evolving challenge for universities: trying to adapt assessment methods to the advent of technologies such as ChatGPT and other AI-powered writing tools.

              [email protected]
              #15

              If ChatGPT can effectively do the work for you, then is it really necessary to do the work? Nobody is saying to go to the library and find a book instead of letting a search engine do the work for you. Education has to evolve, and so does the testing. There are a lot of things GPTs can't do well. Grade on those.

              • venusaur@lemmy.worldV [email protected]

                [email protected]
                #16

                The "work" that LLMs are doing here is "being educated".

                Like, when a prof says "read this book and write a paper answering these questions", they aren't doing that because the world needs another paper written. They are inviting the student to go on a journey, one designed to change the person who travels that path.

                • microwave@lemmy.worldM [email protected]

                  [email protected]
                  #17

                  Surprise, motherfuckers. Maybe don't give grant money to LLM snake-oil fuckers, and maybe don't allow mass for-profit copyright violations.

                  • M [email protected]

                    [email protected]
                    #18

                    Education needs to change too. Have students do something hands on.

                    • microwave@lemmy.worldM [email protected]

                      [email protected]
                      #19

                      Actually caught, or caught with "AI detection" software?

                      • venusaur@lemmy.worldV [email protected]

                        [email protected]
                        #20

                        Hands on, like engage with prior material on the subject and formulate complex ideas based on that...?

                        Sarcasm aside, asking students to do something in lab often requires them to have gained an understanding of the material so they can do something, an understanding they utterly lack if they use AI to do their work. Although tbf this lack of understanding in-person is really the #1 way we catch students who are using AI.

                        • W [email protected]

                          [email protected]
                          #21

                          Class discussion. Live presentations with question and answer. Save papers for supplementing hands on research.

                          • venusaur@lemmy.worldV [email protected]

                            [email protected]
                            #22

                            Have you seen the size of these classrooms? It's not uncommon for lecture halls to seat 200+ students. You're thinking that each student is going to present? Are they all going to create a presentation for each piece of info they learn? 200 presentations a day every day? Or are they each going to present one thing? What does a student do during the other 199 presentations? When does the teacher (the expert in the subject) provide any value in this learning experience?

                            There's too much to learn to have people only learning by presenting.

                            • M [email protected]

                              Maybe we need a new way to approach school. I don't think I agree with turning education into a competition where the difficulty is curved toward the most competitive, creating a system so difficult that students need to edge each other out any way they can.

                              [email protected]
                              #23

                              I guess what I don’t understand is what changed? Is everything homework now? When I was in school, even college, a significant percentage of learning was in class work, pop quizzes, and weekly closed book tests. How are these kids using LLMs so much for class if a large portion of the work is still in the classroom? Or is that just not the case anymore? It’s not like ChatGPT can handwrite an essay in pencil or give an in person presentation (yet).

                              • M [email protected]

                                [email protected]
                                #24

                                Have you seen the cost of tuition? Hire more professors and run smaller classes.

                                Anyways, undergrad isn't even that important in the grand scheme of things. Let people cheat, and let that show when they apply for entry-level jobs or higher education. If they can be successful after cheating in undergrad, then does it even matter?

                                Grad school and beyond is what really matters. Speaking from a US perspective.

                                • M [email protected]

                                  [email protected]
                                  #25

                                  Actually caught. That's why it's the tip of the iceberg: all the cases that were not caught.

                                  • venusaur@lemmy.worldV [email protected]

                                    [email protected]
                                    #26

                                    "Let them cheat"

                                    I mean, yeah, that's one way to go. You could say "the students who cheat are only cheating themselves" as well. And you'd be half right about that.

                                    I most often see two reasons why professors write articles waving the warning flags. First, these students aren't just cheating themselves. There are only so many spots available for post-grad work or for jobs that require a degree. Folks who are actually putting in the time to learn the material are being drowned in a sea of folks who have gotten just as far without doing so.

                                    The second reason, I think, is more important. Many of these professors have dedicated their lives to teaching their subject to the next generation. They want to help others learn. That is being compromised by a massively disruptive technology. The article linked here provides evidence of that, and therefore deserves more than just a casual "teach better! the tech isn't going away".

                                    • microwave@lemmy.worldM [email protected]

                                      [email protected]
                                      #27

                                      we're doomed

                                      • R [email protected]

                                        [email protected]
                                        #28

                                        So is it snake oil, or dangerously effective (to the point it enables evil)?

                                        • C [email protected]

                                          [email protected]
                                          #29

                                          It is snake oil in the sense that it is being sold as "AI", which it isn't. It is dangerous because LLMs can be used for targeted manipulation of millions, if not billions, of people.
