
agnos.is Forums


Revealed: Thousands of UK university students caught cheating using AI

World News · world
65 Posts, 34 Posters
  • flamekebab@piefed.socialF [email protected]

    I don't understand what point you're trying to make. I know it's about the UK..?

[email protected]
#14

    Who's going into debt to be at university in the UK?

    • microwave@lemmy.worldM [email protected]

Guardian investigation finds almost 7,000 proven cases of cheating – and experts say these are the tip of the iceberg

      Thousands of university students in the UK have been caught misusing ChatGPT and other artificial intelligence tools in recent years, while traditional forms of plagiarism show a marked decline, a Guardian investigation can reveal.

      A survey of academic integrity violations found almost 7,000 proven cases of cheating using AI tools in 2023-24, equivalent to 5.1 for every 1,000 students. That was up from 1.6 cases per 1,000 in 2022-23.

      Figures up to May suggest that number will increase again this year to about 7.5 proven cases per 1,000 students – but recorded cases represent only the tip of the iceberg, according to experts.

      The data highlights a rapidly evolving challenge for universities: trying to adapt assessment methods to the advent of technologies such as ChatGPT and other AI-powered writing tools.

[email protected]
#15

If ChatGPT can effectively do the work for you, then is it really necessary to do the work? Nobody is saying to go to the library and find a book instead of letting a search engine do the work for you. Education has to evolve and so does the testing. There are a lot of things GPTs can't do well. Grade on those.

      • venusaur@lemmy.worldV [email protected]

If ChatGPT can effectively do the work for you, then is it really necessary to do the work? Nobody is saying to go to the library and find a book instead of letting a search engine do the work for you. Education has to evolve and so does the testing. There are a lot of things GPTs can't do well. Grade on those.

[email protected]
#16

        The "work" that LLMs are doing here is "being educated".

Like, when a prof says "read this book and write a paper answering these questions", they aren't doing that because the world needs another paper written. They are inviting the student to go on a journey, one that is designed to change the person who travels that path.

        • microwave@lemmy.worldM [email protected]


[email protected]
#17

Surprise, motherfuckers. Maybe don't give grant money to LLM snake-oil fuckers, and maybe don't allow mass for-profit copyright violations.

          • M [email protected]

            The "work" that LLMs are doing here is "being educated".

Like, when a prof says "read this book and write a paper answering these questions", they aren't doing that because the world needs another paper written. They are inviting the student to go on a journey, one that is designed to change the person who travels that path.

[email protected]
#18

            Education needs to change too. Have students do something hands on.

            • microwave@lemmy.worldM [email protected]


[email protected]
#19

Actually caught, or caught with "AI detection" software?

              • venusaur@lemmy.worldV [email protected]

                Education needs to change too. Have students do something hands on.

[email protected]
#20

                Hands on, like engage with prior material on the subject and formulate complex ideas based on that...?

                Sarcasm aside, asking students to do something in lab often requires them to have gained an understanding of the material so they can do something, an understanding they utterly lack if they use AI to do their work. Although tbf this lack of understanding in-person is really the #1 way we catch students who are using AI.

                • W [email protected]

                  Hands on, like engage with prior material on the subject and formulate complex ideas based on that...?

                  Sarcasm aside, asking students to do something in lab often requires them to have gained an understanding of the material so they can do something, an understanding they utterly lack if they use AI to do their work. Although tbf this lack of understanding in-person is really the #1 way we catch students who are using AI.

[email protected]
#21

                  Class discussion. Live presentations with question and answer. Save papers for supplementing hands on research.

                  • venusaur@lemmy.worldV [email protected]

                    Class discussion. Live presentations with question and answer. Save papers for supplementing hands on research.

[email protected]
#22

                    Have you seen the size of these classrooms? It's not uncommon for lecture halls to seat 200+ students. You're thinking that each student is going to present? Are they all going to create a presentation for each piece of info they learn? 200 presentations a day every day? Or are they each going to present one thing? What does a student do during the other 199 presentations? When does the teacher (the expert in the subject) provide any value in this learning experience?

                    There's too much to learn to have people only learning by presenting.

                    • M [email protected]

                      Maybe we need a new way to approach school. I don't think I agree with turning education into a competition where the difficulty is curved towards the most competitive creating a system that became so difficult that students need to edge each other out any way they can.

[email protected]
#23

                      I guess what I don’t understand is what changed? Is everything homework now? When I was in school, even college, a significant percentage of learning was in class work, pop quizzes, and weekly closed book tests. How are these kids using LLMs so much for class if a large portion of the work is still in the classroom? Or is that just not the case anymore? It’s not like ChatGPT can handwrite an essay in pencil or give an in person presentation (yet).

                      • M [email protected]

                        Have you seen the size of these classrooms? It's not uncommon for lecture halls to seat 200+ students. You're thinking that each student is going to present? Are they all going to create a presentation for each piece of info they learn? 200 presentations a day every day? Or are they each going to present one thing? What does a student do during the other 199 presentations? When does the teacher (the expert in the subject) provide any value in this learning experience?

                        There's too much to learn to have people only learning by presenting.

[email protected]
#24

Have you seen the cost of tuition? Hire more professors and run smaller classes.

                        Anyways, undergrad isn’t even that important in the grand scheme of things. Let people cheat and let that show when they apply for entry level jobs or higher education. If they can be successful after cheating in undergrad, then does it even matter?

                        When you get to grad school and beyond is what really matters. Speaking from a US perspective.

                        • M [email protected]

                          Actually caught, or caught with a "ai detection" software?

[email protected]
#25

Actually caught. That's why it's the tip of the iceberg: all the cases that were not caught.

                          • venusaur@lemmy.worldV [email protected]

Have you seen the cost of tuition? Hire more professors and run smaller classes.

                            Anyways, undergrad isn’t even that important in the grand scheme of things. Let people cheat and let that show when they apply for entry level jobs or higher education. If they can be successful after cheating in undergrad, then does it even matter?

                            When you get to grad school and beyond is what really matters. Speaking from a US perspective.

[email protected]
#26

"Let them cheat"

I mean, yeah, that's one way to go. You could say "the students who cheat are only cheating themselves" as well. And you'd be half right about that.

There are two reasons we most often see articles from professors who are waving the warning flags. First, these students aren't just cheating themselves. There are only so many spots available for postgrad work or jobs that require a degree. Folks who are actually putting the time into learning the material are being drowned in a sea of folks who have gotten just as far without doing so.

And the second reason, I think, is more important. Many of these professors have dedicated their lives to teaching their subject to the next generation. They want to help others learn. That is being compromised by a massively disruptive technology. The article linked here provides evidence of that, and so it deserves more than just a casual "teach better! the tech isn't going away".

                            • microwave@lemmy.worldM [email protected]


[email protected]
#27

                              we're doomed

                              • R [email protected]

Surprise, motherfuckers. Maybe don't give grant money to LLM snake-oil fuckers, and maybe don't allow mass for-profit copyright violations.

[email protected]
#28

                                So is it snake oil, or dangerously effective (to the point it enables evil)?

                                • C [email protected]

                                  So is it snake oil, or dangerously effective (to the point it enables evil)?

[email protected]
#29

It is snake oil in the sense that it is being sold as "AI", which it isn't. It is dangerous because LLMs can be used for targeted manipulation of millions, if not billions, of people.

                                  • R [email protected]

It is snake oil in the sense that it is being sold as "AI", which it isn't. It is dangerous because LLMs can be used for targeted manipulation of millions, if not billions, of people.

[email protected]
#30

Yeah, I do worry about that. We haven't seen much in the way of propaganda bots or even LLM scams, but the potential is there.

Hopefully, people will learn to be skeptical the way they did with photoshopped photos, and not the way they didn't with where their data is going.

                                    • P [email protected]

Actually caught. That's why it's the tip of the iceberg: all the cases that were not caught.

[email protected]
#31

The article does not state that. It does, however, mention that AI detection tools were used, and that they failed to detect AI writing 90-something per cent of the time. It seems extremely likely they used AI detection software.

                                      • S [email protected]

The article does not state that. It does, however, mention that AI detection tools were used, and that they failed to detect AI writing 90-something per cent of the time. It seems extremely likely they used AI detection software.

[email protected]
#32

I'm saying this as someone who has worked for multiple institutions, has raised hundreds of conduct cases, and has more on the horizon.

The article says proven cases, which means the academic conduct case was not just raised but upheld. AI detection may have been used (there is a distinct lack of consensus between institutions on that) but it would not be the only piece of evidence. Much like the use of Turnitin for plagiarism detection, it is an indication for further investigation, but a case would not be raised based solely on a high Turnitin score.

There are variations in process between institutions, and they are changing their processes year on year in direct response to AI cheating. But a case being upheld would mean there was direct evidence (a prompt left in the text), the student admitted it ("I didn't know I wasn't allowed to", "yes, but I only...", etc.) and/or there was a viva, and based on discussion with the student it was clear that they did not know the material.

It is worth mentioning that in a viva it is normally abundantly clear whether a given student did or didn't write the material. When it is not clear, then (based on the institutions I have experience with) universities are very cautious and will give the student the benefit of the doubt (hence, tip of the iceberg).

                                        • A [email protected]

                                          I guess what I don’t understand is what changed? Is everything homework now? When I was in school, even college, a significant percentage of learning was in class work, pop quizzes, and weekly closed book tests. How are these kids using LLMs so much for class if a large portion of the work is still in the classroom? Or is that just not the case anymore? It’s not like ChatGPT can handwrite an essay in pencil or give an in person presentation (yet).

[email protected]
#33

University was always guided self-learning, at least in the UK. The lecturers are not teachers. They provide and explain material, but they're not there to hand-hold you through it.

University education is very different to what goes on at younger ages. It has to be when a class is 300 rather than 30 people.
