agnos.is Forums

AGI achieved 🤖

Lemmy Shitpost
142 Posts · 69 Posters
  • T [email protected]

I'm not involved in LLMs, but apparently the way it works is that the sentence is broken into words and each word is assigned a unique number, and that's how the information is stored. So the LLM never sees the actual word.

[email protected]
    #20

Adding to this, each word and the words around it are assigned a statistical probability. In other words, what are the odds that word 2 follows word 1? Scale that out for every word in a sentence and you can see that LLMs are just huge math equations that put words together based on their statistical probability.

This is key because, I can't emphasize this enough, AI does not think. We (humans) anthropomorphize them, giving them human characteristics when they are little more than number crunchers.
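For anyone who wants to see what "words assigned numbers and statistical percentages" looks like in practice, here is a minimal sketch of a bigram model in Python. The tiny corpus and the two-word window are purely illustrative (real LLMs use learned token embeddings and transformers over long contexts), but the core move of picking the next word by probability is the same.

```python
from collections import Counter, defaultdict
import random

# Tiny toy corpus -- purely illustrative, not real training data.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word (bigram counts).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word_probs(word):
    """'What are the odds word 2 follows word 1?' as plain frequencies."""
    counts = following[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(next_word_probs("the"))  # {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}

# Generate text by repeatedly sampling the statistically likely next word.
word, sentence = "the", ["the"]
for _ in range(5):
    probs = next_word_probs(word)
    if not probs:  # dead end: the word never appears mid-corpus
        break
    word = random.choices(list(probs), weights=list(probs.values()))[0]
    sentence.append(word)
print(" ".join(sentence))
```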

    • A [email protected]

      Agi lost

[email protected]
      #21

      Henceforth, AGI should be called "almost general intelligence"

      • V [email protected]

        Biggest threat to humanity

[email protected]
        #22

        I know there’s no logic, but it’s funny to imagine it’s because it’s pronounced Mrs. Sippy

• [email protected]

          Henceforth, AGI should be called "almost general intelligence"

[email protected]
          #23

          Happy cake day 🍰

• [email protected]
            This post did not contain any content.
[email protected]
            #24

            I get the meme aspect of this. But just to be clear, it was never fair to judge LLMs for specifically this. The LLM doesn't even see the letters in the words, as every word is broken down into tokens, which are numbers. I suppose with a big enough corpus of data it might eventually extrapolate which words have which letter from texts describing these words, but normally it shouldn't be expected.
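As a concrete illustration of "every word is broken down into tokens, which are numbers": a sketch using the tiktoken library (assuming it is installed; the cl100k_base encoding is just a stand-in for whatever tokenizer a given model actually uses). The exact IDs don't matter; the point is that the model is handed integers, not letters.

```python
# pip install tiktoken   (assumed available; encoding choice is illustrative)
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

tokens = enc.encode("strawberry")
print(tokens)  # a short list of integer IDs, not ten separate characters

# Show which chunk of text each ID stands for.
for t in tokens:
    print(t, enc.decode_single_token_bytes(t))

# From the model's perspective the prompt is just these integers,
# so the letters inside each chunk are never seen individually.
```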

• [email protected]

              Happy cake day 🍰

[email protected]
              #25

              Thanks! Time really flies.

• [email protected]
                This post did not contain any content.
[email protected]
                #26

                It's funny how people always quickly point out that an LLM wasn't made for this, and then continue to shill it for use cases it wasn't made for either (The "intelligence" part of AI, for starters)

                • I [email protected]

                  I know there’s no logic, but it’s funny to imagine it’s because it’s pronounced Mrs. Sippy

[email protected]
                  #27

And if it messed up on the other word, we could say it's because it's pronounced Louisianer.

• [email protected]

                    Next step how many r in Lollapalooza

[email protected]
                    #28

                    • Q [email protected]

[email protected]
                      #29

Try it with o3, maybe it needs time to think 😝

                      • L [email protected]

                        I don't get it

[email protected]
                        #30

                        https://www.youtube.com/shorts/7pQrMAekdn4

• [email protected]

                          I get the meme aspect of this. But just to be clear, it was never fair to judge LLMs for specifically this. The LLM doesn't even see the letters in the words, as every word is broken down into tokens, which are numbers. I suppose with a big enough corpus of data it might eventually extrapolate which words have which letter from texts describing these words, but normally it shouldn't be expected.

[email protected]
                          #31

True, and I agree with you, yet we are being told all jobs are going to disappear, AGI is coming tomorrow, etc. As usual, the truth is more balanced.

• [email protected]

                            I get the meme aspect of this. But just to be clear, it was never fair to judge LLMs for specifically this. The LLM doesn't even see the letters in the words, as every word is broken down into tokens, which are numbers. I suppose with a big enough corpus of data it might eventually extrapolate which words have which letter from texts describing these words, but normally it shouldn't be expected.

[email protected]
                            #32

I've actually messed with this a bit. The problem is more that it can't count to begin with. If you ask it to spell out each letter individually (i.e. each letter will be its own token), it still gets the count wrong.
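For comparison, the counting itself is trivial once something actually operates on characters rather than tokens; a throwaway sketch (word and letter picked to match the meme):

```python
word, letter = "strawberry", "r"

# Working on characters directly makes the count unambiguous.
print(word.count(letter))  # 3

# "Spelling it out" letter by letter, the way you'd prompt the model:
spelled = " ".join(word)   # 's t r a w b e r r y'
print(spelled, "->", sum(ch == letter for ch in word))
```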

• [email protected]

                              Next step how many r in Lollapalooza

[email protected]
                              #33

Apparently, this robot is Japanese.

                              • R [email protected]

                                It's funny how people always quickly point out that an LLM wasn't made for this, and then continue to shill it for use cases it wasn't made for either (The "intelligence" part of AI, for starters)

[email protected]
                                #34

Fair point, but a big part of "intelligence" tasks is memorization.

                                • R [email protected]

                                  It's funny how people always quickly point out that an LLM wasn't made for this, and then continue to shill it for use cases it wasn't made for either (The "intelligence" part of AI, for starters)

[email protected]
                                  #35

It's marketed like it's AGI, so we should treat it like AGI to show that it isn't AGI. Lots of people buy the bullshit.

                                  • G [email protected]

Fair point, but a big part of "intelligence" tasks is memorization.

[email protected]
                                    #36

Computers, for all intents and purposes, have perfect recall, so since it was trained on a large data set it would have much better intelligence. But in reality, what we consider intelligence is extrapolating from existing knowledge, which is what "AI" has shown to be pretty shit at.

                                    • R [email protected]

                                      It's funny how people always quickly point out that an LLM wasn't made for this, and then continue to shill it for use cases it wasn't made for either (The "intelligence" part of AI, for starters)

[email protected]
                                      #37

There are different types of artificial intelligence. Counter-Strike 1.6 bots, by definition, were AI. They even used deep learning to figure out new maps.

                                      • B [email protected]

Computers, for all intents and purposes, have perfect recall, so since it was trained on a large data set it would have much better intelligence. But in reality, what we consider intelligence is extrapolating from existing knowledge, which is what "AI" has shown to be pretty shit at.

[email protected]
                                        #38

They don't. They can save information on drives, but searching is expensive and fuzzy search is a mystery.

Just because you can save an mp3 without losing data does not mean you can save the entire Internet in 400 GB and search it in an instant.
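A rough sketch of that difference (the file names and the similarity cutoff are made up for illustration): exact lookup in an index is a single cheap hit, while fuzzy lookup has to score every candidate and still only returns "close enough" guesses.

```python
import difflib

# A tiny "index": exact lookup is one hash-table hit.
index = {
    "holiday_mix.mp3": "/music/holiday_mix.mp3",
    "meeting_notes.txt": "/docs/meeting_notes.txt",
    "strawberry_recipe.md": "/docs/strawberry_recipe.md",
}

print(index.get("meeting_notes.txt"))  # O(1) and unambiguous

# Fuzzy lookup compares the query against every candidate
# and only returns best-effort guesses above a similarity cutoff.
query = "meating notes"
print(difflib.get_close_matches(query, list(index), n=1, cutoff=0.6))
```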

• [email protected]

                                          Next step how many r in Lollapalooza

[email protected]
                                          #39

                                          Obligatory 'lore dump' on the word lollapalooza:

That word was a common slang term in 1930s/40s American lingo that meant... essentially, a very raucous, lively party.

                                          ::: spoiler Note/Rant on the meaning of this term

The current Merriam-Webster and dictionary.com definitions of this term, 'an outstanding or exceptional or extreme thing', are wrong; they are too broad.

                                          While historical usage varied, it almost always appeared as a noun describing a gathering of many people, one that was so lively or spectacular that you would be exhausted after attending it.

                                          When it did not appear as a noun describing a lively, possibly also 'star-studded' or extravagant, party, it appeared as a term for some kind of action that would cause you to be bamboozled, discombobulated... similar to 'that was a real humdinger of a blahblah' or 'that blahblah was a real doozy'... which ties into the effects of having been through the 'raucous party' meaning of lolapalooza.

                                          :::

So... in WW2, in the Pacific theatre... many US Marines were engaged in brutal jungle combat, often at night, and they adopted a system of basically verbal identification challenge checks if they noticed someone creeping up on their foxholes at night.

                                          An example of this system used in the European theatre, I believe by the 101st and 82nd airborne, was the challenge 'Thunder!' to which the correct response was 'Flash!'.

In the Pacific theatre... the Marines adopted a challenge/response system... where the correct response was 'Lollapalooza'...

Because native-born Japanese speakers are taught a phoneme that is roughly in between an 'r' and an 'l'... and they very often struggle to say 'Lollapalooza' without a very noticeable accent, unless they've also spent a good deal of time learning spoken English (or some other language with distinct 'l' and 'r' phonemes), which very few Japanese did in the 1940s.

                                          ::: spoiler racist and nsfw historical example of / evidence for this

                                          https://www.ep.tc/howtospotajap/howto06.html

                                          :::

                                          Now, some people will say this is a total myth, others will say it is not.

My Grandpa, who served in the Pacific Theatre during WW2, told me it did happen, though he was Navy and not a Marine... but the other stories I've heard that say it did happen all say it happened with the Marines.

My Grandpa is also another source for what 'lollapalooza' actually means.
