agnos.is Forums

Elon Musk wants to rewrite "the entire corpus of human knowledge" with Grok

Technology · 198 posts · 158 posters

  • In reply to [email protected]:

    He's been frustrated by the fact that he can't make Wikipedia 'tell the truth' for years. This will be his attempt to replace it.

    [email protected] (#151):

    There are thousands of backups of Wikipedia, and you can download the entire thing legally, for free.

    He'll never be rid of it.

    Wikipedia may even outlive humanity, ever so slightly.

    • In reply to [email protected]:

      That is definitely how I read it.

      History can’t just be ‘rewritten’ by A.I. and taken as truth. That’s fucking stupid.

      [email protected] (#152):

      It's truth in Whitemanistan though

      • N [email protected]

        Whatever. The next generation will have to learn to judge whether material is true or not by checking it against sources like Wikipedia or books by well-regarded authors.

        The other thing that he doesn't understand (and most "AI" advocates don't either) is that LLMs have nothing to do with facts or information. They're just probabilistic models that pick the next word(s) based on context. Anyone trying to address the facts and information produced by these models is completely missing the point.

        [email protected] (#153):

        The other thing that he doesn't understand (and most "AI" advocates don't either) is that LLMs have nothing to do with facts or information. They're just probabilistic models that pick the next word(s) based on context.

        That's a massive oversimplification; it's like saying humans don't remember things, we just have neurons that fire based on context.

        LLMs do actually "know" things. They work based on tokens and weights, which are the nodes and edges of a high-dimensional graph. The LLM traverses this graph as it processes inputs and generates new tokens.

        You can do brain surgery on an LLM and change what it knows; we have a very good understanding of how this works. You can change a single link and the model will believe the Eiffel Tower is in Rome, and it'll describe how you have a great view of the Colosseum from the top.

        The problem is that it's very complicated, and researchers are currently developing new math to let us do this in a useful way.
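
A minimal sketch of the mechanism being argued over here: the model turns a context into a probability distribution over next tokens, and that distribution is shaped entirely by its weights. This assumes the Hugging Face transformers library and GPT-2 as a stand-in model; neither is mentioned in the thread.

```python
# Hedged illustration only: GPT-2 stands in for any LLM. The model maps a
# context to a probability distribution over its vocabulary; sampling from
# that distribution produces the next token.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

context = "The Eiffel Tower is located in the city of"
input_ids = tokenizer(context, return_tensors="pt").input_ids

with torch.no_grad():
    logits = model(input_ids).logits          # shape: [1, seq_len, vocab_size]
probs = torch.softmax(logits[0, -1], dim=-1)  # distribution over the next token

# The "knowledge" lives in the weights that shape this distribution.
top = torch.topk(probs, k=5)
for p, idx in zip(top.values, top.indices):
    print(f"{tokenizer.decode(idx)!r:>12}  p={p.item():.3f}")

# Sampling (rather than always taking the argmax) is what makes the output
# probabilistic rather than a fixed lookup.
next_id = torch.multinomial(probs, num_samples=1)
print("sampled next token:", tokenizer.decode(next_id))
```

Changing what the model "knows" (the Eiffel-Tower-in-Rome trick described above) amounts to directly editing the weights that shape these distributions, which is what the model-editing research alluded to does.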

        • W [email protected]

          There are thousands of backups of Wikipedia, and you can download the entire thing legally, for free.

          He'll never be rid of it.

          Wikipedia may even outlive humanity, ever so slightly.

          [email protected] (#154):

          Wikipedia may even outlive humanity, ever so slightly.

          Yes

          • J [email protected]

            An interesting thought experiment: I think he's full of shit, you think he's full of himself. Maybe there's a "theory of everything" here somewhere. E = shit squared?

            [email protected] (#155):

            He is a little shit, he's full of shit, ergo he's full of himself

            • P [email protected]

              We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.

              Then retrain on that.

              Far too much garbage in any foundation model trained on uncorrected data.

              Source.

              ::: spoiler More Context

              Source.

              Source.
              :::

              [email protected] (#156):

              I read about this in a popular book by some guy named Orwell

              • P [email protected]

                We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.

                Then retrain on that.

                Far too much garbage in any foundation model trained on uncorrected data.

                Source.

                ::: spoiler More Context

                Source.

                Source.
                :::

                [email protected] (#157):

                By the way, when you refuse to band together, organize, and dispose of these people, they entrench themselves further in power. Everyone ignored Kari Lake as a harmless kook and she just destroyed Voice of America. That loudmouthed MAGA asshole in your neighborhood is going to commit a murder.

                • P [email protected]

                  We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.

                  Then retrain on that.

                  Far too much garbage in any foundation model trained on uncorrected data.

                  Source.

                  ::: spoiler More Context

                  Source.

                  Source.
                  :::

                  [email protected] (#158):

                  What the fuck? This is so unhinged. Genuine question: is he actually this dumb, or is he just saying complete bullshit to boost stock prices?

                  • P [email protected]

                    We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.

                    Then retrain on that.

                    Far too much garbage in any foundation model trained on uncorrected data.

                    Source.

                    ::: spoiler More Context

                    Source.

                    Source.
                    :::

                    [email protected] (#159):

                    Fuck Elon Musk

                    • P [email protected]

                      We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.

                      Then retrain on that.

                      Far too much garbage in any foundation model trained on uncorrected data.

                      Source.

                      ::: spoiler More Context

                      Source.

                      Source.
                      :::

                      [email protected] (#160):

                      Yes! We should all wholeheartedly support this GREAT INNOVATION! There is NOTHING THAT COULD GO WRONG, so this will be an excellent step to PERMANENTLY PERFECT this WONDERFUL AI.

                      • In reply to [email protected]:

                        That's not how knowledge works. You can't just have an LLM hallucinate to fill in missing gaps in knowledge and call it good.

                        [email protected] (#161):

                        SHH!! Yes you can, Elon! Recursively training your model on itself definitely has NO DOWNSIDES.

                        • M [email protected]

                          We will take the entire library of human knowledge, cleanse it, and ensure our version is the only record available.

                          The only comfort I have is knowing anything that is true can be relearned by observing reality through the lens of science, which is itself reproducible from observing how we observe reality.

                          [email protected] (#162):

                          Have some more comfort

                          • P [email protected]

                            We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.

                            Then retrain on that.

                            Far too much garbage in any foundation model trained on uncorrected data.

                            Source.

                            ::: spoiler More Context

                            Source.

                            Source.
                            :::

                            [email protected] (#163):

                            Huh. I'm not sure if he's understood the alignment problem quite right.

                            • V [email protected]

                              What the fuck? This is so unhinged. Genuine question: is he actually this dumb, or is he just saying complete bullshit to boost stock prices?

                              [email protected] (#164):

                              My guess is yes.

                              • F [email protected]

                                I read about this in a popular book by some guy named Orwell

                                [email protected] (#165):

                                Wasn't he the children's author who published the book about talking animals learning the value of hard work or something?

                                • P [email protected]

                                  We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.

                                  Then retrain on that.

                                  Far too much garbage in any foundation model trained on uncorrected data.

                                  Source.

                                  ::: spoiler More Context

                                  Source.

                                  Source.
                                  :::

                                  [email protected] (#166):

                                  remember when grok called e*on and t**mp a nazi? good times

                                  • P [email protected]

                                    We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.

                                    Then retrain on that.

                                    Far too much garbage in any foundation model trained on uncorrected data.

                                    Source.

                                    ::: spoiler More Context

                                    Source.

                                    Source.
                                    :::

                                    [email protected] (#167):

                                    Dude wants to do a lot of things and fails to accomplish what he says he's going to do, or ends up half-assing it. So let him take Grok and run it right into the ground like an autopiloted Cybertruck rolling over into the flame trench of an exploding Starship rocket still on the pad, shooting flames out of tunnels made by the Boring Company.

                                    • P [email protected]

                                      We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.

                                      Then retrain on that.

                                      Far too much garbage in any foundation model trained on uncorrected data.

                                      Source.

                                      ::: spoiler More Context

                                      Source.

                                      Source.
                                      :::

                                      [email protected] (#168):

                                      Lol, turns out Elon has no fucking idea how LLMs work.

                                      • J [email protected]

                                        The thing that annoys me most is that there have been studies done on LLMs showing that, when trained on their own output, they produce increasingly noisy output.

                                        Sources (unordered):

                                        • What is model collapse?
                                        • AI models collapse when trained on recursively generated data
                                        • Large Language Models Suffer From Their Own Output: An Analysis of the Self-Consuming Training Loop
                                        • Collapse of Self-trained Language Models

                                        Whatever nonsense Muskrat is spewing, it is factually incorrect. He won't be able to successfully retrain any model on generated content. At least, not an LLM if he wants a successful product. If anything, he will be producing a model that is heavily trained on censored datasets.

                                        [email protected] (#169):

                                        It's not so simple; there are papers on zero-data 'self-play' and other schemes for using other LLMs' output.

                                        Distillation is probably the only one you'd want for a pretrain, specifically.
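
As a toy illustration of the collapse effect measured in the papers listed above (this is an editor's sketch, far simpler than a real LLM, and not anything from the thread): fit a model to data, sample from it, refit on the samples, and repeat. Rare items drop out and never come back.

```python
# Toy sketch of "model collapse": each generation is trained only on the
# previous generation's synthetic output. The "model" here is just an
# empirical distribution over a token vocabulary; the point is the feedback
# loop, not the model.
import numpy as np

rng = np.random.default_rng(0)

vocab_size = 1000
n_samples = 2000  # "training tokens" per generation

# Ground-truth "language": a long-tailed (Zipf-like) distribution over tokens.
true_probs = 1.0 / np.arange(1, vocab_size + 1)
true_probs /= true_probs.sum()

data = rng.choice(vocab_size, size=n_samples, p=true_probs)

for gen in range(1, 31):
    # "Train": the model's distribution is the empirical token frequencies.
    counts = np.bincount(data, minlength=vocab_size)
    model_probs = counts / counts.sum()
    # The next generation sees only this model's own output.
    data = rng.choice(vocab_size, size=n_samples, p=model_probs)
    if gen % 5 == 0:
        print(f"generation {gen:2d}: distinct tokens in training data = {(counts > 0).sum()}")

# Tokens that fall out of the sample get probability zero and can never
# return, so diversity shrinks generation after generation.
```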

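And a hedged sketch of the distillation alternative mentioned in the reply above: instead of training a student on text the teacher sampled, train it to match the teacher's full next-token distribution. All names and shapes are illustrative placeholders, not anything from the thread or from xAI.

```python
# Hedged sketch of knowledge distillation for language models: the student is
# trained on the teacher's soft next-token distributions rather than on the
# teacher's sampled text.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    """KL(teacher || student) over next-token distributions.

    Both logits tensors have shape [batch, seq_len, vocab_size].
    """
    t = temperature
    vocab = student_logits.size(-1)
    teacher_probs = F.softmax(teacher_logits / t, dim=-1).reshape(-1, vocab)
    student_log_probs = F.log_softmax(student_logits / t, dim=-1).reshape(-1, vocab)
    # 'batchmean' divides by the number of rows, i.e. batch * seq positions
    # after the reshape; t**2 keeps gradient scale comparable across
    # temperatures (standard Hinton-style scaling).
    kl = F.kl_div(student_log_probs, teacher_probs, reduction="batchmean")
    return kl * (t ** 2)

# Toy call just to show the shapes; real use would run a frozen teacher model
# and a trainable student over the same batch of tokens.
batch, seq, vocab = 2, 8, 50_000
teacher_logits = torch.randn(batch, seq, vocab)
student_logits = torch.randn(batch, seq, vocab, requires_grad=True)

loss = distillation_loss(student_logits, teacher_logits)
loss.backward()
print("distillation loss:", round(loss.item(), 4))
```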
                                        • A [email protected]

                                          asdf

                                          [email protected] (#170):

                                          You had started to make a point; now you're just being a dick.
