agnos.is Forums

Elon Musk wants to rewrite "the entire corpus of human knowledge" with Grok

198 Posts 158 Posters
  • I [email protected]

    Just because Wikipedia offers a list of references doesn't mean that those references reflect what knowledge is actually out there. Wikipedia is trying to be academically rigorous without any of the real work. A big part of doing academic research is reading articles and studies that are wrong or that only prove the null hypothesis. That's why we need experts, not just an AI regurgitating information. Wikipedia is useful if people understand its limitations; I think a lot of people don't, though.

    [email protected]
    #123

    For sure. Wikipedia is for researching the most basic subjects, or as the first step of any research (it can still offer helpful sources): basic stuff, or a quick glance at something for conversation.

    • K [email protected]

      That was my first impression, but then it shifted into "I want my AI to be the shittiest of them all".

      [email protected]
      #124

      Why not both?

      • A [email protected]

        asdf

        [email protected]
        #125

        So what would you consider to be a trustworthy source?

        • P [email protected]

          We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.

          Then retrain on that.

          Far too much garbage in any foundation model trained on uncorrected data.

          Source.

          ::: spoiler More Context

          Source.

          Source.
          :::

          [email protected]
          #126

          He's been frustrated by the fact that he can't make Wikipedia 'tell the truth' for years. This will be his attempt to replace it.

          • P [email protected]

            We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.

            Then retrain on that.

            Far too much garbage in any foundation model trained on uncorrected data.

            Source.

            ::: spoiler More Context

            Source.

            Source.
            :::

            [email protected]
            #127

            I never would have thought it possible that a person could be so full of themselves as to say something like that.

            • M [email protected]

              The plan to "rewrite the entire corpus of human knowledge" with AI sounds impressive until you realize LLMs are just pattern-matching systems that remix existing text. They can't create genuinely new knowledge or identify "missing information" that wasn't already in their training data.

              [email protected]
              #128

              To be fair, your brain is a pattern-matching system.

              When you catch a ball, you’re not doing the physics calculations in your head- you’re making predictions based on an enormous quantity of input. Unless you’re being very deliberate, you’re not thinking before you speak every word- your brain’s predictive processing takes over and you often literally speak before you think.

              Fuck LLMs- but I think it’s a bit wild to dismiss the power of a sufficiently advanced pattern-matching system.

              • D [email protected]

                I keep a partial local copy of Wikipedia on my phone and backup device with an app called Kiwix. Great if you need access to certain items in remote areas with no access to the internet.

                [email protected]
                #129

                They may laugh now, but you're gonna kick ass when you get isekai'd.

                • thegreenwizard@lemmy.zipT [email protected]

                  For sure. Wikipedia is for researching the most basic subjects, or as the first step of any research (it can still offer helpful sources): basic stuff, or a quick glance at something for conversation.

                  [email protected]
                  #130

                  This very much depends on the subject, I suspect. For math or computer science, Wikipedia is an excellent source, and the credentials of the editors maintaining those areas are formidable (to say the least). Their explanations of the underlying mechanisms are, in my experience, a little variable in quality, but I haven't found one that's even close to outright wrong.

                  • P [email protected]

                    We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.

                    Then retrain on that.

                    Far too much garbage in any foundation model trained on uncorrected data.

                    Source.

                    ::: spoiler More Context

                    Source.

                    Source.
                    :::

                    [email protected]
                    #131

                    Unironically Orwellian

                    • P [email protected]

                      We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.

                      Then retrain on that.

                      Far too much garbage in any foundation model trained on uncorrected data.

                      Source.

                      ::: spoiler More Context

                      Source.

                      Source.
                      :::

                      [email protected]
                      #132

                      I'm just seeing bakes in the lies.

                      • S [email protected]

                        Wikipedia is not a trustworthy source of information for anything regarding contemporary politics or economics.

                        Wikipedia presents the views of reliable sources on notable topics. The trick is what sources are considered "reliable" and what topics are "notable", which is why it's such a poor source of information for things like contemporary politics in particular.

                        [email protected]
                        #133

                        asdf

                        • S [email protected]

                          Wikipedia is not a trustworthy source of information for anything regarding contemporary politics or economics.

                          Wikipedia presents the views of reliable sources on notable topics. The trick is what sources are considered "reliable" and what topics are "notable", which is why it's such a poor source of information for things like contemporary politics in particular.

                          [email protected]
                          #134

                          asdf

                          • R [email protected]

                            I never would have thought it possible that a person could be so full of themselves as to say something like that.

                            [email protected]
                            #135

                            An interesting thought experiment: I think he's full of shit, you think he's full of himself. Maybe there's a "theory of everything" here somewhere. E = shit squared?

                            • P [email protected]

                              We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.

                              Then retrain on that.

                              Far too much garbage in any foundation model trained on uncorrected data.

                              Source.

                              ::: spoiler More Context

                              Source.

                              Source.
                              :::

                              [email protected]
                              #136

                              I remember when I learned what corpus meant too.

                              • N [email protected]

                                Elon Musk, like most pseudo-intellectuals, has a very shallow understanding of things. Human knowledge is full of holes, and they cannot simply be filled through logic, as Musk the dweeb imagines.

                                [email protected]
                                #137

                                Uh, just a thought. Please pardon me; I'm not an Elon shill, I just think your argument's phrasing is off.

                                How would you know there are holes in understanding without logic? How would you remedy gaps in human knowledge without applying logic to check whether things are consistent?

                                • A [email protected]

                                  asdf

                                  [email protected]
                                  #138

                                  Again, read the rest of the comment. Wikipedia very much repeats the views of reliable sources on notable topics - most of the fuckery is in deciding what counts as "reliable" and "notable".

                                  • S [email protected]

                                    Again, read the rest of the comment. Wikipedia very much repeats the views of reliable sources on notable topics - most of the fuckery is in deciding what counts as "reliable" and "notable".

                                    [email protected]
                                    #139

                                    asdf

                                    • why0y@lemmy.mlW [email protected]

                                      Uh, just a thought. Please pardon me; I'm not an Elon shill, I just think your argument's phrasing is off.

                                      How would you know there are holes in understanding without logic? How would you remedy gaps in human knowledge without applying logic to check whether things are consistent?

                                      [email protected]
                                      #140

                                      You have to have data to apply your logic to.

                                      If it is raining, the sidewalk is wet. Does that mean if the sidewalk is wet, that it is raining?

                                      There are domains of human knowledge that we will never have data on. There’s no logical way for me to 100% determine what was in Abraham Lincoln’s pockets on the day he was shot.

                                      When you read real academic texts, you'll notice there is always a "this suggests that," "we can speculate that," and so on. The real world is not straight math and binary logic. The closest fields to that might be physics, and chemistry to a lesser extent, but even then, theoretical physics must be backed by experimentation and data.
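                                      The raining/sidewalk point is the classic "affirming the consequent" fallacy, and it is small enough to check exhaustively. This sketch (mine, added for illustration, not from the poster) enumerates every truth assignment and finds the one world where "raining implies wet" holds but its converse fails:

```python
from itertools import product

# P = "it is raining", Q = "the sidewalk is wet".
# The rule given is P -> Q; the fallacy infers Q -> P.
counterexamples = []
for raining, wet in product([False, True], repeat=2):
    implies = (not raining) or wet   # P -> Q
    converse = (not wet) or raining  # Q -> P
    if implies and not converse:
        counterexamples.append((raining, wet))

# raining=False, wet=True satisfies P -> Q but not Q -> P:
# a sprinkler can soak the sidewalk on a sunny day.
print(counterexamples)  # [(False, True)]
```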

                                      • P [email protected]

                                        We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.

                                        Then retrain on that.

                                        Far too much garbage in any foundation model trained on uncorrected data.

                                        Source.

                                        ::: spoiler More Context

                                        Source.

                                        Source.
                                        :::

                                        [email protected]
                                        #141

                                        The thing that annoys me most is that there have been studies done on LLMs showing that, when trained on subsets of their own output, they produce increasingly noisy output.

                                        Sources (unordered):

                                        • What is model collapse?
                                        • AI models collapse when trained on recursively generated data
                                        • Large Language Models Suffer From Their Own Output: An Analysis of the Self-Consuming Training Loop
                                        • Collapse of Self-trained Language Models

                                        Whatever nonsense Muskrat is spewing, it is factually incorrect. He won't be able to successfully retrain any model on generated content. At least, not an LLM if he wants a successful product. If anything, he will be producing a model that is heavily trained on censored datasets.
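                                        The collapse dynamic those papers describe can be sketched with a toy self-consuming loop: fit a trivial "model" (just a Gaussian's mean and stdev) to data, sample from it, refit on the samples, and repeat. This is an illustrative sketch of mine, not code from any of the papers; the sample sizes and generation count are arbitrary:

```python
import random
import statistics

def fit(data):
    # "Training": fit a toy Gaussian model (mean, stdev) to the data.
    return statistics.mean(data), statistics.stdev(data)

def generate(model, n):
    # "Inference": sample n synthetic points from the fitted model.
    mu, sigma = model
    return [random.gauss(mu, sigma) for _ in range(n)]

random.seed(0)
real_data = [random.gauss(0.0, 1.0) for _ in range(10)]
model = fit(real_data)

# Self-consuming loop: each generation trains only on the previous
# generation's synthetic output; the real data is never seen again.
for generation in range(500):
    model = fit(generate(model, 10))

mu, sigma = model
# With small samples, each refit on synthetic data tends to lose a
# little variance, and the losses compound: the fitted spread
# typically collapses far below the original 1.0.
print(f"final mean={mu:.3f}, final stdev={sigma:.6f}")
```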

                                        • P [email protected]

                                          We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.

                                          Then retrain on that.

                                          Far too much garbage in any foundation model trained on uncorrected data.

                                          Source.

                                          ::: spoiler More Context

                                          Source.

                                          Source.
                                          :::

                                          [email protected]
                                          #142

                                          which has advanced reasoning

                                          No it doesn't.
