Elon Musk wants to rewrite "the entire corpus of human knowledge" with Grok
-
What the fuck? This is so unhinged. Genuine question: is he actually this dumb, or is he just saying complete bullshit to boost stock prices?
my guess is yes.
-
I read about this in a popular book by some guy named Orwell
Wasn't he the children's author who published the book about talking animals learning the value of hard work or something?
-
We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.
Then retrain on that.
Far too much garbage in any foundation model trained on uncorrected data.
::: spoiler More Context
Source.
:::
remember when grok called e*on and t**mp a nazi? good times
-
We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.
Then retrain on that.
Far too much garbage in any foundation model trained on uncorrected data.
::: spoiler More Context
Source.
:::
Dude wants to do a lot of things and fails to accomplish what he says he's going to do, or ends up half-assing it. So let him take Grok and run it right into the ground like an autopiloted Cybertruck rolling over into the flame trench of an exploding Starship rocket still on the pad, shooting flames out of tunnels made by the Boring Company.
-
We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.
Then retrain on that.
Far too much garbage in any foundation model trained on uncorrected data.
::: spoiler More Context
Source.
:::
Lol turns out elon has no fucking idea about how llms work
-
The thing that annoys me most is that there have been studies done on LLMs showing that, when trained on subsets of their own output, they produce increasingly noisy output.
Sources (unordered):
- What is model collapse?
- AI models collapse when trained on recursively generated data
- Large Language Models Suffer From Their Own Output: An Analysis of the Self-Consuming Training Loop
- Collapse of Self-trained Language Models
Whatever nonsense Muskrat is spewing, it is factually incorrect. He won't be able to successfully retrain any model on generated content. At least, not an LLM if he wants a successful product. If anything, he will be producing a model that is heavily trained on censored datasets.
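The collapse dynamic those papers describe can be illustrated with a toy stand-in for an LLM: fit a simple model (here, a Gaussian) to data, sample from the fit, refit on the samples, and repeat. This is obviously not xAI's pipeline, just a minimal sketch of a self-consuming training loop; the sample size and generation count are arbitrary, chosen small so the effect shows up quickly.

```python
import random
import statistics

def fit(samples):
    # "Train" the toy model: estimate mean and stddev from the data.
    return statistics.fmean(samples), statistics.stdev(samples)

def generate(mu, sigma, n, rng):
    # "Sample" from the trained model.
    return [rng.gauss(mu, sigma) for _ in range(n)]

rng = random.Random(42)
# Generation 0: "real" data from a standard normal distribution.
data = generate(0.0, 1.0, 5, rng)

for gen in range(500):
    mu, sigma = fit(data)
    # Each generation trains only on the previous generation's output.
    data = generate(mu, sigma, 5, rng)

# With tiny samples, estimation noise compounds generation after
# generation and the fitted spread drifts toward zero: the model
# "collapses" onto an ever-narrower slice of the original distribution.
print(f"fitted sigma after 500 self-trained generations: {sigma:.6f}")
```

With larger samples the drift is slower, but the downward bias in the fitted spread still accumulates over generations, which is the same mechanism the model-collapse papers report for LLMs trained recursively on their own generations.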
It's not so simple; there are papers on zero-data "self play" and other schemes for using other LLMs' output.
Distillation is probably the only one you'd want for a pretrain, specifically.
-
asdf
You had started to make a point, now you are just being a dick.
-
Wasn't he the children's author who published the book about talking animals learning the value of hard work or something?
The very one!
-
We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.
Then retrain on that.
Far too much garbage in any foundation model trained on uncorrected data.
::: spoiler More Context
Source.
:::
So they’re just going to fill it with Hitler’s world view, got it.
Typical and expected.
-
We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.
Then retrain on that.
Far too much garbage in any foundation model trained on uncorrected data.
::: spoiler More Context
Source.
:::
He knows more ... about knowledge... than... anyone alive now
-
Lol turns out elon has no fucking idea about how llms work
It's pretty obvious where the white genocide "bug" came from.
-
We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.
Then retrain on that.
Far too much garbage in any foundation model trained on uncorrected data.
::: spoiler More Context
Source.
:::
So just making shit up.
-
“Deleting Errors” should sound alarm bells in your head.
And the "adding missing information" doesn't? Isn't that just saying they're going to make shit up?
-
So they’re just going to fill it with Hitler’s world view, got it.
Typical and expected.
I mean, this is the same guy who said we'd be living on Mars in 2025.
-
asdf
Books are not immune to being written by LLMs spewing nonsense, lies, and hallucinations, which will only make the more traditional issue of author/publisher biases worse. The asymmetry between how long it takes to create misinformation and how long it takes to verify it has never been this bad.
Media literacy will be very important going forward for new informational material and there will be increasing demand for pre-LLM materials.
-
You had started to make a point, now you are just being a dick.
asdf
-
You have to have data to apply your logic to.
If it is raining, the sidewalk is wet. Does that mean if the sidewalk is wet, that it is raining?
There are domains of human knowledge that we will never have data on. There’s no logical way for me to 100% determine what was in Abraham Lincoln’s pockets on the day he was shot.
When you read real academic texts, you’ll notice that there is always the “this suggests that,” “we can speculate that,” etc etc. The real world is not straight math and binary logic. The closest fields to that might be physics and chemistry to a lesser extent, but even then - theoretical physics must be backed by experimentation and data.
Thanks, I've never heard of data. And I've never read an academic text either. Condescending pos
So, while I'm ironing out your logic for you, "what else would you rely on, if not logic, to prove or disprove and ascertain knowledge about gaps?"
-
Books are not immune to being written by LLMs spewing nonsense, lies, and hallucinations, which will only make more traditional issue of author/publisher biases worse. The asymmetry between how long it takes to create misinformation and how long it takes to verify it has never been this bad.
Media literacy will be very important going forward for new informational material and there will be increasing demand for pre-LLM materials.
asdf
-
There are thousands of backups of wikipedia, and you can download the entire thing legally, for free.
He'll never be rid of it.
Wikipedia may even outlive humanity, ever so slightly.
Seconds after the last human being dies, the Wikipedia page is updated to read:
Humans (Homo sapiens) or modern humans were the most common and widespread species of primate
-
We will use Grok 3.5 (maybe we should call it 4), which has advanced reasoning, to rewrite the entire corpus of human knowledge, adding missing information and deleting errors.
Then retrain on that.
Far too much garbage in any foundation model trained on uncorrected data.
::: spoiler More Context
Source.
:::
"Then retrain on that."
That's called model collapse.