Bizarre story.
-
China building better LLMs, and LLMs becoming cheaper to train, does not mean that Nvidia will sell fewer GPUs when people like Elon Musk and Donald Trump can't shut up about how important "AI" is.
I'm all for the collapse of the AI bubble, though. It's cool and all that the bankers know IT terms now, but the massive influx of money toward LLMs and the datacenters that run them has not been healthy for the industry or the broader economy.
-
-
It literally defeats Nvidia's entire business model of "I shit golden eggs, I'm the only one who does, and I can charge any price I want for them."
-
Nvidia cards were the only GPUs used to train DeepSeek v3 and R1. So, that narrative still superficially holds. Other stocks like TSMC, ASML, and AMD are also down in pre-market.
-
Yes, but old and "cheap" ones that were not part of the sanctions.
-
Ah, fair. I guess it makes sense that Wall Street is questioning the need for these expensive Blackwell GPUs when the Hopper GPUs are already so good?
-
It's more that the newer models are going to need less compute to train and run.
-
Right. There are indications of 10x to 100x less compute needed to train a model to an equivalent level. Not a small thing at all.
-
The US economy has been running on bubbles for decades, using them to fuel innovation and growth. It has survived the telecom bubble, the housing bubble, and multiple bubbles in the oil sector (how do you think fracking came to be?). This is just the start of the AI bubble, because its innovations have yet to have a broad-based impact on the economy. Once AI becomes commonplace in aiding everything we do, that's when valuations will look "normal".
-
Not small but... smaller than you would expect.
Most companies aren't, and shouldn't be, training their own models. Especially with approaches like RAG, where you can pair an already highly trained model with your proprietary offline data at only a minimal performance hit.
What matters is inference and accuracy/validity. Inference is ridiculously cheap (the reason AI/ML got so popular), and accuracy is a whole different can of worms that industry and researchers don't want you to think about (in part because a "correct" answer might still be blatant lies, since it is trained on human data, which is often blatant lies, but...).
And for the companies that ARE going to train their own models? They make enough bank that ordering the latest Box from Jensen is a drop in the bucket.
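To make the RAG point above concrete, here is a minimal sketch of the idea: retrieve the most relevant pieces of your own offline data and prepend them to the prompt, so an off-the-shelf model can answer without any retraining. Everything here is a toy stand-in (the documents, the `retrieve`/`build_prompt` names, and a bag-of-words similarity instead of real embeddings), not any particular product's API.

```python
import math
import re
from collections import Counter

# Toy document store standing in for "proprietary offline data".
DOCS = [
    "Q3 revenue grew 12% driven by the enterprise segment.",
    "The on-call rotation changes every Monday at 09:00 UTC.",
    "Refunds over 500 dollars require approval from a team lead.",
]

def vectorize(text: str) -> Counter:
    """Bag-of-words term counts (a crude stand-in for real embeddings)."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    q = vectorize(query)
    return sorted(docs, key=lambda d: cosine(q, vectorize(d)), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Prepend retrieved context to the question; this assembled prompt
    would go to a pre-trained model as-is, with no training of our own."""
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

print(build_prompt("Do refunds over 500 dollars require approval?", DOCS))
```

A real system would swap the bag-of-words scoring for a vector database of embeddings, but the shape is the same: the expensive training already happened at the model vendor, and the company only pays for retrieval plus inference.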