I really hope this is the beginning of a massive correction of the AI hype.
-
[email protected] replied to [email protected] last edited by
That's generally how tech goes though. You throw hardware at the problem until it works, and then you optimize it to run on laptops and eventually phones. Usually hardware improvements and software optimizations meet somewhere in the middle.
Look at photo and video editing: you used to need a workstation for that, and now you can get most of it on your phone. Surely AI is destined to follow the same path, with local models getting more and more robust until eventually the beefy cloud services are no longer required.
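For what it's worth, local inference is already pretty accessible. A minimal sketch, assuming the llama-cpp-python package is installed and you have some quantized GGUF model downloaded (the model path and generation parameters below are placeholders, not a recommendation):

# Minimal local-inference sketch using llama-cpp-python (assumed installed).
# The model path is a placeholder; any quantized GGUF model should work.
from llama_cpp import Llama

llm = Llama(model_path="models/some-8b-instruct.Q4_K_M.gguf", n_ctx=2048)

out = llm("Why does local inference reduce dependence on cloud services?", max_tokens=128)
print(out["choices"][0]["text"])

Nothing fancy, but it runs entirely on your own hardware, which is the whole point of the "local models keep getting more robust" argument.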
-
[email protected] replied to [email protected] last edited by
...in a cave with Chinese knockoffs!
-
[email protected] replied to [email protected] last edited by
It's about cheap Chinese AI
-
[email protected] replied to [email protected] last edited by
The problem for American tech companies is that they didn't even try to move to stage 2.
OpenAI is hemorrhaging money even on their most expensive subscription, and their entire business plan was to hemorrhage money even faster, to the point of using entire power stations to power their data centers. Their plan makes about as much sense as digging yourself out of a hole by trying to dig through to the other side of the globe.
-
[email protected] replied to [email protected] last edited by
China really has nothing to do with it, it could have been anyone. It's a reaction to realizing that GPT4-equivalent AI models are dramatically cheaper to train than previously thought.
It being China is a notable detail, because it really drives a nail into the coffin for NVIDIA: China has been fenced off from access to NVIDIA's most expensive AI GPUs, which were thought to be required to pull this off.
It also makes the US government look extremely foolish for having made major foreign policy and relationship sacrifices in order to try to delay China by a few years, when it's only January and China has already caught up, and those sacrifices did not pay off.
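To put rough numbers on "dramatically cheaper": DeepSeek's V3 report cites roughly 2.79 million H800 GPU-hours at about $2 per GPU-hour for the final training run. The sketch below just multiplies those reported figures and compares them with the commonly cited ~$100M order of magnitude for GPT-4-class training runs; treat everything here as reported or ballpark values, not audited numbers.

# Back-of-envelope training-cost arithmetic using figures reported by DeepSeek for V3.
# Ballpark only; none of this is independently verified.
gpu_hours = 2.79e6        # reported H800 GPU-hours for the final training run
cost_per_gpu_hour = 2.0   # reported rental price in USD per GPU-hour
deepseek_cost = gpu_hours * cost_per_gpu_hour

frontier_budget = 100e6   # commonly cited order of magnitude for a GPT-4-class run (assumption)

print(f"DeepSeek-reported training cost: ${deepseek_cost / 1e6:.1f}M")
print(f"Roughly {frontier_budget / deepseek_cost:.0f}x cheaper than a ~$100M frontier run")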
-
[email protected] replied to [email protected] last edited by
Hey, my friends and I would've made it to China if recess was a bit longer.
Seriously though, the goal for something like OpenAI shouldn't be to sell products to end customers, but to license models to companies that sell "solutions." I see these direct to consumer devices similarly to how GPU manufacturers see reference cards or how Valve sees the Steam Deck: they're a proof of concept for others to follow.
OpenAI should be looking to be more like ARM and less like Apple. If they do that, they might just grow into their valuation.
-
[email protected] replied to [email protected] last edited by
Oh, the US has been doing this kind of thing for decades! This isn't new.
-
[email protected] replied to [email protected] last edited by
Prices rarely, if ever, go down and there is a push across the board to offload things "to the cloud" for a range of reasons.
That said: If your focus is on gaming, AMD is REAL good these days and, if you can get past their completely nonsensical naming scheme, you can often get a really good GPU using "last year's" technology for 500-800 USD (discounted to 400-600 or so).
-
[email protected] replied to [email protected] last edited by
Pelosi says AI frames are fake frames.
-
[email protected] replied to [email protected] last edited by
xx_Pelosi420_xx doesn't settle for incremental upgrades
-
I'm using an RX 6700 XT, which you can get for about £300, and it works fine.
-
[email protected] replied to [email protected] last edited by
They definitely used to go down, just not since Bitcoin morphed into a speculative mania.
-
[email protected] replied to [email protected] last edited by
Does it still need people spending huge amounts of time to train models?
After doing neural networks, fuzzy logic, etc. in university, I really question the whole usability of what is called "AI" outside niche use cases.
-
[email protected] replied to [email protected] last edited by
Ah, see, the mistake you’re making is actually understanding the topic at hand.
-
[email protected] replied to [email protected] last edited by
I really don't believe the technological lead is massive.
-
[email protected] replied to [email protected] last edited by
Looking at the market cap of Nvidia vs their competitors, the market believes it is, considering Nvidia just lost more than AMD, Intel, and the like are worth combined and is still valued at around $2.9 trillion.
And by technology I mean both the performance of their hardware and the software stack they've created, which is a big part of their dominance.
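For a rough sense of scale, here is that comparison spelled out with approximate figures from around the late-January selloff (ballpark numbers assumed from press coverage, not exact quotes):

# Rough scale comparison; all figures are approximate late-January 2025 values (assumptions).
nvidia_one_day_loss = 589e9   # widely reported single-day market-cap drop, ~$589B
amd_market_cap = 190e9        # approximate AMD market cap at the time
intel_market_cap = 85e9       # approximate Intel market cap at the time
nvidia_remaining = 2.9e12     # Nvidia's approximate market cap after the drop

print(nvidia_one_day_loss > amd_market_cap + intel_market_cap)  # True: the drop alone exceeds both combined
print(f"Nvidia still valued at about ${nvidia_remaining / 1e12:.1f}T")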
-
[email protected] replied to [email protected] last edited by
I wouldn't be surprised if China spent more on actual AI development than the West did. Sure, here we spent tens of billions while China only invested a few million, but that few million actually went into development, while out of the tens of billions all but $5 went to bonuses and yachts.
-
[email protected] replied to [email protected] last edited by
You joke, but there are a lot of grandma/grandpa gamers these days. Remember, someone who played PC games back in the 80s would be in their 50s or 60s now, or even older if they picked up the hobby as an adult in the 80s.
-
[email protected] replied to [email protected] last edited by
Yeah. I don't believe market value is a great indicator in this case. In general, I would say that capital markets are rational at a macro level, but not micro. This is all speculation/gambling.
My guess is that AMD and Intel are at most 1 year behind Nvidia when it comes to tech stack. "China", maybe 2 years, probably less.
However, if you can make chips with 80% of the performance at 10% of the price, it's a win. People can continue to tell themselves that big tech will always buy the latest and greatest whatever the cost. It doesn't make it true; it hasn't been true for a really long time. Google, Meta, and Amazon already make their own chips. That's probably true for DeepSeek as well.
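The "80% performance at 10% price" point is easy to make concrete. A toy perf-per-dollar comparison; the numbers are made up purely for illustration and don't reflect any real chip's pricing:

# Toy perf-per-dollar comparison; all numbers are made up for illustration.
incumbent = {"perf": 1.00, "price": 1.00}    # normalized flagship chip
challenger = {"perf": 0.80, "price": 0.10}   # "80% performance at 10% of the price"

def perf_per_dollar(chip):
    return chip["perf"] / chip["price"]

ratio = perf_per_dollar(challenger) / perf_per_dollar(incumbent)
print(f"Challenger delivers {ratio:.0f}x the performance per dollar")  # 8x

At that ratio the buyer doesn't need the challenger to match the flagship outright; it only needs to be good enough for the workload.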
-
[email protected] replied to [email protected] last edited by
If inputText = "hello" Then
    Respond.Text("hello there")
ElseIf inputText (...)