OpenAI Furious DeepSeek Might Have Stolen All the Data OpenAI Stole From Us
-
[email protected] replied to [email protected] last edited by
Hell, Nvidia's stock plummeted as well, which makes no sense at all, considering DeepSeek needs the same hardware as ChatGPT.
Common wisdom said that these models need CUDA to run properly, and DeepSeek doesn't need it.
-
[email protected] replied to [email protected] last edited by
The EU is well on its way to becoming a group of dictators with billionaires up their asses as well...
-
[email protected] replied to [email protected] last edited by
Good that 404 are unafraid of tackling issues, but tbh I find the "hahaha" unprofessional and would rather they dispensed with the informal tone in news.
-
[email protected] replied to [email protected] last edited by
Sure, but Nvidia still makes the GPUs needed to run them. And AMD is not really competitive in the commercial GPU market.
-
[email protected] replied to [email protected] last edited by
AMD apparently has the 7900 XTX outperforming the 4090 in Deepseek.
-
[email protected] replied to [email protected] last edited by
Those aren't commercial GPUs though. These are:
-
[email protected] replied to [email protected] last edited by
I definitely understand that reaction. It does give off a whiff of unprofessionalism, but their reporting is so consistently solid that I’m willing to give them the space to be a little more human than other journalists. If it ever got in the way of their actual journalism I’d say they should quit it, but that hasn’t happened so far.
-
[email protected] replied to [email protected] last edited by
Tree fiddy 🦕
-
[email protected] replied to [email protected] last edited by
I always thought Rob Reiner had a similar sense of humor to Mel Brooks. And I liked Billy Crystal in it; it kept that section of the movie from feeling too heavy, though I get it's not everyone's thing.
For anyone who hasn't read it, the book is fantastic as well, and helped me appreciate the movie even more (it's probably one of the best film adaptations of a book ever, IMO). The humor and wit of William Goldman was captured expertly in the movie.
-
[email protected] replied to [email protected] last edited by
The battle of the plagiarism machines has begun
-
[email protected] replied to [email protected] last edited by
Hell, Nvidia's stock plummeted as well, which makes no sense at all, considering DeepSeek needs the same hardware as ChatGPT.
It's the same hardware; the problem for Nvidia is that DeepSeek found a way to train their AI much more cheaply, using far fewer GPUs than the hundreds of thousands that OpenAI, Meta, xAI, Anthropic, etc. use.
-
[email protected] replied to [email protected] last edited by
Wasn't zuck the cuck saying "privacy is dead" a few years ago?
-
[email protected] replied to [email protected] last edited by
DeepSeek’s actual trained model is immaterial—they could take it down tomorrow and never provide access again, and the damage to OpenAI’s business would still be done. The point is that any organization with a few million dollars and some (hopefully less-problematical) training data can now make their own model competitive with OpenAI’s.
-
[email protected] replied to [email protected] last edited by
Rob Reiner's dad Carl was best friends with Mel Brooks for almost all of Carl's adult life.
https://www.vanityfair.com/hollywood/2020/06/carl-reiner-mel-brooks-friendship
-
[email protected] replied to [email protected] last edited by
I'm not all that knowledgeable either lol. It is my understanding, though, that what you download, the "model," is the result of their training. You would need some other way to train it, and I'm not sure how you would go about doing that. The model is essentially the "product" that is created from the training.
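To make that concrete, here's a minimal sketch of what using a downloaded model looks like, assuming the Hugging Face transformers library; the model ID and prompt are just examples, and any causal LM checkpoint you have works the same way. You load the already-trained weights and run inference, with no training step involved:

```python
# Minimal sketch: the downloaded "model" is trained weights plus a tokenizer/config.
# Running it is pure inference; no training happens here.
from transformers import AutoModelForCausalLM, AutoTokenizer

# Example checkpoint name (one of the distilled R1 releases); purely illustrative.
model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # loads the finished weights

prompt = "Explain why GPUs are used for language models."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

# Training it further (fine-tuning) would be a separate process needing your own
# data, an optimizer, and a lot more compute; the files you download are just
# the finished product of someone else's training run.
```

Fine-tuning it yourself is possible, but that's a separate (and much more expensive) process on top of the downloaded product.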
-
[email protected] replied to [email protected] last edited by
How do you know it isn't communicating with their servers? Obviously it needs an internet connection to work, so what's stopping it from sending your data?
-
[email protected] replied to [email protected] last edited by
Why do you think it needs an Internet connection? Why are you saying 'obviously'?
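If you run the downloaded weights yourself, nothing about generating a response needs the network. As a rough sketch (assuming the Hugging Face transformers stack, with "./my-local-model" as a placeholder path for weights already on disk), you can force the libraries into offline mode and inference still works:

```python
# Rough sketch: run a locally downloaded model with hub access disabled.
# "./my-local-model" is a placeholder for wherever the weights already live on disk.
import os
os.environ["HF_HUB_OFFLINE"] = "1"        # tell huggingface_hub not to hit the network
os.environ["TRANSFORMERS_OFFLINE"] = "1"  # same for transformers

from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("./my-local-model", local_files_only=True)
model = AutoModelForCausalLM.from_pretrained("./my-local-model", local_files_only=True)

inputs = tokenizer("Hello there", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The hosted app and API are a different story, but the open weights themselves are just files you run locally; you can firewall the process entirely and it still answers.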
-
[email protected] replied to [email protected] last edited by
The way they found to train their AI more cheaply was to steal it from OpenAI (not that I care). They still need GPUs to process the prompts and generate the responses.