OpenAI Furious DeepSeek Might Have Stolen All the Data OpenAI Stole From Us
-
AMD apparently has the 7900 XTX outperforming the 4090 in DeepSeek.
-
Those aren't commercial GPUs though. These are:
-
I definitely understand that reaction. It does give off a whiff of unprofessionalism, but their reporting is so consistently solid that I’m willing to give them the space to be a little more human than other journalists. If it ever got in the way of their actual journalism I’d say they should quit it, but that hasn’t happened so far.
-
Tree fiddy 🦕
-
I always thought Rob Reiner had a similar sense of humor to Mel Brooks. And I liked Billy Crystal in it; it kept that section of the movie from feeling too heavy, though I get it's not everyone's thing.
For anyone who hasn't read it, the book is fantastic as well, and helped me appreciate the movie even more (it's probably one of the best film adaptations of a book ever, IMO). The humor and wit of William Goldman were captured expertly in the movie.
-
The battle of the plagiarism machines has begun
-
> Hell Nvidia's stock plummeted as well, which makes no sense at all, considering Deepseek needs the same hardware as ChatGPT.

It's the same hardware; the problem for Nvidia is that DeepSeek found a way to train their AI much more cheaply, using far fewer GPUs than the hundreds of thousands that OpenAI, Meta, xAI, Anthropic, etc. use.
-
Wasn't Zuck the cuck saying "privacy is dead" a few years ago?
-
DeepSeek’s actual trained model is immaterial—they could take it down tomorrow and never provide access again, and the damage to OpenAI’s business would still be done. The point is that any organization with a few million dollars and some (hopefully less-problematical) training data can now make their own model competitive with OpenAI’s.
-
Rob Reiner's dad Carl was best friends with Mel Brooks for almost all of Carl's adult life.
https://www.vanityfair.com/hollywood/2020/06/carl-reiner-mel-brooks-friendship
-
I'm not all that knowledgeable either, lol. It is my understanding, though, that what you download, the "model," is the result of their training. You would need some other way to train it; I'm not sure how you would go about doing that, though. The model is essentially the "product" that is created from the training.
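Roughly, using a downloaded model looks something like this sketch (assuming the Hugging Face transformers library, with one of DeepSeek's published distilled checkpoints just as an example of something you could point it at): the file you download already contains the trained weights, and running it is a separate step from training it.

```python
# Minimal sketch of local inference with a downloaded model, assuming the
# Hugging Face transformers library; the model ID here is just one example.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"  # or a local folder of weights
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# The training data isn't in here anywhere -- only the weights it produced.
inputs = tokenizer("Why did Nvidia's stock drop?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```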
-
How do you know it isn't communicating with their servers? Obviously it needs an internet connection to work, so what's stopping it from sending your data?
-
Why do you think it needs an Internet connection? Why are you saying 'obviously'?
-
The way they found to train their AI more cheaply was to steal it from OpenAI (not that I care). They still need GPUs to process the prompts and generate the responses.
-
It's called distillation: you take the outputs of a huge existing model and use them as training data for a new, smaller model, so all that knowledge gets compressed into something far cheaper to train and run.
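The general idea looks something like this (a minimal sketch in PyTorch of textbook distillation, not DeepSeek's actual pipeline): the big "teacher" model's outputs become the training targets for a smaller "student" model.

```python
# Minimal knowledge-distillation loss, assuming PyTorch: the student is
# trained to match the teacher's (softened) output distribution.
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    return F.kl_div(log_student, soft_teacher, reduction="batchmean") * temperature**2

# Training loop outline (teacher frozen, student is the new model):
# for batch in data:
#     with torch.no_grad():
#         teacher_logits = teacher(batch)
#     student_logits = student(batch)
#     distillation_loss(student_logits, teacher_logits).backward()
#     optimizer.step(); optimizer.zero_grad()
```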
-
If these guys thought they could out-bootleg the fucking Chinese then I have an unlicensed t-shirt of Nicky Mouse with their name on it.
-
CUDA being taken down a peg is the best part for me. Fuck proprietary APIs.
-
How else does it figure out what to say if it doesn't have access to the internet? Genuine question; I don't imagine you're downloading the entire dataset with the model.
-
Tamaleeeeeeeeesssssss
hot hot hot hot tamaleeeeeeeees