DeepSeek’s “big change” isn’t the performance of its model though; it’s that it is fully open and operates on a fraction of the resources.
-
Is Alibaba’s model also open weights, open reasoning, free for anyone to run, and runnable (and trainable) on consumer hardware?
-
Call it "open weight" if you want, but it's not "fully open". The training data is still proprietary, and the model can't be accurately reproduced. It's proprietary in the same way that llama is proprietary.
-
But I could use it as a starting point for training and build from it with my own data. I could fork it. I couldn't fork llama; I don't have the weights.
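Concretely, I mean something like the sketch below: download the published weights and fine-tune them on your own data. The hub id and LoRA settings are just illustrative assumptions on my part, not anything DeepSeek prescribes.

    # Rough sketch of "forking" an open-weight model: pull the released weights
    # and fine-tune them locally on your own data.
    # The model id and LoRA hyperparameters are illustrative assumptions.
    from peft import LoraConfig, get_peft_model
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"  # assumed hub id
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)

    # Wrap the downloaded weights in small LoRA adapters so training fits on a
    # single consumer GPU, then train with any standard loop or trainer.
    model = get_peft_model(model, LoraConfig(r=8, lora_alpha=16, task_type="CAUSAL_LM"))
    model.print_trainable_parameters()

The point isn't the specific libraries; it's that the starting weights are mine to download, modify, and redistribute under the model's license, which is exactly what I can't do when the weights are withheld.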
-
You can also fork proprietary code that is source available (depending on the specific terms of that particular proprietary license), but that doesn't make it open source.
Fair point about llama not having open weights though. So it's not as proprietary as llama. It still shouldn't be called open source if the training data needed to reproduce it is proprietary.