With Ollama, all you have to do is copy an extra folder of ROCm files. Not hard at all.
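For concreteness, a runnable sketch of that step. The directory names below are stand-ins (created here so the commands work anywhere): on a real system the `rocm` folder comes from Ollama's separate ROCm release tarball, and the destination is wherever your Ollama libraries are installed.

```shell
# Stand-in layout; real paths depend on your install and Ollama version.
mkdir -p tarball/lib/ollama/rocm install/lib/ollama
touch tarball/lib/ollama/rocm/librocblas.so    # placeholder ROCm library

# The whole extra step: copy the rocm folder next to the Ollama libraries.
cp -r tarball/lib/ollama/rocm install/lib/ollama/
ls install/lib/ollama/rocm
```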
Posts
-
Consumer GPUs to run LLMs
With an AMD RX 6800 + 32 GB of DDR4, I can run up to a 34B model at an acceptable speed.
-
If you are in America and are expressing pro-Palestinian views online, you should seriously consider that the state may treat you as a legitimate target for arrest or abduction.
Since all this blew up, I've been using Tor and i2pd for everything.
-
Anubis - Weighs the soul of incoming HTTP requests using proof-of-work to stop AI crawlers
What I'm thinking about is more that in Linux, it's common to access URLs directly from the terminal for various purposes, instead of using a browser.
-
Anubis - Weighs the soul of incoming HTTP requests using proof-of-work to stop AI crawlers
So if you try to access a website using this technology via the terminal, what happens? Does the connection just fail?
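As I understand it, the connection doesn't fail: a client that can't run JavaScript just receives the challenge page instead of the real content, because the browser is expected to solve a small proof-of-work before being let through. A generic sketch of that idea (not Anubis's actual wire protocol; the challenge string and difficulty here are made up): brute-force a nonce until the hash of challenge+nonce starts with enough zero bits.

```shell
# Generic proof-of-work sketch, not Anubis's exact scheme: a browser does
# this in JavaScript; a plain curl/wget never solves it, so it only ever
# sees the interstitial challenge page.
challenge="example-challenge"   # made-up value for illustration
nonce=0
while :; do
  hash=$(printf '%s%s' "$challenge" "$nonce" | sha256sum | awk '{print $1}')
  case "$hash" in
    00*) break ;;               # difficulty met: 8 leading zero bits
  esac
  nonce=$((nonce + 1))
done
echo "nonce=$nonce hash=$hash"
```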