New Intel Processor and 192 GB/256 GB RAM
-
Hello, I'm currently going back and forth on which new home server to build.
What I've stumbled across: the i9 and the new Core Ultra 9 all only support a maximum of 192 GB RAM.
However, some of the mainboards support 256 GB (with 4 RAM slots and dual channel). Why? I want to have the option of maxing out the RAM later.
I could buy 4x 48 GB RAM now and be at 192 GB.
Maybe I would be annoyed later that 48 GB of RAM is still “missing”.
But what if I buy 4x 64 GB RAM?
3x 64 GB RAM makes no sense, because then dual channel is not used. And 4x 64 GB is probably not recognized by the processor? Or are there LGA1851 or LGA1700 processors capable of handling 256 GB RAM?
-
256 GB of RAM seems well beyond standard self-hosting; what are you planning on running?!
-
If I were considering a server with 256 GB of RAM, I would go for server hardware and not try to use consumer stuff.
-
I agree with CameronDev, not so much on the capacity but on the bandwidth. At 100+ GB, the Ryzen/Core platforms are really holding you back with their weak I/O.
If you need that much memory, you might be better off picking up a used Xeon/Epyc from eBay. Their CPU speeds are lower, but the quad-channel RAM could make up for it, depending on what you're trying to do.
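For a rough sense of the gap, here's a back-of-envelope comparison (theoretical peak bandwidths with assumed DDR speeds, not measurements):

```python
# Theoretical peak DDR bandwidth: channels x transfers/s x 8 bytes per transfer.
# The speeds and platform labels below are assumptions for illustration.

def peak_bandwidth_gbs(channels: int, mt_per_s: int) -> float:
    """Peak bandwidth in GB/s for a 64-bit (8-byte) bus per channel."""
    return channels * mt_per_s * 8 / 1000

print(peak_bandwidth_gbs(2, 5600))  # dual-channel DDR5-5600 (desktop): 89.6 GB/s
print(peak_bandwidth_gbs(4, 3200))  # quad-channel DDR4-3200 (older Xeon): 102.4 GB/s
print(peak_bandwidth_gbs(8, 3200))  # 8-channel DDR4-3200 (used Epyc): 204.8 GB/s
```

Real-world numbers come in lower, but the ratio between the platforms is what matters here.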
-
The real question is: why do you need this much memory?
If it's not actually going to be used, you're spending more money acquiring it now than you would later.
-
Intel's published CPU RAM limits are often wrong for some reason. If a mainboard for that CPU supports more, it'll probably work. I usually search forums to see if someone is running the same configuration and how much RAM they got working.
-
But aside from buying a real truck instead of a Typhoon, Intel's memory support might not be a hard limit. It probably is, but it might not be.
More likely the board is wired for 256 GB, so if a new processor comes along with support for 256 GB, it will work.
-
For clarification: it's for a Proxmox instance. I want to use the RAM for open webzine/ollama.
-
If you're really going to need that much RAM, start looking at servers with multiple sockets. They support absurd amounts of RAM in a single chassis. I think the biggest regularly-available servers have four sockets, but all but the most basic have two.
-
What the heck are you self-hosting that anything beyond 64 GB is even under consideration?
-
I personally believe you are overbuilding. For example, my OpenMediaVault Samba and DLNA server runs on a single-board computer that has 256 megabytes of RAM. Yes, MB. And it still has RAM free without touching swap.
-
I'd say this is the correct answer. If you're actually using that much RAM, you probably want it connected to the processor with a wide (fast) bus. I rarely see people do it with desktop or gaming processors. It might be useful for some edge cases, but usually you either want an Epyc processor or something like that, or it's just way too much RAM.
-
What is openwebzine? Can't find any info on it.
-
Look up what system vendors will sell for that CPU. If they sell 256 GiB, then you are likely good.
I find I never upgrade after the first couple of months. I would max it out, or get multi-CPU boards where I cannot afford to max it out.
-
And 4 sticks are 4 times more prone to breaking down.
-
I've seen that some people want it to host their own LLM. It's far cheaper to buy DDR5 memory than to somehow get 100+ GB of VRAM. Whether or not this is a good idea is another question.
-
sorry, fat fingers on the tablet: I meant "open webui".
-
Oh, I'm not using it for OMV and Samba. I'm using it for ollama/open webui with RAM instead of VRAM.
-
My edge case: I want to spin up an AI LXC in Proxmox with ollama and open webui, using RAM instead of VRAM, but it should be low on power consumption at idle. That's why I want an Intel i9 or Core Ultra 9 with maxed-out RAM: it idles at low power, but can run bigger AI models using RAM instead of VRAM. It wouldn't be as fast as with GPUs, but that's OK.
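Once the LXC is up, a quick way to sanity-check it is to hit Ollama's HTTP API, which listens on port 11434 by default. A minimal sketch; the container IP and model name below are placeholders, not my actual setup:

```python
# Smoke test against a stock Ollama install inside the LXC.
# Assumptions: default port 11434, a hypothetical container IP,
# and a model (here "llama3") already pulled with `ollama pull`.
import json
import urllib.request

OLLAMA_URL = "http://192.168.1.50:11434/api/generate"  # hypothetical LXC IP

payload = json.dumps({
    "model": "llama3",
    "prompt": "Say hello in one sentence.",
    "stream": False,  # return a single JSON object instead of a stream
}).encode("utf-8")

req = urllib.request.Request(
    OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
)
with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read())

print(body["response"])  # the generated text
# eval_count / eval_duration (ns) let you measure tokens/sec on RAM-only inference.
print(f'{body["eval_count"] / (body["eval_duration"] / 1e9):.2f} tokens/s')
```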
-
AI inference is memory-bound, so memory bus width is the main bottleneck. I also do AI on an (old) CPU, but the CPU itself is mostly idle, waiting for the memory. I'd say it'll likely be very slow, like waiting 10 minutes for a longer answer. I believe all the AI people use Apple silicon because of the unified memory and its bus width. Or some CPU with several memory channels.
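To put a rough number on "very slow": a dense model has to stream essentially all of its weights from RAM for every generated token, so peak bandwidth divided by model size gives an upper bound on tokens per second. A back-of-envelope sketch with assumed figures, not benchmarks:

```python
# Upper bound on CPU inference speed for a dense model:
# every generated token reads (roughly) all weights from RAM once.
# Model size and bandwidths below are assumptions for illustration.

def max_tokens_per_second(bandwidth_gbs: float, model_gb: float) -> float:
    return bandwidth_gbs / model_gb

MODEL_GB = 40.0  # e.g. a ~70B model quantized to 4 bits (assumed size)

for name, bw in [
    ("dual-channel DDR5-5600 desktop", 89.6),
    ("8-channel DDR4-3200 used Epyc", 204.8),
    ("Apple M-series Max unified memory", 400.0),
]:
    print(f"{name}: <= {max_tokens_per_second(bw, MODEL_GB):.1f} tokens/s")
```

At roughly 2 tokens/s on the desktop platform, a few-hundred-token answer really does take minutes, which matches the "10 minutes for a longer answer" experience.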