A 7B model can run on a GPU with just 6 GB of VRAM, provided the weights are stored in a compressed (quantized) format, which essentially every modern NVIDIA GPU can handle.
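A quick back-of-the-envelope calculation shows why the compression matters. This is only a sketch of the weight footprint; real runtimes also need room for the KV cache and activations, so actual usage is somewhat higher:

```python
# Rough VRAM needed just to hold a 7B model's weights
# at different precisions (weights only, no KV cache).
PARAMS = 7e9  # 7 billion parameters

def weight_gib(bits_per_param: float) -> float:
    # bits -> bytes -> GiB
    return PARAMS * bits_per_param / 8 / 2**30

for name, bits in [("fp16", 16), ("int8", 8), ("4-bit", 4)]:
    print(f"{name}: {weight_gib(bits):.1f} GiB")
# fp16 (~13 GiB) does not fit in 6 GB; 4-bit (~3.3 GiB) does.
```

So an uncompressed fp16 7B model needs roughly 13 GiB, but a 4-bit quantized version fits comfortably in 6 GB.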
-
If the AI assistant runs locally, this is great. If it uses the cloud, well, that's going to cost money somehow.