Ask Lemmy @lemmy.world
cheese_greater @lemmy.world · 23h ago
What's a good local and free LLM for Windows?
The OS isn't as important as the hardware being used.
AMD, Nvidia or Intel GPU?
How much RAM & VRAM are you working with?
What's your CPU?
Generally speaking, I would suggest koboldcpp with Gemma 3; a minimal query sketch follows the links below.
https://github.com/LostRuins/koboldcpp?tab=readme-ov-file#windows-usage-precompiled-binary-recommended
https://huggingface.co/mlabonne/gemma-3-27b-it-abliterated-GGUF/blob/main/gemma-3-27b-it-abliterated.q6_k.gguf
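If you'd rather hit the running model from a script than the browser UI, something like this works. A minimal sketch, assuming koboldcpp is serving its KoboldAI-compatible API on the default port 5001; the endpoint path, parameters, and response shape below reflect that API's usual defaults, but check your version:

```python
# Minimal sketch: query a locally running koboldcpp instance.
# Assumes the default port (5001) and the KoboldAI-compatible
# /api/v1/generate endpoint; adjust if you changed either.
import json
import urllib.request

def generate(prompt: str, max_length: int = 200) -> str:
    """Send a prompt to the local koboldcpp server and return its reply."""
    payload = json.dumps({
        "prompt": prompt,
        "max_length": max_length,  # number of tokens to generate
        "temperature": 0.7,
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:5001/api/v1/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # The KoboldAI-style response nests generated text under results[0].text.
    return body["results"][0]["text"]

if __name__ == "__main__":
    print(generate("Explain what a GGUF file is in one sentence."))
```

Start the koboldcpp binary with the GGUF file first; it loads the model and brings up the local server before the script can connect.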
What are the minimum requirements for running it?
Lots of RAM and a good CPU (it benefits from more cores), if you're comfortable with it being on the slow side.
There are other versions of that model optimized for lower-VRAM setups too.
But for better performance, figure 8 GB of VRAM minimum; a rough memory estimate per quantization is sketched below.
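For a quick sense of what the lower-VRAM quantizations buy you, here's a back-of-the-envelope estimate. This is my own arithmetic, not official numbers: the bits-per-weight figures are approximate, and real GGUF files add overhead for metadata and the context (KV) cache.

```python
# Rough sketch: estimate a quantized model's memory footprint from its
# parameter count and effective bits per weight. The bit widths here are
# approximations; actual GGUF files vary slightly and need extra headroom.
BITS_PER_WEIGHT = {
    "q4_k_m": 4.8,   # approximate effective bits/weight
    "q6_k":   6.6,
    "q8_0":   8.5,
    "f16":   16.0,
}

def estimate_gb(params_billions: float, quant: str) -> float:
    """Approximate model weight size in GB for a given quantization."""
    bits = BITS_PER_WEIGHT[quant]
    return params_billions * 1e9 * bits / 8 / 1e9  # weights -> bytes -> GB

for quant in BITS_PER_WEIGHT:
    print(f"gemma-3-27b @ {quant}: ~{estimate_gb(27, quant):.0f} GB")
```

By that math, the q6_k file linked above lands around 22 GB, which is why it spills out of VRAM into system RAM on common 8–16 GB GPUs and runs slower there.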