LLM Local
How much VRAM do I need?
Select a model and get a VRAM estimate.
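Such an estimate can be sketched from the model's parameter count; the fp16 byte width and the overhead factor below are illustrative assumptions, not the app's actual method:

```python
def estimate_vram_gb(params_billion: float,
                     bytes_per_param: float = 2.0,  # fp16 weights (assumption)
                     overhead: float = 1.2) -> float:  # ~20% extra for KV cache/activations (assumption)
    """Rough VRAM footprint in GiB for loading a model for inference."""
    weight_bytes = params_billion * 1e9 * bytes_per_param
    return weight_bytes * overhead / 2**30

# A 7B model in fp16 comes out to roughly 15.6 GiB under these assumptions.
print(f"{estimate_vram_gb(7):.1f} GiB")
```

Lowering `bytes_per_param` (e.g. 0.5 for 4-bit quantization) shrinks the estimate accordingly.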
Can this model run on my GPU?
Check model-to-hardware compatibility.
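A compatibility check like this reduces to comparing the estimated footprint with the card's VRAM; the formula and constants below are the same illustrative assumptions as above, not the app's actual logic:

```python
def fits_on_gpu(model_params_billion: float, gpu_vram_gb: float,
                bytes_per_param: float = 2.0,  # fp16 weights (assumption)
                overhead: float = 1.2) -> bool:  # KV cache/activation margin (assumption)
    """True if the rough estimated footprint fits within the GPU's VRAM."""
    needed_gb = model_params_billion * 1e9 * bytes_per_param * overhead / 2**30
    return needed_gb <= gpu_vram_gb

# e.g. a 7B fp16 model (~15.6 GiB) on a 24 GiB card vs. a 12 GiB card
print(fits_on_gpu(7, 24))  # True
print(fits_on_gpu(7, 12))  # False
```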