More VRAM is better - and more expensive
" AI needs horsepower and Ollama needs GPUs, but you don't have to run out and hand over your life savings to get an RTX 5090, either."
"The RTX 3060 is something of a darling of the AI community because of its 12GB of VRAM and its relatively low cost. Memory bandwidth is significantly lower, but so is the TDP at just 170W. You could have two of these and match the TDP and total VRAM of an RTX 3090, while spending much less."