Chuck Shultz

GPU Upgrade Question

I want to upgrade from a GTX 960 to something in the 1060-1080 range to move animation rendering to the GPU. (My Intel CPU runs at 4.2GHz.)

Any recommendations? Is there any real advantage to a GPU having 11GB vs 8GB of memory?


  • It's not a collective sum; it's the GBs available on a single card that limit the scene size you're talking about. So three 3GB GPUs still only give you a 3GB scene size that Blender recognizes and uses.
  • thanks   

  • Just a friendly heads-up about current conditions: now is definitely a bad time to buy a new card, especially in those model ranges. Why?


    Video cards (both ATI and Nvidia) are getting bought en masse because of the current crypto gold rush. Store shelves are stripped bare, and cards have been marked up significantly simply due to the sudden explosion in demand. Lotsa people got it in their head that they're gonna get rich real fast if they buy thirty 1080 Tis and run up their electric bill. :P

    A 960 is a solid card that'll carry you pretty far if you're smart with optimization. I use a 970 myself.

    If your renders are REALLY capping out and it's hindering your progress as an artist, perhaps there's not much of a choice; especially if you have that forward momentum going on!

    However, if you're like me and can get away with further optimizing your workflow instead (such as breaking scenes into smaller parts), there's speculation that all those cards are going to hit eBay real quick once mining stops being worth it. Not to mention, you'll learn all sorts of good ways to optimize in the process, and that will serve you well no matter what hardware you're rocking. :)

    Hope this insight proves even the slightest bit helpful. Good luck!

  • GTX 1060 3GB works well for me, although I'm not pushing it super hard. My CPU is only 3GHz though so I probably see a bigger improvement.

  • Thanks... I am running my CPU at 4.2GHz, so I usually use the CPU for rendering as far as speed goes. A lot of my final output is AVI, not stills, so I'm thinking of building a new CPU machine just for rendering AVIs of 10 seconds or more...

  • Thanks... I was wondering why prices jumped so fast in the last 6 months while Nvidia's stock keeps climbing...

    I looked on YouTube for tuning a 960 but couldn't find anything that made sense... does anyone know of a place that does a good job showing how to tune a 960?

    I am looking to produce AVIs longer than 10 seconds as the final product, but right now I keep getting "out of memory" errors...

  • ffxswan The "out of memory" error occurs when your scene is too large to be loaded into your card's memory for rendering. So it's not a matter of whether your GPU is fast enough to render the scene; it's simply failing to load the meshes. This can be fixed by splitting your scene into render layers, then compositing them back together after rendering. Not too much hassle, and it can help you get the most out of your GPU.
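    If it helps, here's a rough sketch of that render-layer split using Blender's Python API (bpy, 2.79 era). The layer names, which scene layers go where, and the node wiring are all just assumptions for illustration; it only runs inside Blender:

    ```python
    # Sketch: split a scene into two render layers so each render pass
    # loads less geometry, then composite them back with Alpha Over.
    import bpy

    scene = bpy.context.scene

    # Two render layers, each seeing a different set of scene layers
    # (which objects live on which layer is up to you).
    fg = scene.render.layers.new("Foreground")
    fg.layers = [i == 0 for i in range(20)]  # scene layer 1 only (assumed)
    bg = scene.render.layers.new("Background")
    bg.layers = [i == 1 for i in range(20)]  # scene layer 2 only (assumed)

    # Wire the compositor: Background under, Foreground over.
    scene.use_nodes = True
    tree = scene.node_tree
    tree.nodes.clear()
    fg_node = tree.nodes.new("CompositorNodeRLayers")
    fg_node.layer = "Foreground"
    bg_node = tree.nodes.new("CompositorNodeRLayers")
    bg_node.layer = "Background"
    mix = tree.nodes.new("CompositorNodeAlphaOver")
    out = tree.nodes.new("CompositorNodeComposite")
    tree.links.new(bg_node.outputs["Image"], mix.inputs[1])  # background
    tree.links.new(fg_node.outputs["Image"], mix.inputs[2])  # foreground
    tree.links.new(mix.outputs["Image"], out.inputs["Image"])
    ```

    Each render layer is rendered separately, so the GPU only needs to hold one chunk of the scene at a time; the composite output stitches the final frame together.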

  • thanks....good tip

  • Card prices are crazy because of mining. That being said, AMD works well in Blender, and you can typically get Nvidia-level rendering results at half the price with AMD cards. The drivers are not as polished as Nvidia's, but unless you care about gaming, AMD cards perform well for 3D workloads. I'm not against Nvidia (I have a card lying around), but with 2 AMD GPUs, or 2 of the AMD workstation cards that house 2 GPUs each, you are going to get nearly the same or in some cases slightly better performance for less than a Titan V, roughly 1-2k less depending on which cards you go with.

    Look at stock market analysts remarking on how AMD is gaining ground on Nvidia specifically because of crypto mining: heavy computational workloads that make money, and AMD cards seem to outperform Nvidia cards in hash rates. I was looking at Titan XPs and decided to go with Vega FEs instead. They have more memory, and I could get a closed-loop water-cooled card for the same price as an air-cooled Titan XP. If you have the money, I would consider FirePros, which are nearly half the price of their Nvidia counterparts, and the performance difference is not nearly as large as the price difference.

    Also note the entire Intel CPU fiasco, where fixing a gaping security flaw reduced chip speed by as much as 30%. Another win for AMD and Threadripper.

  • Doesn't Blender only support Nvidia?

  • 2.79 began "full" OpenCL support (I think, via AMD), and it should only get better from here. Before that, it seemed to be hit or miss. I only played a little with the 2 versions before 2.79 and my commitment to this new hobby. On the first version my AMD card was fine; on the next, I was trying to change code to make things work. I got an Nvidia 1070. Then 2.79 came out and I opted for Vegas due to the new support.

  • I was wondering about AMD cards now... I tend to stick to Nvidia as I run multiple monitors (3 or 4), and they seem to work better with that setup... is anyone currently running 2.79 with multiple monitors on an AMD video card? Thanks