RustyBug:
jhapeman wrote:
the performance gain from 64 vs 48 GPU cores was negligible. I'm sure with code optimization that will change some,
Which makes me wonder ... how many GPU cores will code optimization actually target?
I mean, will software target optimization toward 8, 16, 24, 32, 48, 64, or 128 cores? It seems the bell curve of consumers will NOT be running rigs with 64 GPU cores, whereas most everyone will soon have at least 8. What do you think developers might consider the "sweet spot", or will it be sky's the limit, more is better, to infinity and beyond?
I'm thinking the bell curve is going to land in 16-GPU-core territory, though that may still be a ways off. Or, asked a different way: how long will it be before 32 GPU cores (i.e., more than 16) are considered "the norm" for software development? We've seen this game before ... and if history is any indication, the "future" doesn't come around nearly as fast as the premise of "future proofing" suggests.
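A quick back-of-the-envelope sketch of why those gains flatten: Amdahl's law says only the parallel part of the work scales with core count. The 20% serial fraction below is a made-up illustration, not a measured Lightroom number.

# Amdahl's law sketch: speedup vs. GPU core count.
# serial_fraction (0.20) is an assumed, illustrative value,
# NOT a measured figure for Lightroom or any real app.

def speedup(cores: int, serial_fraction: float = 0.20) -> float:
    """Only the parallel portion of the work scales with core count."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for cores in (8, 16, 24, 32, 48, 64, 128):
    print(f"{cores:3d} cores -> {speedup(cores):.2f}x over 1 core")

With even a modest serial fraction, the jump from 48 to 64 cores works out to a few percent at best ... which lines up with jhapeman's "negligible" observation above, and suggests the practical sweet spot sits well below the top-end core counts.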
My understanding is that (in addition to more GPU cores) the Max version has double the video encoders ... giving video folks a 2X (ish) boost. So, for a stills application, the basic gains for the Max would be more GPU cores (24 or 32), the option to install 64GB RAM, and double the memory bandwidth (from 200GB/s to 400GB/s).
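For a sense of what that bandwidth doubling means for one still image, here's some rough arithmetic; the megapixel count and bytes-per-pixel are illustrative assumptions, not Lightroom internals.

# Rough arithmetic: time for one full-image memory pass at
# M1 Pro (200GB/s) vs. M1 Max (400GB/s) bandwidth.
# Megapixel count and bytes/pixel are assumed for illustration.

MEGAPIXELS = 45          # e.g., a high-resolution full-frame raw
BYTES_PER_PIXEL = 16     # assumed RGBA, 32-bit float per channel

image_bytes = MEGAPIXELS * 1e6 * BYTES_PER_PIXEL  # ~720 MB working buffer

for bandwidth_gb_s in (200, 400):
    ms = image_bytes / (bandwidth_gb_s * 1e9) * 1e3
    print(f"{bandwidth_gb_s} GB/s -> ~{ms:.1f} ms per full-image pass")

Either way a single pass is only a few milliseconds, so bandwidth alone would only matter for a stills app if the pipeline makes a great many passes per edit ... which feeds directly into the next question.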
So, the question remains whether LR can take advantage of the 64GB (vs. 32GB), and then whether LR can take advantage of the additional GPU cores (24 or 32 vs. 16) to help offset the OP's complaint.
The short question is thus ... is there a compelling reason to have more than 16 GPU cores to help with the OP's LR issue?