
Posted

Given that mobile devices, and laptops in particular, are becoming more dominant, I'd prefer it if Nvidia started placing a much greater focus on power and heat efficiency over total output, to bring laptop and desktop cards back in line with each other. They've been sacrificing efficiency for performance for too long. The trend appears to be that more and more PC gamers are using laptops, and since next-gen console specs seem underwhelming, it doesn't look like huge increases in raw grunt will be fully taken advantage of, unlike performance improvements on non-desktop platforms.

Posted

So does this mean you'll be skipping this one and waiting for the 880? ;)

I've been tempted once or twice, but I probably won't be buying another GPU until this 590 breaks, or I build a new rig. Whichever comes first. I'm glad tech seems to have slowed down a bit for a while. It's easier on the wallet.

“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts
Posted (edited)

Yea, reducing the power draw is the thing to go for. Wattages have kept rising to absurd highs over the past few years; despite performance per watt getting better, the actual power draw has just climbed.

 

The AMD 7xxx series is a move in the right direction: a pretty decent GPU (the 7750) without needing auxiliary power.

Seems the Nvidia 6xx series is also slightly less power-hungry than the 5xx, so maybe the overall trend is toward reducing power draw.

 

Now if only AMD could get their act together in the CPU department as well...

 

 

---

PS. I sidegraded from an ATI 4870 to an AMD 7770, which might be a tiny bit more powerful as well, but the main thing is the reduced power, which means reduced heat, which means reduced fan noise, which means my rig is now pretty quiet overall. The darn CPU still gobbles up 125W; I'd be happy to swap to something drawing 75W or less. (Intel would provide, but I'm not going there.)

Edited by Jarmo
Posted (edited)

It's only within the last year or two that the GPU industry has reached the point of maturity in its development, a point the CPU business reached almost a decade ago, where the principal vendors have essentially formed a truce to end the power consumption wars. For CPUs this was the ~125W range 'achieved' by the Pentium 4. For GPUs it appears the final number has ended up about double that (nVidia's Fermi), with a little leeway given to dual-GPU cards. It's not that they can't engineer products beyond these limits - with some aggressive overclocking, a Bulldozer CPU can pull ~400W, more than most complete systems - but such stretching is left to the end user. If and when we move away from the ubiquitous and somewhat outdated ATX design (not holding my breath) then maybe the figures will see revision, but eh.

 

I don't expect the numbers will go down - neither GPU vendor would willingly go that route if it meant sacrificing the marketing benefit of having the one fastest product - but I'd be happy enough to see it at least stay where it is for the foreseeable future.

 

On a personal note, I paid about 25% on top of the price of my video card to get a cooler with a noise level that would allow me to retain my sanity. Even for an overclocked 7950, then, it's still well within reason to engineer an air cooler that can dissipate that ~250W adequately and quietly. I suppose the issue is engineering an equivalent solution economical enough to use as stock cooling.

 


 

As an aside, the GTX 780 isn't a next-gen product anyway; it's more or less a refresh, like the 580 was to the 480, so the performance gain quoted is entirely reasonable. It means the 'full' Kepler (GK110) will never be released as a gaming card and will remain solely in the domain of HPC/workstation-oriented products. Sure, it would have been interesting to see how it performed under gaming loads, but if this decision is final, it's reasonable to assume it was either not possible or at least uneconomic to pursue. I wouldn't necessarily call it a loss: being so compute-oriented, it may not have been much of an impressive gaming card anyway, even if fully realised.

Edited by Humanoid

L I E S T R O N G
L I V E W R O N G
