Dual GPU GTX 690 Announced


Bokishi


A $1000 vid card, the new new new high end, made of magnesium alloy, with built-in vapor chambers

 

http://www.techpower...0-Launched.html

Double that price for the Australian market...

 

Magnesium, huh? I know from experience that it burns very brightly when it gets hot :p

“He who joyfully marches to music in rank and file has already earned my contempt. He has been given a large brain by mistake, since for him the spinal cord would surely suffice.” - Albert Einstein
 


A $1000 vid card, the new new new high end, made of magnesium alloy, with built-in vapor chambers

 

http://www.techpower...0-Launched.html

...oooooo, tech porn. Don't tease me so. And its metallic casing. *fapfap*

 

But I think I can wait for the dual 890 or 1090 or whatever, which will probably need a mini-nuclear reactor to power them.

“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts

Why has elegance found so little following? Elegance has the disadvantage that hard work is needed to achieve it and a good education to appreciate it. - Edsger Wybe Dijkstra


Guess I'd need a bigger PSU then. More power, yaaarrrr.

I wonder if they ran Metro 2033 with all the DX11 stuff on. Not seeing a huge improvement (or any improvement really) vs. my 590 in that game.

 

I'd still want it if I could justify the price at the moment, tho. Which I can't. Heh

Edited by LadyCrimson
“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts

The 690 is based on far smaller, more power-efficient chips than the 590; the previous-gen Fermi chips were possibly the most power-hungry GPUs ever. If you can run a 590, you can run a 690 with plenty of headroom to spare. I have to say those power numbers in general don't look right - even considering nVidia's tendency to, ah, bend the truth when it comes to TDP, the GK104 sips power compared to Tahiti (the 79x0), so something is definitely off when both the 690 and the 680 SLI setups are consuming more than the 7970 CF.

 

Personally I would be comfortable (in the hypothetical situation in which I was gifted one of these cards) running it on my 650W PSU, and indeed would be on a 550W model, assuming the CPU wasn't overclocked something silly. The thing I'm not comfortable with is spending $1000+ on a card which will probably hit the VRAM ceiling far sooner than it runs out of actual grunt - possibly in the case of a 25x16 screen, and most definitely on any triple-screen setup worth its salt. 4GB 680s are meant to be available about now; the 690 could really use that extra memory.
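
To put rough numbers on that (just a back-of-the-envelope sketch in Python; the CPU and rest-of-system wattages below are assumptions, not measurements):

# Rough power budget for a GTX 690 system (all figures are assumed estimates)
GPU_W = 300              # GTX 690 nominal board power
CPU_W = 130              # high-end quad core at stock clocks (assumed)
REST_OF_SYSTEM_W = 60    # board, RAM, drives, fans (assumed)

total_w = GPU_W + CPU_W + REST_OF_SYSTEM_W
for psu_w in (550, 650):
    print(f"{psu_w}W PSU: ~{total_w}W estimated load, {psu_w - total_w}W headroom")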

L I E S T R O N G
L I V E W R O N G


In the end it depends on the reasoning for this GK104 chip being released as the 680 in the first place, as the xx4-codenamed chips are normally reserved for the mid-range. The specs bear that out - small die, low power consumption, minimal compute power - and it's known that the GK110, the high-end-codenamed chip, exists. Now, either nV decided to release only the mid-range chip as their top-end solution because AMD's counterpart this generation was so underwhelming that they didn't need to release the full version of Kepler (and therefore could make a fortune selling a cheap chip at high-end prices), or GK110 was so complex that yield issues made it commercially non-viable to produce and sell at the consumer level (as opposed to releasing it as Quadro chips at several times the price).

L I E S T R O N G
L I V E W R O N G


Bang - and just like that, there are only two gaming video cards left on the market that are worth buying. Forgive the hyperbole, but the GTX 670 looks to have obsoleted every single card above the $250 HD 7850, on both sides. The 7870, 7950, 7970, 680 and 690 now aren't even worth considering in all but the most obscure fringe cases, unless every card listed there receives *at least* a $50 price cut.

L I E S T R O N G
L I V E W R O N G


And your reasoning for this "hyperbole" is? Price vs. performance? Some new benchmarks? :)

“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts

Anandtech is my go-to site for reliable reviews, and this is theirs - but the findings are pretty universal: 7% or less performance difference to the GTX 680 and roughly equal to the 7970, with a $100 price difference to the former and $80 to the latter.
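
To illustrate the value argument with those numbers (a sketch; the prices and the performance index are rounded assumptions, not benchmark data):

# Hypothetical price/performance comparison, performance indexed to the GTX 680 = 100
cards = {
    "GTX 670": (400, 93),   # roughly 7% behind the 680 (assumed index)
    "GTX 680": (500, 100),
    "HD 7970": (480, 93),   # roughly equal to the 670 (assumed index)
}
for name, (price_usd, perf) in cards.items():
    print(f"{name}: {perf / price_usd:.3f} performance per dollar")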

L I E S T R O N G
L I V E W R O N G


A $1000 vid card, the new new new high end, made of magnesium alloy, with built-in vapor chambers

 

http://www.techpower...0-Launched.html

Double that price for the Australian market...

 

This.

 

Magnesium, huh? I know from experience that it burns very brightly when it gets hot :p

 

I think that once your computer has reached 473 degrees Celsius (autoignition temp of magnesium), you've got other problems besides your graphics card burning.


So I just discovered this card requires 300 watts under load. And 150 watts at idle. That's probably the worst power consumption I have ever seen for a processing unit.

 

Not only is the card expensive, but so is the yearly electricity bill!
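
For a rough idea of that bill (a sketch; the hours of use and the electricity price are assumptions, the wattages are the figures quoted above):

# Hypothetical yearly electricity cost for the card alone
LOAD_W, IDLE_W = 300, 150      # figures quoted above
GAMING_HOURS = 4               # per day, assumed
IDLE_HOURS = 8                 # per day, assumed; PC off the rest of the time
PRICE_PER_KWH = 0.15           # dollars, assumed

daily_kwh = (LOAD_W * GAMING_HOURS + IDLE_W * IDLE_HOURS) / 1000
print(f"~{daily_kwh:.1f} kWh/day, roughly ${daily_kwh * 365 * PRICE_PER_KWH:.0f} a year for the card alone")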


It's a lot in absolute terms, but actually less than the previous generation - both the 6990 (dual 6970) and 590 (dual cut-down 580) were significantly more power-hungry cards, and also more power-hungry individual GPUs. At 170W, a single 680 is the most "efficient" top-end GPU in a long time.

 

A technical aside: 300W is the paper limit they stick to because of the PCI-E specification, but there's nothing stopping them from enabling an alternate mode that blows right past it. The previous gen, as mentioned, with some voltage hikes and overclocked cores, would blow past 400W - the 6990 in particular had a BIOS switch that, when toggled to the alternate mode, would essentially unrestrict everything, allowing it to fly past the nominal 450W that its stock cooler was rated for.
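
For reference, the spec budget being worked within (a sketch using the standard PCI-E per-connector ratings; the 690 carries two 8-pin connectors):

# PCI-E power budget vs. the 300W "paper" limit
SLOT_W = 75         # PCI-E x16 slot
EIGHT_PIN_W = 150   # per 8-pin PEG connector
budget_w = SLOT_W + 2 * EIGHT_PIN_W   # two 8-pin connectors on the 690
print(f"In-spec budget: {budget_w}W, i.e. {budget_w - 300}W above the nominal 300W TDP")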

 

 

On the plus side at least both vendors are finally working on idle consumption as well. AMD in particular has a "long idle" state it can fall back to which sips just a couple watts. It's offset, unfortunately, by their load power being a fair bit higher than nV's this generation, but still.

L I E S T R O N G
L I V E W R O N G


Yeah, Kepler is designed to be nVidia's most energy-efficient chip. It automatically overclocks and underclocks itself depending on what program you are running. When I run Skyrim with HD texture mods + ENB lighting @ 5360x1600, I can feel the back of the 680 card and it's not hot like my old 280s, just warm.

