NVIDIA GeForce GTX 690 Review: Ultra Expensive, Ultra Rare, Ultra Fast
by Ryan Smith on May 3, 2012 9:00 AM EST

In an unusual move, NVIDIA took the opportunity earlier this week to announce a new 600 series video card before they would be shipping it. Based on a pair of Kepler GK104 GPUs, the GeForce GTX 690 would be NVIDIA’s new flagship dual-GPU video card. And by all metrics it would be a doozy.
Packing a pair of high-clocked, fully enabled GK104 GPUs, NVIDIA was targeting GTX 680 SLI performance in a single card, the kind of dual-GPU card we haven’t seen in quite some time. GTX 690 would be a no-compromise card – quieter and less power hungry than GTX 680 SLI, as fast as GTX 680 in single-GPU performance, and as fast as GTX 680 SLI in multi-GPU performance. And at $999 it would be the most expensive GeForce card yet.
After the announcement and based on the specs it was clear that GTX 690 had the potential, but could NVIDIA really pull this off? They could, and they did. Now let’s see how they did it.
| | GTX 690 | GTX 680 | GTX 590 | GTX 580 |
| --- | --- | --- | --- | --- |
| Stream Processors | 2 x 1536 | 1536 | 2 x 512 | 512 |
| Texture Units | 2 x 128 | 128 | 2 x 64 | 64 |
| ROPs | 2 x 32 | 32 | 2 x 48 | 48 |
| Core Clock | 915MHz | 1006MHz | 607MHz | 772MHz |
| Shader Clock | N/A | N/A | 1214MHz | 1544MHz |
| Boost Clock | 1019MHz | 1058MHz | N/A | N/A |
| Memory Clock | 6.008GHz GDDR5 | 6.008GHz GDDR5 | 3.414GHz GDDR5 | 4.008GHz GDDR5 |
| Memory Bus Width | 2 x 256-bit | 256-bit | 2 x 384-bit | 384-bit |
| VRAM | 2 x 2GB | 2GB | 2 x 1.5GB | 1.5GB |
| FP64 | 1/24 FP32 | 1/24 FP32 | 1/8 FP32 | 1/8 FP32 |
| TDP | 300W | 195W | 365W | 244W |
| Transistor Count | 2 x 3.5B | 3.5B | 2 x 3B | 3B |
| Manufacturing Process | TSMC 28nm | TSMC 28nm | TSMC 40nm | TSMC 40nm |
| Launch Price | $999 | $499 | $699 | $499 |
As we mentioned earlier this week during the unveiling of the GTX 690, NVIDIA is outright targeting GTX 680 SLI performance with this card, unlike the GTX 590, which was notably slower than GTX 580 SLI. As GK104 is a much smaller and less power hungry GPU than GF110 from the get-go, NVIDIA doesn’t have to do nearly as much binning in order to get suitable chips to keep power consumption in check. The consequence of course is that, much like GTX 680, GTX 690 is a smaller step up than what NVIDIA has done in previous years (e.g. GTX 295 to GTX 590), as GK104’s smaller size means it isn’t the same kind of massive monster that GF110 was.
In any case, for GTX 690 we’re looking at a base clock of 915MHz, a boost clock of 1019MHz, and a memory clock of 6.008GHz. Compared to the GTX 680 this is 91% of the base clock, 96% of the boost clock, and the same memory bandwidth; this is the closest a dual-GPU NVIDIA card has ever been to its single-GPU counterpart, particularly when it comes to memory bandwidth. Furthermore GTX 690 uses fully enabled GPUs – every last CUDA core and every last ROP is active – so the difference between GTX 690 and GTX 680 is outright the clockspeed difference and nothing more.
Of course this does mean that NVIDIA had to make a clockspeed tradeoff here to get GTX 690 off the ground, but their ace in the hole is going to be GPU Boost, which significantly eats into the clockspeed difference. As we’ll see when we get to our look at performance, in spite of NVIDIA’s conservative base clock the performance difference is frequently closer to the smaller boost clock difference.
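To put quick numbers on that clockspeed tradeoff, here’s a back-of-the-envelope sketch (Python; the figures come straight from the spec table above, and actual boost clocks will vary from game to game and card to card):

```python
# Back-of-the-envelope comparison of GTX 690 vs. GTX 680 clockspeeds,
# using the official specifications from the table above.
gtx680 = {"base_mhz": 1006, "boost_mhz": 1058}
gtx690 = {"base_mhz": 915, "boost_mhz": 1019}  # per GPU

base_ratio = gtx690["base_mhz"] / gtx680["base_mhz"]
boost_ratio = gtx690["boost_mhz"] / gtx680["boost_mhz"]

print(f"Base clock:  {base_ratio:.1%} of GTX 680")   # ~91.0%
print(f"Boost clock: {boost_ratio:.1%} of GTX 680")  # ~96.3%
```

In other words, when a game has boost headroom the per-GPU deficit narrows from roughly 9% to under 4%, which is why GTX 690 performance lands so close to GTX 680 SLI.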
As another consequence of using the more petite GK104, NVIDIA’s power consumption has also come down for this product range. Whereas GTX 590 was a 365W TDP product and definitely used most of that power, GTX 690 in its stock configuration takes a step back to 300W. And even that is a worst-case scenario, as NVIDIA’s 263W power target for GPU Boost means that power consumption in a number of games (basically anything with boost headroom) is well below 300W. For the adventurous, however, the card is overbuilt to the same 365W specification as the GTX 590, which opens up some interesting overclocking opportunities that we’ll get into in a bit.
For these reasons the GTX 690 should (and does) perform at near parity with GTX 680 SLI, so NVIDIA has no reason to be shy about pricing and has shot for the moon. The GTX 680 is $499, a pair of GTX 680s in SLI would be $998, and since the GTX 690 is supposed to be a pair of GTX 680s, it’s priced like one at $999. This makes the GTX 690 the single most expensive consumer video card of the modern era, surpassing even 2007’s GeForce 8800 Ultra. It’s incredibly expensive and that price is going to raise some considerable ire, but as we’ll see when we get to our look at performance, NVIDIA has reasonable justification for it – at least if you consider $499 for the GTX 680 reasonable.
Because of its $999 price tag, the GTX 690 has little competition. Besides GTX 680 SLI, its only other practical competition is AMD’s Radeon HD 7970 in CrossFire, which at MSRP would be about $40 cheaper at $958. We’ve already seen that the GTX 680 has a clear lead over the 7970, but thanks to differences in CrossFire/SLI scaling that logic will have a wrench thrown into it. But more on that later.
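The arithmetic behind those price comparisons is simple enough to sketch out (Python; using the MSRPs discussed above):

```python
# Launch-price comparison of the GTX 690's practical competition.
gtx690 = 999              # single dual-GPU card
gtx680_sli = 2 * 499      # two GTX 680s in SLI
hd7970_cf = 2 * 479       # two Radeon HD 7970s in CrossFire

print(f"GTX 680 SLI: ${gtx680_sli}")                     # $998
print(f"HD 7970 CF:  ${hd7970_cf}")                      # $958
print(f"CF savings vs. GTX 690: ${gtx690 - hd7970_cf}")  # $41
```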
Finally, there’s the elephant in the room: availability. As it stands NVIDIA cannot keep the GTX 680 in stock in North America, and while the GTX 690 may be a very low volume part due to its price, it requires two binned GPUs, which are going to be even harder to come by. NVIDIA has not disclosed the specific number of cards that will be available for the launch, but after factoring in the fact that OEMs will be sharing in this stockpile, it’s clear that the retail allocations are going to be small. The best bet for potential buyers is to keep a very close eye on Newegg and other e-tailers, as like the GTX 680 it’s unlikely these cards will stay in stock for long.
The one bit of good news is that while cards will be rare, you won’t need to hunt across many vendors. As with the GTX 590 launch, NVIDIA is only using a small number of partners to distribute cards; for North America this will be EVGA and Asus, and that’s it. So unlike the GTX 680 you will only need to watch two product listings instead of a dozen. Longer term I have no reason to doubt that NVIDIA can produce these cards in sufficient volume once they have plenty of GPUs, but until TSMC’s capacity improves NVIDIA has no chance of meeting the demand for GK104 or any of the products based on it.
Spring 2012 GPU Pricing Comparison

| AMD | Price | NVIDIA |
| --- | --- | --- |
| | $999 | GeForce GTX 690 |
| | $499 | GeForce GTX 680 |
| Radeon HD 7970 | $479 | |
| Radeon HD 7950 | $399 | GeForce GTX 580 |
| Radeon HD 7870 | $349 | |
| | $299 | GeForce GTX 570 |
| Radeon HD 7850 | $249 | |
| | $199 | GeForce GTX 560 Ti |
| | $169 | GeForce GTX 560 |
| Radeon HD 7770 | $139 | |
200 Comments
JPForums - Thursday, May 3, 2012
Not mine. I'm running a 1920x1200 IPS.
1920x1200 is more common in the higher-end monitor market.
A quick glance at Newegg shows 16 1920x1200 models at 24" alone (starting at $230).
Besides, I can't imagine many people buy a $1000 video card and pair it with a single $200 display.
It makes more sense to me to check 1920x1200 performance than 1920x1080 for several reasons:
1) 1920x1200 splits the difference between 16x10 and 25x14 or 25x16 better than 1920x1080 (see the quick pixel-count sketch after this list):
1680x1050 = ~1.8MP
1920x1080 = ~2MP
1920x1200 = ~2.3MP
2560x1440 = ~3.7MP
2560x1600 = ~4MP
2) People willing to spend $1000 for a video card are generally in a better position to get a nicer monitor. 1920x1200 monitors are more common at higher prices.
3) They already have three of them around to run 5760x1200. Why go get another monitor?
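For reference, a quick sketch of where those megapixel figures come from (Python; purely illustrative):

```python
# Pixel counts, in megapixels, for the resolutions discussed above.
resolutions = [(1680, 1050), (1920, 1080), (1920, 1200),
               (2560, 1440), (2560, 1600)]

for w, h in resolutions:
    print(f"{w}x{h} = {w * h / 1e6:.1f}MP")
# 1680x1050 = 1.8MP, 1920x1080 = 2.1MP, 1920x1200 = 2.3MP,
# 2560x1440 = 3.7MP, 2560x1600 = 4.1MP
```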
Opinionated Side Points:
Movies transitioned to resolutions much wider than 1080P long ago. A little extra black space really makes no difference.
1920x1200 is a perfectly valid resolution. If Nvidia is having trouble with it, I want to know. When particular resolutions don't scale properly, it's probable that either a bug or shenanigans are at work in the more common resolutions.
I prefer using 1920x1200 as a starting point for moving to triple screen setups. I already think 1920x1080 looks squashed, so 5760x1080 looks downright flattened. Also 3240x1920 just doesn't look very surround to me (3600x1920 seems borderline surround).
CeriseCogburn - Saturday, May 5, 2012
There are only 18 models available in all of Newegg with 1920x1200 resolution - only 6 of those are under $400, and they are all over $300.
There are 242 models available in 1920x1080, with nearly 150 models under $300.
You people are literally a bad joke when it comes to even a tiny shred of honesty.
Lerianis - Sunday, May 6, 2012
I don't know about the 'sadly' there in all honesty. I personally like 1920*1080 better than *1200, because nearly everything is done in the former resolution.

Stuka87 - Thursday, May 3, 2012
Who buys a GTX 690 to play on a 1080p display? Even a 680 is overkill for 1080. You can save a lot of money with a 7870 and still run everything out there.

vladanandtechy - Thursday, May 3, 2012
Stuka i agree with you.....but when you buy such a card....you think in the future....5 maybe 6 years....and i can't guarantee that we will do gaming in 1080p then:)....

retrospooty - Thursday, May 3, 2012
"Stuka i agree with you.....but when you buy such a card....you think in the future....5 maybe 6 years....and i can't gurantee that we will do gaming in 1080p then:)...."I have to totally disagree with that. Anyone that pays $500+ for a video card is a certain "type" of buyer. That type of buyer will NEVER wait 5-6 years for an upgrade. That guy is getting the latest and greatest of every other generation, if not every generation of cards.
vladanandtechy - Thursday, May 3, 2012
You shouldn't "totally disagree".......meet me...."the exception"....i am the type of buyer who is looking for the "long run"....but i must confess....if i could....i would be the type of buyer you describe....

cyaorionismud - Thursday, May 3, 2012
retrospooty, and I mean you no disrespect, but if you're spending $500 and buying for the "long run," you're doing it wrong.

If you had spent $250, you could have 80% of the performance for 2.5 years, then spend another $250 and have 200% of the performance for the remaining 2.5 years.
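A rough way to put numbers on that argument (Python; "performance-years" is an illustrative metric assumed here, not something from the comment itself):

```python
# Illustrative "performance-years" comparison of the two buying strategies,
# with performance measured relative to today's $500 card.
flagship = 1.0 * 5.0                  # one $500 card at 100% for 5 years
staggered = 0.8 * 2.5 + 2.0 * 2.5    # $250 at 80%, then $250 at 200%

print(f"$500 up front:      {flagship:.1f} perf-years")   # 5.0
print(f"2 x $250 staggered: {staggered:.1f} perf-years")  # 7.0
```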
von Krupp - Thursday, May 3, 2012
Don't say that.

I bought two (2) HD 7970s on the premise that I'm not going to upgrade them for a good long while. At least four years, probably closer to six. I ran from 2005 to 2012 with a GeForce 7800GT just fine, and my single core AMD CPU was actually the larger reason why I needed to move on.
Now granted, I also purchased a snazzy U2711 just so the power of these cards wouldn't go to waste (though I'm quite CPU-bound by this i7-3820), but I don't consider dropping AA in future titles to maintain performance to be that big of a loss; I already only run with 8x AF because, frankly, I'm too busy killing things to notice otherwise. I intend to drive this rig for the same mileage. It costs less for me to buy the best of the best at the time of purchase for $1000 and play it into the ground than to keep buying $350 cards to barely keep up every two years, all over a seven year duration. Since I now have this fancy 2560x1440 resolution and want to use it, the $250-$300 offerings don't cut it. And then, don't forget to adjust for inflation year over year.
So yes, I'm going to be waiting between 4 and 6 years to upgrade. Under certain conditions, buying the really expensive stuff is as much of an economical move as it is a power grab. Not all of us who build $3000 computers do it on a regular basis.
P.S. Thank you consoles for extending PC hardware life cycles. Makes it easier to make purchases.
Makaveli - Thursday, May 3, 2012
lol agree, let's put a $500 videocard with a $200 TN panel at 1920x1080 umm ya no!