Nvidia RTX Series


Bokishi

I'll still wait for the 40xx series. Or whenever I can do AAA 4k gaming at 144fps (...or, 8k at 60fps? haha) since 4k at 60+ is good enough for now.  :*


Good that they ended up cheaper than rumoured, though even the cheapest of the cards announced today is still ~25% more than I'm prepared to pay for any video card so I'll sit tight for 3060 news. That said, if it might be competitive for the next ten years and no viable option turns up in my price range, perhaps it would be okay - my current card was originally released almost seven years ago after all.

It seems absurd to think of PC hardware potentially aging like that these days, but in no way is that a complaint. It's also admittedly heavily influenced by the type of game I tend to play, with ones truly taxing on graphics hardware coming once every few years at best. I mean, the most graphically intensive game I've seriously played is still The Witcher 3, which is over five years old now. So it's not one or the other, but rather a "fortunate" confluence of relatively slow hardware progression and my disinterest in the majority of graphical heavyweight games over the past decade which together has saved me a ton of cash on PC upgrades. If anything, the majority of my PC-related costs over those years have been for storage, which would have seemed an absurd idea in years past. Hell, I barely keep up with hardware-related news these days and just rely on you guys to give me the skinny on what the current mainstream picks are.

(For context, the 3070 RRP is apparently $809AUD; I'm looking more at the $600AUD range, which is where the 5700XT currently sits, but that'd be a silly thing to buy right now.)


^  Yeah, I'm much the same regarding the games I like to play or how frequently I care about playing some uber graphics game, which is why I expected the 2080ti would last me at least 5 years, maybe longer.

There's the added aspect that many of the graphics settings that eat FPS are ones I don't give a rip about, or even actively dislike, and hence never have on.  Like any kind of blur/depth of field/bloom, AF at 16x, shadows above medium, screen space reflections, Nvidia-special effects or what have you.  I've tried all of them and find they make little to zero difference to my enjoyment of playing a game, so who needs them (not me, at least).  All I care about is image clarity/sharpness/view distance and 60fps, not special effects.

Death Stranding and BL3 are about as graphics-intensive as I've gone recently.  Hubby had to run BL3 on DX12 because it runs a bit better on his 1660 GPU and an okay-but-older AMD CPU (on DX11 he's all Low at 1080p/75% scaling and it stutters; on DX12 it's low-medium with almost no stutter). He needs to give up and build another Borderlands-playing rig, especially when we get a new TV.   😛


If a 3080 winds up being roughly 25% faster than a 2080 Ti (bonus points if it's even faster than that) then that should get me to the 4K ~100 FPS range that I'm targeting. I don't foresee Big Navi having a 3090 competitor, but hopefully AMD has something to compete with the 3080. In an ideal world a combination of competition and cyber week sales allows me to buy a 3080/AMD equivalent for $600 in less than 3 months.

/crosses fingers & toes
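For a rough sense of the arithmetic behind that target, a back-of-envelope sketch, where the 2080 Ti baseline is an assumed figure for illustration, not a benchmark result:

```python
# Back-of-envelope check on "25% faster than a 2080 Ti ~= 4K 100 FPS".
baseline_fps = 80   # assumed 2080 Ti average at 4K in a demanding title
uplift = 1.25       # the rumoured 3080 advantage over the 2080 Ti
print(f"Projected 3080: {baseline_fps * uplift:.0f} FPS")  # -> 100 FPS
```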


2 minutes ago, Bokishi said:

Wondering if 8700K will bottleneck a 3090 card

Time for that unobtainium powered CPU yet?  :lol:

 


9 minutes ago, Bokishi said:

Wondering if 8700K will bottleneck a 3090 card

I'm curious whether PCIe 3.0 will bottleneck a 3090. If so, then the answer to your question is yes, since the 8700K's platform is limited to PCIe 3.0.


1 hour ago, Keyrock said:

I'm curious whether PCIe 3.0 will bottleneck a 3090. If so, then the answer to your question is yes.

Nvidia's rep said by less than a few percent. PCIe 2.0 only bottlenecks Turing cards by a few percent. The effect also shrinks at higher resolutions, because higher frame rates push more data over the bus than lower ones, so 1080p at 200 FPS bottlenecks more than 4K at 60 FPS. The more CPU work there is, the more bandwidth gets used as well. And a lot of the time you could prevent the bottleneck by capping your frame rate or using adaptive sync.

The engine seems to matter more than the graphical detail: bad-looking games can bottleneck more than good-looking ones. So it's not that the bus holds back good-looking games at high FPS, so much as it punishes games that spam PCIe for no reason.
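As a minimal back-of-envelope sketch of that reasoning (the 25 MB per frame figure is purely an assumption for illustration; real bus traffic depends entirely on the engine):

```python
# Rough PCIe utilisation estimate. Pixels are rendered out of VRAM and mostly
# don't cross the bus; per-frame traffic (draw calls, buffer/texture updates)
# does, so bandwidth use scales with frame rate rather than resolution.
PCIE3_X16_GBS = 15.75  # approx. usable PCIe 3.0 x16 bandwidth in GB/s

def bus_utilisation(fps, mb_per_frame=25.0):  # 25 MB/frame is an assumption
    return (fps * mb_per_frame / 1024) / PCIE3_X16_GBS

print(f"1080p @ 200 FPS: {bus_utilisation(200):.0%} of the bus")  # ~31%
print(f"4K    @  60 FPS: {bus_utilisation(60):.0%} of the bus")   # ~9%
```

Capping the frame rate directly caps the per-second traffic, which is why a frame limiter or adaptive sync makes the bottleneck go away.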


I'd be a bit skeptical of taking all nVidia claims at face value prior to independent benchmarks. Claims like 'up to 1.9x performance/watt' are clearly, at best, technically accurate for certain cherry-picked scenarios, with the general case being in the 10-20% range. And RTX I/O is... well, MS bringing fast I/O to Windows in general, with maybe some minor improvement from using RTX specifically instead of general compute. It's going to be on the nextbox, so it will work with AMD too.

1 hour ago, Keyrock said:

If a 3080 winds up being roughly 25% faster than a 2080 Ti (bonus points if it's even faster than that) then that should get me to the 4K ~100 FPS range that I'm targeting. I don't foresee Big Navi having a 3090 competitor, but hopefully AMD has something to compete with the 3080. In an ideal world a combination of competition and cyber week sales allows me to buy a 3080/AMD equivalent for $600 in less than 3 months.

/crosses fingers & toes

Well, theoretically, you could have a 16-core Zen 2, 32GB of RAM and a 104CU RDNA2 card for less power draw than a 3090, and (very likely) a decent bit less money, if you glued two nextboxes together. Theoretically. It's all about how well RDNA scales at this point; even an 80CU 5900XT card, with good scaling, would be around 3080 level already.
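For what it's worth, the "glue two nextboxes together" arithmetic lines up with the published Series X specs of an 8-core Zen 2 CPU, 16GB of GDDR6 and 52 active CUs; a trivial sanity check:

```python
# Doubling the published Xbox Series X specs, per the comparison above.
series_x = {"zen2_cores": 8, "gddr6_gb": 16, "rdna2_cus": 52}
two_glued = {spec: 2 * n for spec, n in series_x.items()}
print(two_glued)  # {'zen2_cores': 16, 'gddr6_gb': 32, 'rdna2_cus': 104}
```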

I'd say that nVidia expects Big Navi to land around the 3080, from their pricing and specs. Maybe a bit above it even, since there's a big intermediate price point there to fill with a 3080STi.

 


10 hours ago, Zoraptor said:

I'd be a bit skeptical of taking all nVidia claims at face value prior to independent benchmarks. Claims like 'up to 1.9x performance/watt' are clearly, at best, technically accurate for certain cherry-picked scenarios, with the general case being in the 10-20% range. And RTX I/O is... well, MS bringing fast I/O to Windows in general, with maybe some minor improvement from using RTX specifically instead of general compute. It's going to be on the nextbox, so it will work with AMD too.

Well, theoretically, you could have a 16-core Zen 2, 32GB of RAM and a 104CU RDNA2 card for less power draw than a 3090, and (very likely) a decent bit less money, if you glued two nextboxes together. Theoretically. It's all about how well RDNA scales at this point; even an 80CU 5900XT card, with good scaling, would be around 3080 level already.

I'd say that nVidia expects Big Navi to land around the 3080, from their pricing and specs. Maybe a bit above it even, since there's a big intermediate price point there to fill with a 3080STi.

 

Well, Digital Foundry corroborated their numbers, even though it was a controlled situation and Digital Foundry has done sponsored videos before this. I still have enough faith that those numbers are real enough.

The biggest question is whether AMD is going to care enough to actually put anything out there. Historically people always went with nVidia's products even when AMD smashed them in performance, and I don't doubt the same thing will happen if they release something slightly better than a 3080, even if the price is lower.

I'll admit, if they were to produce something comparable today, I'd have to see how their new offerings actually stack up feature-wise. If AMD manages to get close to the RT performance of RTX, I might go with their offering.


I've owned more ATI/AMD GPUs than Nvidia ones, and I've never seen them smash Nvidia in performance. The only time I thought ATI was ahead was when I bought a 9000 series card. In reality, on price/performance it's been very competitive, until recently where AMD has struggled. Nvidia has so often been ahead in bringing features to market. Getting a 970 and a 1080 were easy choices for me, and it looks like I'm getting a 3080.


I'm thinking of going from my 970 to a 3080. I'm definitely upgrading, but am unsure whether I should go for the 3070 or the 3080. I dislike that the 3070 is using the older-gen memory. Guess I'll wait for reviews like everyone else.

Wonder how the reference coolers will perform compared to the non-reference ones, because the design looks really great.


51 minutes ago, AwesomeOcelot said:

I've owned more ATI/AMD GPUs than Nvidia ones, and I've never seen them smash Nvidia in performance. The only time I thought ATI was ahead was when I bought a 9000 series card. In reality, on price/performance it's been very competitive, until recently where AMD has struggled. Nvidia has so often been ahead in bringing features to market. Getting a 970 and a 1080 were easy choices for me, and it looks like I'm getting a 3080.

Off the top of my head, the HD5850 was equal to the 470 for $100 less, and it came out almost a year before; the 460 that was its price competitor was without a chance. I don't remember most of the others, but that one stood out quite a lot. If I remember correctly my HD7970 was faster than the GeForce equivalent as well.

The last nVidia card I owned was a GeForce 2 MX, for a few days; I went back to my TNT2 for a long while, until I got a Radeon X800.


1 hour ago, Sarex said:

I'm thinking of going from my 970 to a 3080. I'm definitely upgrading, but am unsure whether I should go for the 3070 or the 3080. I dislike that the 3070 is using the older-gen memory. Guess I'll wait for reviews like everyone else.

Wonder how the reference coolers will perform compared to the non-reference ones, because the design looks really great.

I may go for the cheapest; I run 1920x1200, so I'm pretty sure any of them are overkill for the games I play.


1 hour ago, Azdeus said:

Off the top of my head, the HD5850 was equal to the 470 for $100 less, and it came out almost a year before; the 460 that was its price competitor was without a chance. I don't remember most of the others, but that one stood out quite a lot. If I remember correctly my HD7970 was faster than the GeForce equivalent as well.

The last nVidia card I owned was a GeForce 2 MX, for a few days; I went back to my TNT2 for a long while, until I got a Radeon X800.

I bought the HD5870, which came out the same month as the HD5850. The GTX 470 came out 6 months later, and I wouldn't say the HD5850 was equal to it. Price/performance was really good for the HD5850 and bad for the 470, so on that measure the difference was large. I seem to remember availability being a problem with the HD5850. On paper, the MSRP and performance of the Vega 56 were better than Nvidia's, but I couldn't get one, and by the time I could, it was more expensive than the GTX 1080 I bought. Getting that 1080 was the best deal I ever got: it was around $400, and I mined ETH on it for a month and sold that for around $100.


4 hours ago, Malcador said:

I may go for the cheapest; I run 1920x1200, so I'm pretty sure any of them are overkill for the games I play.

I'm running a 2K monitor with a 165 Hz refresh rate, so I don't know; I think the 3070 would do the trick, and I don't play games that much anymore, but on the other hand the card will probably have to last me about 4 years, so it may not be a bad idea to go for the 3080.

"because they filled mommy with enough mythic power to become a demi-god" - KP

Link to comment
Share on other sites

I have vastly inferior hardware to everyone here, I see. 😛


When the pandemic ends and people can gather in crowds again, if you're Jensen Huang why would you go back to announcing things at trade shows when you can just do it from your MAGNIFICENT kitchen and flex on the haterz?


49 minutes ago, Malcador said:

I have vastly inferior hardware to everyone here, I see. 😛

The people with over-the-top hardware are the ones who usually advertise it the most, and... water is wet. :p I'm still running an older i7-4770K and am perfectly happy with a mid-range 1060... although I have 32GB of RAM because I'm a memory hog, so there is that at least.


In case anyone here missed it, Lenovo made a booboo and listed a 16GB 3070 variant:

https://imgur.com/8nTInW0


Apropos the purported higher memory 3080 and 3070 models, it all comes down to pricing for me. If the 20 GB model is $10 higher than the 10 GB, sure I'll pay the extra $10. If it's $100 more, I'll stick with 10 GB.

All this is assuming that I go Team Green. Supposedly Big Navi is going to release at around the same time as Zen 3 Vermeer, and I'm waiting on the latter anyway, so I might as well check out what Lisa Su (I wonder what her kitchen looks like) has to offer as far as GPUs go before making a decision.


2 hours ago, Keyrock said:

Apropos the purported higher memory 3080 and 3070 models, it all comes down to pricing for me. If the 20 GB model is $10 higher than the 10 GB, sure I'll pay the extra $10. If it's $100 more, I'll stick with 10 GB.

All this is assuming that I go Team Green. Supposedly Big Navi is going to release at around the same time as Zen 3 Vermeer, and I'm waiting on the latter anyway, so I might as well check out what Lisa Su (I wonder what her kitchen looks like) has to offer as far as GPUs go before making a decision.

I'd pay like $30 extra for 20GB, especially for the 3070 model that doesn't have GDDR6X. Any more than that and I'd go for the 3080, which I suppose is Nvidia's plan really.


Nvidia's development team were asked about the memory size, and they said AAA games at 4K only use 4-6GB. Considering the 16GB shared memory of the consoles, I can't see people needing more than 10GB for AAA gaming at 4K. They said they chose higher-speed memory over a larger amount because speed is more important for running games. If they turn around and release higher-memory cards, they're going to have some explaining to do.
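A rough sketch of where the 10GB intuition comes from: of the console's 16GB shared pool, the OS reserves a chunk and the game's CPU-side data takes more. The OS figure below is the published Series X reservation; the CPU-side share is an assumption.

```python
# Rough split of a console's 16GB shared pool into "VRAM-like" capacity.
total_gb = 16.0
os_reserved_gb = 2.5   # published Series X OS reservation
cpu_side_gb = 3.5      # game code/logic/audio etc. (assumed)
gpu_side_gb = total_gb - os_reserved_gb - cpu_side_gb
print(f"~{gpu_side_gb:.0f} GB effectively available for GPU data")  # ~10 GB
```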


They wouldn't really have to explain beyond there being demand for cards with more memory, which there clearly is, whether they need it or not.

6 hours ago, Azdeus said:

I'd pay like $30 extra for 20GB, especially for the 3070 model that doesn't have GDDR6X. Any more than that and I'd go for the 3080, which I suppose is Nvidia's plan really.

I'd presume a putative 3070 Ti would come more or less halfway in price between the 3070 and 3080. The minimum price difference would surely have to be $50; there wouldn't be much point in the segmentation otherwise.


46 minutes ago, Zoraptor said:

They wouldn't really have to explain beyond there being demand for cards with more memory, which there clearly is, whether they need it or not.

I'd presume a putative 3070 Ti would come more or less halfway in price between the 3070 and 3080. The minimum price difference would surely have to be $50; there wouldn't be much point in the segmentation otherwise.

As AwesomeOcelot said above, though, there'd presumably be a point to a 3070 with 16GB, since they only use GDDR6 on those SKUs. If you get that close to the 3080's price, you might as well get the 3080, which is not only faster itself but also carries faster memory.

 

Also, I didn't pay a whole lot of attention to the storage buffer technology they're going to use; I was really tired at that point. I wonder if they could use the VRAM as a huge storage buffer for textures and such, instead of streaming directly from an SSD.
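Presumably the idea would be something like a least-recently-used residency cache in VRAM sitting in front of the SSD. A minimal sketch, with hypothetical names throughout (this is not the actual RTX IO or DirectStorage API):

```python
from collections import OrderedDict

class VramTextureCache:
    """Illustrative LRU cache: keep recently used textures resident within
    a fixed VRAM budget, and hit the SSD only on a miss."""

    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.resident = OrderedDict()  # texture_id -> size in bytes

    def fetch(self, texture_id, load_from_ssd):
        if texture_id in self.resident:       # hit: no SSD or bus traffic
            self.resident.move_to_end(texture_id)
            return texture_id
        size = load_from_ssd(texture_id)      # miss: stream from the SSD
        while self.resident and self.used + size > self.budget:
            _, freed = self.resident.popitem(last=False)  # evict LRU entry
            self.used -= freed
        self.resident[texture_id] = size
        self.used += size
        return texture_id

# Hypothetical usage: an 8GB spare-VRAM budget, a texture reported as 64MB.
cache = VramTextureCache(budget_bytes=8 * 2**30)
cache.fetch("rock_albedo", load_from_ssd=lambda tid: 64 * 2**20)
```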

