
Nvidia RTX Series


Bokishi


I think I acquired that confusion/assumption from the reaction of the average netizen to benchmarks (usually disappointment in the numbers...). A lot of laypeople seem to mistake them for actual gameplay FPS barometers, and I think I've seen some reviewers act like they were, too, when they talk about whether something is "worth it".   >.>


I was always under the impression that it was pretty easy to figure out how "worth it" the card is by measuring the length of your e-peen and comparing it to the previous gen. Seems like a pretty straightforward and quantifiable method.

 

Joking aside, depending on the availability of the cards in my country, I think this will be the first component I upgrade for my new setup. Leaning towards the 3080.


"because they filled mommy with enough mythic power to become a demi-god" - KP

Link to comment
Share on other sites

I don't know what people were expecting if the benchmarks are a "disappointment"? This is a pretty substantial improvement over Turing.

Anyway, I'm waiting until AMD shows their hand before making a decision. 


3 hours ago, Keyrock said:

I don't know what people were expecting if the benchmarks are a "disappointment"? This is a pretty substantial improvement over Turing.

It's more to do with them being disappointed about specific game performance than with the card overall, if that makes sense.


1 hour ago, AwesomeOcelot said:

I was not able to get an FE card. As far as I can tell they were never on sale.

 

They are saying that the servers were set up to go instantly from "for sale" to "out of stock", at least for the FE cards.


5 hours ago, Keyrock said:

I don't know what people were expecting if the benchmarks are a "disappointment"? This is a pretty substantial improvement over Turing.

Think there are multiple issues at play there:

1) nVidia overegged the pudding. The good thing about quoting "up to [performance metric]" is that people don't see the 'up to'; the bad thing is also that they don't see the 'up to'. 1.9x performance per watt? 2x the performance of the 2080? If you believe that sort of claim without the 'up to', and without the context that those are very different use cases, then the actual performance is a relative disappointment even if it's objectively good.

2) While Ampere is a big improvement over Turing in value, Turing was a historically poor improvement over Pascal. Ampere's performance/value gain is a correction from the previous 'bad' generation and is nowhere near as much of an improvement when compared to Pascal. That was largely acknowledged by Jensen at the presentation when he appealed to Pascal owners to upgrade because now there was a point to it.

3) The cards are very hot and hungry. At least some of the people who loved to make jokes about how hot Vega was might be seeing the irony.

4) Not so much for the general user but a lot of people really thought they would be on TSMC 7nm and are disappointed at the tradeoffs Samsung 8nm represents.


5 hours ago, AwesomeOcelot said:

I was not able to get an FE card. As far as I can tell they were never on sale.

I was at work when the release went live, due to an emergency call-in because a colleague was sick. So I tried using the work PC to order; I wasn't trying to get an FE card though, just an AIB board.

...

The potato PC couldn't handle the shopping page, and it logged me out a few times for good measure as well. -.-

Kek.

  

1 hour ago, Zoraptor said:

Think there are multiple issues at play there:

1) nVidia overegged the pudding. The good thing about quoting "up to [performance metric]" is that people don't see the 'up to'; the bad thing is also that they don't see the 'up to'. 1.9x performance per watt? 2x the performance of the 2080? If you believe that sort of claim without the 'up to', and without the context that those are very different use cases, then the actual performance is a relative disappointment even if it's objectively good.

2) While Ampere is a big improvement over Turing in value, Turing was a historically poor improvement over Pascal. Ampere's performance/value gain is a correction from the previous 'bad' generation and is nowhere near as much of an improvement when compared to Pascal. That was largely acknowledged by Jensen at the presentation when he appealed to Pascal owners to upgrade because now there was a point to it.

3) The cards are very hot and hungry. At least some of the people who loved to make jokes about how hot Vega was might be seeing the irony.

4) Not so much for the general user but a lot of people really thought they would be on TSMC 7nm and are disappointed at the tradeoffs Samsung 8nm represents.


The 1.9x performance-per-watt thing they did was a really bad comparison as well... They did write pretty clearly "up to 2X performance", and that was at 4K from what I remember; 1080p was basically never mentioned. We're already mostly CPU limited at 1080p anyway, so it's almost irrelevant to compare performance there.
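
To put rough numbers on it, here's a toy Python sketch (all figures are made up for illustration, not Nvidia's or any reviewer's data) of how an "up to 1.9x perf/watt" headline and a much smaller uncapped uplift can both be true, depending on where on the power curve each card is measured:

    # Toy numbers only - NOT Nvidia's figures - showing how "up to 1.9x perf/watt"
    # and a much smaller real-world uplift can both be true at the same time.

    def perf_per_watt(fps, watts):
        return fps / watts

    # Hypothetical uncapped 4K results for one game:
    old_fps, old_watts = 60, 250   # assumed 2080-class card
    new_fps, new_watts = 90, 320   # assumed 3080-class card

    uplift = new_fps / old_fps                                                            # ~1.5x raw fps
    ppw_uncapped = perf_per_watt(new_fps, new_watts) / perf_per_watt(old_fps, old_watts)  # ~1.17x

    # Iso-performance comparison: cap the new card at the old card's fps so it can
    # sit far down its power curve (the 130W draw at that cap is an assumption).
    new_capped_watts = 130
    ppw_capped = perf_per_watt(old_fps, new_capped_watts) / perf_per_watt(old_fps, old_watts)  # ~1.92x

    print(f"fps uplift {uplift:.2f}x | perf/W uncapped {ppw_uncapped:.2f}x | perf/W at iso-fps {ppw_capped:.2f}x")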

I'm actually thankful for my Vega 64; it means the 3080's power consumption doesn't really give me pause. It's in the same ballpark, really.

 


1 hour ago, Zoraptor said:

Think there are multiple issues at play there:

1) nVidia overegged the pudding. The good thing about quoting "up to [performance metric]" is that people don't see the 'up to'; the bad thing is also that they don't see the 'up to'. 1.9x performance per watt? 2x the performance of the 2080? If you believe that sort of claim without the 'up to', and without the context that those are very different use cases, then the actual performance is a relative disappointment even if it's objectively good.

My general rule of thumb is to take every performance figure given by the manufacturers themselves with a 55-gallon drum of salt. That goes for Nvidia, AMD, Intel, et al.

23 minutes ago, Azdeus said:

The 1.9x performance-per-watt thing they did was a really bad comparison as well... They did write pretty clearly "up to 2X performance", and that was at 4K from what I remember; 1080p was basically never mentioned. We're already mostly CPU limited at 1080p anyway, so it's almost irrelevant to compare performance there.

 

There's no point in getting a 3080 for 1080p gaming at all, unless you are an MLG 420 BLAZE IT PRO GAMER and the difference between 200 FPS and 230 FPS will mean the difference between hitting and missing that 360 no-scope sniper shot in CS:GO for you.


18 minutes ago, Keyrock said:

My general rule of thumb is to take every performance figure given by the manufacturers themselves with a 55-gallon drum of salt. That goes for Nvidia, AMD, Intel, et al.

There's no point in getting a 3080 for 1080p gaming at all, unless you are an MLG 420 BLAZE IT PRO GAMER and the difference between 200 FPS and 230 FPS will mean the difference between hitting and missing that 360 no-scope sniper shot in CS:GO for you.

Yeah, I think it's somewhere around 15% faster at 1080p. You just hit a CPU bottleneck.
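
Roughly why, as a toy model (the frame times below are made up, not measurements): a frame can't finish until both the CPU and the GPU are done, so once the CPU is the long pole, extra GPU speed barely moves the fps.

    # Toy frame-time model of a CPU bottleneck - illustrative numbers only.
    def fps(cpu_ms, gpu_ms):
        # A frame can't complete faster than the slower of the two.
        return 1000.0 / max(cpu_ms, gpu_ms)

    cpu_ms = 5.0  # assumed CPU cost per frame at 1080p -> 200 fps ceiling

    for name, gpu_ms in [("old GPU", 6.0), ("twice-as-fast GPU", 3.0)]:
        print(f"{name}: {gpu_ms} ms on the GPU -> {fps(cpu_ms, gpu_ms):.0f} fps")
    # old GPU:           ~167 fps (GPU-bound)
    # twice-as-fast GPU: ~200 fps (hits the CPU ceiling, not 333 fps)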

Aaanywho, put in an order for an ASUS one to arrive in early November.


29 minutes ago, Bokishi said:

Haha, it wasn't me, it was these guys using bots:

https://twitter.com/BounceAlerts

I'm not one for advocating murder. But someone should shoot these people.

Jesting aside, I wish they'd take some kind of action against these things.


Reddit:


The 3080 Launch Day Experience (TM)

Get up.
Rush to PC.
You've got your 8+ windows up.
Your timezone's launch hits.
Refresh like a crack addict.
Websites immediately crash in unison:
- EVGA ceases to be, remanifests empty, barren
- Nvidia doesn't crash, but apparently never had any to begin with
- Newegg crashes, or when it works, shows you an 'Add to Cart' button just to tease you and lets the dots circle meaninglessly
- Best Buy doesn't crash for long, but existing 3080s INSTANT TRANSMISSION out of the real into the noumenal
- B&H no longer exists. It never existed.
- Amazon trolls everyone, either never had any stock or Bezos personally owns all of them now.

And I quote myself from the beginning of this month:

On 9/1/2020 at 1:20 PM, Bartimaeus said:

If past Nvidia releases are anything to go by, the 3090s/3080s will probably be out of stock long enough that you'll get news on the AMD series in a couple of months before you have a realistic shot at buying them.


4 hours ago, Azdeus said:

I'm actually thankful for my Vega 64; it means the 3080's power consumption doesn't really give me pause. It's in the same ballpark, really.

The Vega 64 is quite an interesting power-usage comparison to the 3080. I've had my Vega 64 overclocked, and it has almost exactly the same performance/power balance for its last 100W as the 3080 has; i.e. going from 210W stock to 320W gives ~10% extra performance (plus a decent bit more from memory overclocking), while the 3080 going from 220W to 320W (stock) also gives ~10%. If Vega was notoriously bad at power scaling at the upper end, Ampere pretty much has to be labeled the same; though that may well mean that Ampere is very efficient below the top end, as Vega was.
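
Running those ballpark figures through quickly (my estimates from above, not lab measurements; performance normalised so the lower-power point is 1.00) shows how steep the efficiency hit is at the top of the curve:

    # Quick perf-per-watt arithmetic on the last ~100W, using the rough figures
    # above (forum-post estimates, not measured data). Performance is normalised
    # so the lower-power operating point = 1.00.
    cards = {
        "Vega 64 (210W stock -> 320W OC)":     ((1.00, 210), (1.10, 320)),
        "RTX 3080 (~220W est. -> 320W stock)": ((1.00, 220), (1.10, 320)),
    }

    for name, ((p_lo, w_lo), (p_hi, w_hi)) in cards.items():
        perf_gain = (p_hi / p_lo - 1) * 100
        power_gain = (w_hi / w_lo - 1) * 100
        ppw_ratio = (p_hi / w_hi) / (p_lo / w_lo)
        print(f"{name}: +{perf_gain:.0f}% perf for +{power_gain:.0f}% power, "
              f"perf/W drops to {ppw_ratio:.0%} of the lower-power point")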

(I'd pretty strongly suspect that the extra ~10% is where nVidia expects Big Navi to land. The coolers on the FE are not cheap and are only really necessary because of that extra 100W; they could instead have balanced lower power draw and performance and sold it as a 'great overclocker' for those who don't care about heat. The only reason I can see for doing it this way is that they expect competition at that level.)


32 minutes ago, Zoraptor said:

The Vega 64 is quite an interesting power-usage comparison to the 3080. I've had my Vega 64 overclocked, and it has almost exactly the same performance/power balance for its last 100W as the 3080 has; i.e. going from 210W stock to 320W gives ~10% extra performance (plus a decent bit more from memory overclocking), while the 3080 going from 220W to 320W (stock) also gives ~10%. If Vega was notoriously bad at power scaling at the upper end, Ampere pretty much has to be labeled the same; though that may well mean that Ampere is very efficient below the top end, as Vega was.

(I'd pretty strongly suspect that the extra ~10% is where nVidia expects Big Navi to land. The coolers on the FE are not cheap and are only really necessary because of that extra 100W; they could instead have balanced lower power draw and performance and sold it as a 'great overclocker' for those who don't care about heat. The only reason I can see for doing it this way is that they expect competition at that level.)

Your Vega draws 210W stock? Mine's a bit over 300W :o

Yeah, Nvidia does have a crown to defend, so I'm not surprised they pushed it so hard. What with being stuck on Samsung's 8nm (really a 10nm-class node), they'd have to do something, hence the massive dies and massive wattage.


The Strix Vega 64 has 200/220W stock BIOSes*, so the usual simple overclock is +50% power limit -> 300/330W. I'd presume the BIOS of whoever made yours has the power limit upped by default.

(I've been at stock power for the last year; undervolting and a memory overclock get most of the performance increase of removing the power limit without the card sounding like it's trying to achieve orbit, and a stable memory overclock is harder to hit with the extra heat.)

*Practical draw averages ~210W on my card


Yeah, it's not that bad tho', playing with a Vega 64, the 3080 ain't gon' be no problem


Fun fact: My first video card was a Voodoo3 3000. I remember playing UT99 in STUNNING 800x600. Back then I also didn't suck at FPS... I wasn't great, but I didn't suck, like I do today.

Good times.

P.S. UT99 is still the best competitive FPS ever. 


My first that I remember was an S3 Trio 64V+ 4MB, which I had for years; it was later paired with a Voodoo 2 before I upgraded to a TNT2 Ultra. Still kept the Voodoo card in, though. Tried a GeForce 2, but it didn't work, and I didn't upgrade again until I got an X800, then an HD 5850, followed by a 7970, followed by a Vega 64. Now, for the first time in decades, I'm going with Nvidia. :(


Every gen was a game changer back then. I went from an ATI Rage Fury ('98) to a Voodoo 3 ('99) to a GeForce 2 Pro ('01) to a Radeon 9600 ('04), then an AMD HD4650 and HD5870, then a GTX 970, a 1080, and now an RTX 3080. I tracked the launch of the Vega 56, but it was more expensive than the 64 by the time I wanted an upgrade, and they were both very uncompetitive; the 1080 was way cheaper with better performance, lower power and lower noise. I can't wait to try a ray-traced game at 4K with DLSS 2.0 at 120 FPS. Right now my 1080 can't output 4K 120Hz or G-SYNC to my OLED.


All of my discrete video cards since the early 2000s have been Nvidia, simply because that's when I switched to Linux. For the longest time the ATI, later AMD, Linux drivers were laughably bad, while Nvidia's Linux drivers were more or less on par with their Windows counterparts. So it really didn't matter if ATI/AMD had superior hardware, because it would still perform like ass on Linux. That has changed over the last few years, namely because AMD opened up their driver tech specs; they did that many years back, but it took some years to pay off (Valve also had a helping hand in getting AMD drivers up to snuff). These days I'm not stuck with Team Green; there are good Linux drivers for Team Red, and for Team Blue, for that matter.


37 minutes ago, AwesomeOcelot said:

Every gen was a game changer back then. I went from an ATI Rage Fury ('98) to a Voodoo 3 ('99) to a GeForce 2 Pro ('01) to a Radeon 9600 ('04), then an AMD HD4650 and HD5870, then a GTX 970, a 1080, and now an RTX 3080. I tracked the launch of the Vega 56, but it was more expensive than the 64 by the time I wanted an upgrade, and they were both very uncompetitive; the 1080 was way cheaper with better performance, lower power and lower noise. I can't wait to try a ray-traced game at 4K with DLSS 2.0 at 120 FPS. Right now my 1080 can't output 4K 120Hz or G-SYNC to my OLED.

Those were amazing times. To be honest, the only reason I bought my Vega 64 was that a 2080 was nowhere to be found and I didn't want to be without a computer to game on.


Dunno what was in my 486, probably something by Cirrus Logic. After that was a Diamond Stealth 3D 2000 (S3 ViRGE DX), which had a few 3D features but wasn't truly a 3D accelerator. This was one of the more common 2D cards to pair with a Voodoo, but I never had a 3dfx card of any kind. Richer people would pair one with something from Matrox or Tseng Labs instead, probably.

The next card I had was a Diamond Viper V330 (Nvidia Riva 128). This was a bad card, making all sorts of sacrifices in image quality in order to claim the speed throne from the original Voodoo. I remember the wonder videos from Civ2 looking particularly awful with the dithering this card employed, when even whatever the hell was on my 486 had no problem with it. Shamefully, I fell for Jensen's marketing. As many of us know, similar 'cheats' would be employed by nVidia throughout the following years in order to win benchmarks, so there was a bit of a pattern established.

Undeterred though, my next card would be a Diamond Viper V770 (yes, a pattern is emerging) with the nVidia TNT2 chip. No complaints about this one; it ushered in the age of 32-bit colour while 3dfx stubbornly stuck to 16-bit, which would be the undoing of the company.

After that it's a bit of a blur, as the 00s generally were, with the rapid pace of development meaning very frequent changeovers. I don't know what the timeline is, but I had a GeForce 3 Ti500, GeForce 4 Ti4200, Radeon 9800 Pro (it died), GeForce 7900GT (it died), Radeon X1950XT and GeForce 8800GT. I might even be forgetting a card or two; if I had a GeForce 6 series it's completely slipped my mind, and I think I had a 5850? But where is it now? Not on my shelf like a couple of its predecessors. I also had a Radeon 8500 "Low Profile" for a couple of days before I realised the shop had pulled a swifty on me by selling it as an 8500LE (which was better than the Low Profile but slower than the full 8500), and I took it back and got the Ti4200 instead.

The following decade I had only two: a Radeon HD7950 and an R9 290X. Far from bemoaning the lack of progress, I'm just happy to have saved the money by upgrading every 5 years instead of every, what, 1.5 years?

 

EDIT: Oh, that's right, I stuck the 5850 with my old HTPC parts and then gave the whole thing away to my sister. And my current HTPC has a 1650 Super in it, but that doesn't really count.

