
Nvidia RTX Series


Bokishi


AMD has publicly said that there won't be a 7nm Vega and that they won't be trying to compete with nVidia for the near future. Barring them releasing an actual ray-tracing card, or something that so completely smashes nVidia that even the generally clueless people pick up on AMD's release, and we are talking something like 5-to-1 or 10-to-1 performance here, they will not be able to sell anything.

 

Also, why would a discerning consumer buy a new 580 when there will soon be used 1070s/1080s on the market with better performance for less money?

 

PC gamers had their chance to keep the market healthy back when, but people still bought nVidia cards even though AMD had better performance on offer at a lower price.

 

Edit: Mixed up the 7nm Vega with something else; they are releasing that.

Edited by Azdeus

Civilization, in fact, grows more and more maudlin and hysterical; especially under democracy it tends to degenerate into a mere combat of crazes; the whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, most of them imaginary. - H.L. Mencken


Yeah, they're definitely doing 7nm Vega. Consumer cards were semi-announced via roadmap last year, but they've studiously avoided saying anything official about them since; pro-level cards are definite and at the engineering-sample stage. Now that the RTX cards are announced we may hear something, not that it will help much. nVidia clearly had the 1080 Ti ready for Vega last time, and they have almost limitless scope for dropping prices and tuning their offerings if they need to. There will probably be Polaris 30 as well, so 680s etc.

 

I really can't see ray tracing mattering much, though, if the performance is bad. It's a hard sell spending $1,000 on a card to get 60 fps at 1080p when a $200 card will do that with no ray tracing; worse still spending $1,000 extra on a G-Sync 4K 144Hz monitor to play at... 15 fps (mathematically implied) with ray tracing on.
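That "mathematically implied" figure is just pixel-count scaling: 4K pushes four times the pixels of 1080p, so a fill-rate-bound 60 fps becomes roughly 15 fps. A rough sketch (real scaling is rarely perfectly linear, so treat this as a back-of-the-envelope estimate only):

```python
# Back-of-the-envelope fps estimate when changing resolution, assuming
# frame time scales linearly with pixel count (an approximation).
def implied_fps(base_fps, base_res=(1920, 1080), target_res=(3840, 2160)):
    base_pixels = base_res[0] * base_res[1]
    target_pixels = target_res[0] * target_res[1]
    return base_fps * base_pixels / target_pixels

print(implied_fps(60))  # 4K has 4x the pixels of 1080p -> 15.0
```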

 

AMD's big advantage will be supplying to consoles, and making CPUs as well. A Zen 2 6/8-core 'proper' APU with a bunch of Navi/Vega cores should be pretty compelling. They've also managed a paradigm shift in CPUs (one of the few times that phrase is actually appropriate) that nobody expected, and went from being zero competition to very serious competition, albeit with some own goals from Intel.


I don't understand the response to Real-Time Ray Tracing. The demos look insane; it's a genuine leap in lighting technology of a kind we haven't seen for years. There are questions about how widely it will be adopted and about the performance hit, but I want to wait for independent reviews. DLSS seems to be super-sampling for "free", meaning almost double the performance of Pascal at similar image quality. This obviously comes at the cost of bigger, more complicated chips with tensor cores, and prices have increased, although the price is also a function of not having any competition in the high-end GPU space.
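One way to see where a near-2x gain could plausibly come from: if DLSS shades frames at a lower internal resolution and reconstructs the full-resolution output, the shading-work saving is roughly the pixel-count ratio. The 1440p internal resolution below is an illustrative assumption, not a figure from Nvidia:

```python
# If the GPU shades at a lower internal resolution and upscales the result,
# shading work per frame drops by roughly the pixel-count ratio.
# (1440p internal -> 4K output is an illustrative assumption.)
def pixel_ratio(internal=(2560, 1440), output=(3840, 2160)):
    return (output[0] * output[1]) / (internal[0] * internal[1])

print(pixel_ratio())  # -> 2.25, i.e. ~2x fewer pixels shaded per frame
```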


This is not the first time that Nvidia has hyped some new tech to sell new cards, tech that ended up not being all that noticeable or even being something you would want to turn off.


"because they filled mommy with enough mythic power to become a demi-god" - KP


Which one actually looks more realistic, rtx on or rtx off?

 

That meme specifically refers, I presume, to the old bloom memes and applications from games like Morrowind, where anything vaguely pale or reflective glowed fluorescently and looked comical rather than improving the game's graphics. Like most new graphics tech, and probably ray tracing, bloom was hugely overused and looked extremely 'fake' when first introduced, because it was being used for marketing and as a buzz phrase rather than because it was actually useful. Happens all the time, too; sometimes the tech ends up being useful in the end (after making every game character tenish years ago look like they used vaseline for skincare, per-pixel shading* is a lot better), and sometimes it kind of fades away. There is a metric asteriskton of stuff that either is or was primarily annoying/unrealistic though: bloom, HDR, motion blur, SSAO, god rays, lens flare, tessellation (which also did a lot to tank AMD's performance...), per-pixel shading etc.

 

*Might have also been normal mapping there? My memory is a bit sketchy, but I remember every character in Bioshock-era games looking like they were sculpted from plasticine then painted with wet transparent lacquer.

Edited by Zoraptor

I don't understand the response to Real-Time Ray Tracing.

Probably because more people care about fps and gameplay than they do about "ultra realistic" lighting/shadow effects or whatever.

 

There's also a certain point where many consider game-graphics capability "good enough" - for looks, for immersion, for function. It's not that they can't admire further graphical advances, but that they don't care about paying for them, or about so much industry/financial focus being put on them vs. actual gameplay/stories and so on.

 

This is not the first time that Nvidia has hyped some new tech to sell new cards that ended up not being all that noticeable or even being something you would want to turn off.

Also, this.

“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts

My outrage... outrage is probably too strong a word; distaste is a better word... My distaste for Nvidia RTX has little to do with RTX itself. I do a bit of an eyeroll when people, Nvidia and others alike, try to pass RTX off as actual ray tracing. Just to be clear, RTX is not ray tracing in the purest sense; it's a hybrid of ray tracing and rasterization. So there is that bit of stretching the truth, but that doesn't put me off much, because the technology, hybrid as it may be, is still quite impressive in that it produces results very similar to pure ray tracing at a fraction of the computing power.
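To illustrate the hybrid idea in the abstract: rasterization resolves visibility cheaply, and rays are traced only for the effects that need them (shadows, reflections), instead of path-tracing every pixel. A toy sketch under those assumptions, not how any real engine or driver structures it - a 1D "floor" is trivially rasterized, then one shadow ray per pixel is marched toward a point light past a box occluder:

```python
# Toy hybrid renderer: visibility comes from "rasterization" (trivial here),
# and only shadows are ray-traced, one shadow ray per floor pixel.

LIGHT = (5.0, 10.0)            # point light position (x, y)
BOX = (3.0, 5.0, 0.0, 2.0)     # occluder: x_min, x_max, y_min, y_max

def shadow_ray_blocked(px, light=LIGHT, box=BOX):
    """March the segment from floor point (px, 0) toward the light and
    report whether it passes through the box (i.e. the point is shadowed)."""
    lx, ly = light
    x_min, x_max, y_min, y_max = box
    steps = 1000
    for i in range(1, steps):
        t = i / steps
        x = px + (lx - px) * t
        y = ly * t
        if x_min <= x <= x_max and y_min <= y <= y_max:
            return True
    return False

def render(width=10):
    # Rasterization step: floor pixel i covers x = i (visibility is trivial).
    # Ray-traced step: one shadow ray per pixel decides lit vs shadowed.
    return "".join("#" if shadow_ray_blocked(float(x)) else "."
                   for x in range(width))

print(render())  # -> ...###....  (pixels under the box are shadowed)
```

The expensive per-pixel work here is only the shadow test; everything else stays rasterized, which is the gist of why the hybrid approach costs a fraction of pure ray tracing.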

 

What puts me off are the prices; they are just ridiculous, new technology or not. Realistically, you can expect very few games in the next couple of years to take advantage of RTX, and fewer still that are playable even at just 1080p at max settings with RTX on with any single card, so for most people the whole RTX thing is kind of a non-issue. What matters to most people are performance gains over Pascal, and we just don't know those yet. Nvidia themselves will tell you things like 6x the performance and 35% to 125% gains over Pascal, but I never trust the manufacturer (whether it's Nvidia, AMD, or Intel) to give me realistic numbers, because they usually cite fringe results and benchmark things in extremely specific environments that will yield the most favorable results possible, and these often do not mirror typical real-world situations.

 

The bottom line is that we just don't know yet how these cards perform. We can speculate from core counts, clock speeds, and memory bandwidth, but until we get reliable independent benchmarks, we just don't know. It's going to take a pretty huge leap in performance over Pascal to justify the price tags in my eyes.


🇺🇸RFK Jr 2024🇺🇸

"Any organization created out of fear must create fear to survive." - Bill Hicks



I'm pretty turned off by this as well, and will be passing on this generation of cards: 1. because I just recently purchased a 1080 Ti, and 2. because I don't care about the ray-tracing hype. I want my games to be stable, not dropping erratically to 10-20 fps because a developer couldn't account for what effects would bottleneck a piece of hardware and when. I have issues with enough games today using experimental tech that ends up performing terribly. *Cough*Cough Pillars of Eternity II *Cough.

 

I've spent probably well over $4,000 on my system (closer to $6k including monitors), so money isn't an issue for me, but there has to be a point where you call BS, and $1,200 for that video card is "my" point.

 

Nvidia holds a conference/release party and Jensen spends an hour to an hour and a half giving people/press a borderline college course on ray tracing and the history of computer graphics. He has to paint that picture with all that information, because without it, when he showed us the new overpriced cards, no one would give a flying bird butt and would just move on with their day.

 

So, new over-hyped, overpriced cards from NVIDIA are being released. They have this new lighting tech (which requires developers to actually implement it in their games to be useful), meaning only the games they showed at the end of that "lecture" will have this "feature". How many of those games were you actually interested in? For me it was only Metro. And who's to say that the tech is actually implemented properly in that game? They demoed like one scene from each of the three games, and to be honest, it did not blow my mind. That being said, it will take years for developers to make new games showcasing this tech, and probably longer still for them to become proficient enough to implement it "properly".

 

And these cards also have this deep-learning core tech. He didn't really explain how this is implemented on the card. Sure, he "lectured" on what deep learning means for PC graphics, but how does a deep-learning process cultivated by some supercomputer somewhere benefit the card installed in my PC? Is there a separate set of drivers just for that chip on the card? Will an internet connection be required for this chip to get deep-learning AI updates from the cloud or from NVIDIA servers? Will it be a small once-a-month/week update, or a large file update? Or will the AI profile for a game require a constant streaming internet connection? Seems suspect.

 

Also, Jensen kept stressing that they need a new way to measure performance because of this card; he said it several times throughout the lecture. Probably because if you took a standard game today (one that doesn't have some special experimental tech like ray tracing and DLSS), this card would most likely not "perform" like a $1,200 card.

 

In summary, I think NVIDIA is just cash-grabbing with this card, because honestly they have no competition. I already feel like they cash-grabbed a bit with the 1080 series by not mentioning how abysmal the SLI support is for them. Really disappointed in the state of gaming and gaming hardware at the moment. Hopefully I'm wrong...


My favourite bit about the presentation they did was the slides comparing the 2080 to the 1080 in 4K. No wonder the 2080 is faster with its 8GB of GDDR6 (448GB/s) vs 8GB of GDDR5X (320GB/s), so I wonder what the chart would look like compared to a 1080 Ti (11GB, 484GB/s)...

Also, some of the benchmarks were done with HDR, which hits the 1080's performance quite a bit, 10% on average.
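Those bandwidth figures follow directly from bus width and per-pin data rate, which are the public spec-sheet numbers for each card:

```python
# Memory bandwidth in GB/s = (bus width in bits / 8) * per-pin rate in Gbps.
# Bus widths and data rates below are the public spec-sheet values.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(256, 14))  # RTX 2080, 256-bit GDDR6      -> 448.0
print(bandwidth_gbs(256, 10))  # GTX 1080, 256-bit GDDR5X     -> 320.0
print(bandwidth_gbs(352, 11))  # GTX 1080 Ti, 352-bit GDDR5X  -> 484.0
```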

 

I wouldn't worry about the RTX tech not being used, though; just think about how many games have GameWorks in them. It's probably going to be everywhere, and devs are probably going to spend less energy on traditional lighting, quite possibly at nVidia's insistence.



I think ray tracing has the potential to be far more influential on gameplay than most of the advances of the last 10 years. Realistic lighting that's not baked in would be pretty important to stealth games, and you could have reflections and shadows in ways not possible before.

Real-time ray tracing means a lot less time dedicated to baking in lighting and shaders for edge cases. It should free up development time and allow for far more dynamic and fluid development: you can change a map without having to redo all the assets, you don't even have to involve the artists to change geometry, and you won't be able to screw up the lighting.

 

Procedural maps could be much different in a real time ray traced world, if things can be changed without so much being baked in.

 

I don't think we know the performance hit of real-time ray tracing, because no developer has a final release or the release drivers; the BF5 demo apparently wasn't even using important features of Turing, partly because they were using the Titan V for development.

 

Price has far more to do with the oversupply of Pascal GPUs due to mining and the lack of competition from AMD. I think people's negative attitude is mainly due to this, and it has nothing to do with the new technology in these cards.

Edited by AwesomeOcelot

https://www.hardocp.com/article/2018/08/28/nvidia_controls_aib_launch_driver_distribution/

 

It would seem that nVidia is continuing its trend of shady practices by forcing reviewers to write only the things that suit nVidia.



