
Nvidia RTX Series



#21
Azdeus

    (6) Magician

  • Members
  • 702 posts
  • Location:Sweden. The ass-end of nowhere, on the right hand side.

They have publicly claimed 'up to 6x more powerful' on their product page, although that could easily be either versus a previous gen running ray tracing via software, or compared to a DDR3 GT 1030 for all we know. I'd suspect AMD claiming Vega 64 > 1080 Ti based on TFLOPS would be more practically honest than the 6x claim.

 

Seems you were bang on. The new 2080s are up to 6x faster for ray tracing than the 1080s, according to that video teknoman posted.



#22
Hulk'O'Saurus

    (6) Magician

  • Members
  • 688 posts
  • Location:Cuitztli's Awesome Herbs
  • Steam:Yaz Mataz

Opens up options and switches ray tracing off.

 

There... I will stay away from those gizmos for a while :D.



#23
LadyCrimson

    Obsidian VIP

  • Members
  • 9061 posts
  • Location:Candyland
  • Pillars of Eternity Gold Backer
  • Kickstarter Backer

Opens up options and switches ray tracing off.

That's pretty much my take on it. I've seen that Metro demo video and was not impressed by some kind of vast improvement in "realism." I'd much rather they work on GPUs (and PC hardware in general) that can handle mega-quality textures everywhere (vs. some being muddy, some great). Like in that Metro video, there was a beautiful patch of wavy grass, and lying in it was a zombie with muddy clothing textures. Way to ruin the look.

 

Anyway... ray tracing could enhance some stuff, but it all depends on what and how game devs implement it, I'd imagine. Even today, some games overdo bloom and similar (and won't let you turn it off), while others use it more pleasantly.


Edited by LadyCrimson, 22 August 2018 - 07:05 AM.


#24
Azdeus

    (6) Magician

  • Members
  • 702 posts
  • Location:Sweden. The ass-end of nowhere, on the right hand side.



Some fairly interesting analysis of the information floating around. Though Bokishi, you should stay away from this video. ;)

 

Also: RIP AMD GPUs.


Edited by Azdeus, 22 August 2018 - 11:39 AM.


#25
Zoraptor

    Arch-Mage

  • Members
  • 2644 posts
  • Pillars of Eternity Backer
  • Kickstarter Backer
  • Deadfire Backer
  • Fig Backer

The 20 series pricing is about the best thing that could have happened for AMD under the circumstances. Since those cards are competing for people who would already be buying 1080 Ti-plus performance and price, it's irrelevant to AMD at this point; cheaper releases would have been far worse for AMD. I'd expect a consumer 7nm release from AMD as well, which ought to improve Vega's efficiency and performance significantly.

 

The ultimate problem for AMD is that nVidia will have massive scope to cut pricing, which AMD doesn't have, though Jensen doesn't seem interested in offering 'value' propositions until forced to.



#26
Azdeus

    (6) Magician

  • Members
  • 702 posts
  • Location:Sweden. The ass-end of nowhere, on the right hand side.

AMD has publicly said that there won't be a 7nm Vega and that they won't be trying to compete with nVidia in the near future. Barring them releasing an actual ray tracing card, or something that completely smashes nVidia to the point where even the generally clueless people will pick up on AMD's release, and we are talking something like 5 or 10 to 1 performance here, they will not be able to sell anything.

 

Also, why would a discerning consumer go buy a new 580 when there will soon be used 1070s/1080s on the market with better performance for less money?

 

PC gamers had their chance to keep the market healthy back when, but people still bought nVidia cards even though AMD had better performance on offer for a lower price.

 

Edit: Mixed up the 7nm Vega with something else; they are releasing that.


Edited by Azdeus, 22 August 2018 - 03:28 PM.


#27
Zoraptor

    Arch-Mage

  • Members
  • 2644 posts
  • Pillars of Eternity Backer
  • Kickstarter Backer
  • Deadfire Backer
  • Fig Backer

Yeah, they're definitely doing 7nm Vega. Consumer cards were semi-announced via roadmap last year, but they've studiously avoided saying anything about them since, or officially; pro-level cards are definite and at the engineering sample stage. Now that the RTX cards are announced we may hear something, not that it will help much. nVidia clearly had the 1080 Ti ready for Vega last time, and they have almost limitless scope for dropping prices and tuning their offerings if they need to. There will probably be a Polaris 30 as well, so 680s etc.

 

I really can't see ray tracing mattering much though, if the performance is bad. It's a hard sell spending $1000 on a card to get 60 fps at 1080p when a $200 card will do that with no ray tracing; worse still, spending $1000 extra on a G-Sync 4K 144Hz monitor to play at... 15 fps (mathematically implied) with ray tracing on.
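(For what it's worth, the 15 fps figure is just pixel arithmetic, a back-of-the-envelope estimate that assumes frame time scales roughly linearly with resolution, which it rarely does exactly:)

```python
# Rough scaling behind "60 fps at 1080p -> ~15 fps at 4K",
# assuming frame time grows roughly linearly with pixel count.
pixels_1080p = 1920 * 1080    # 2,073,600
pixels_4k = 3840 * 2160       # 8,294,400

fps_1080p = 60
scale = pixels_4k / pixels_1080p        # 4.0x the pixels
fps_4k_estimate = fps_1080p / scale

print(f"{scale:.0f}x the pixels -> ~{fps_4k_estimate:.0f} fps")   # ~15 fps
```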

 

AMD's big advantage will be supplying to consoles, and making CPUs as well. A Zen 2 6/8 core 'proper' APU with a bunch of Navi/Vega cores should be pretty compelling. They've also managed a paradigm shift (one of the few times that phrase is actually appropriate) in CPUs that nobody expected, and went from being zero competition to very serious competition, albeit with some own-goaling from Intel.



#28
Gorgon

    Forum Moderator

  • Moderators
  • 4739 posts

hYrtgNm.jpg


  • Pidesco, ShadySands, Azdeus and 1 other like this

#29
Keyrock

    Obsidian Order Rodent Tamer

  • Members
  • 7462 posts
  • Location:The Queen City
  • Steam:Keyrock
  • PSN Portable ID:Unfrozen_Keyrock
  • Pillars of Eternity Backer

The RTX memes are great.

 

3-31-800x277.jpg

 

ajEeVm0_700b.jpg


  • Pidesco, Gorgon, ShadySands and 3 others like this

#30
AwesomeOcelot

    (9) Sorcerer

  • Members
  • 1318 posts
  • Pillars of Eternity Silver Backer
  • Kickstarter Backer
  • Deadfire Silver Backer
  • Fig Backer
I don't understand the response to Real-Time Ray Tracing. The demos look insane; it's a genuine leap in lighting technology we haven't seen for years. There are questions of how widely adopted it will be and of the performance hit, but I want to wait for independent review. DLSS seems to be super-sampling for "free", meaning almost double the performance of Pascal on similar image quality. This obviously comes at the cost of bigger, more complicated chips with tensor cores, so prices have increased, although the price is also a function of not having any competition in the high-end GPU space.
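If DLSS works the way Nvidia has described it, rendering at a reduced internal resolution and reconstructing the output image on the tensor cores, then the "free" part is mostly just shading fewer pixels. A toy estimate of that idea; the internal-resolution factor below is an assumption for illustration, not a published figure:

```python
# Toy estimate of why DLSS can look like "free" supersampling:
# shading cost scales roughly with pixels rendered, and DLSS shades fewer of them,
# leaving the upscale/reconstruction to the tensor cores.
native_pixels = 3840 * 2160      # 4K output
internal_factor = 0.5            # ASSUMED fraction of native pixels actually shaded

internal_pixels = int(native_pixels * internal_factor)
relative_shading_cost = internal_pixels / native_pixels

print(f"Shaded pixels: {internal_pixels:,} of {native_pixels:,} "
      f"(~{relative_shading_cost:.0%} of native shading work)")
```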

#31
Sarex

    (12) Mage

  • Members
  • 1902 posts

This is not the first time that Nvidia has hyped some new tech to sell new cards that ended up not being all that noticeable, or even being something you would want to turn off.


  • Azdeus likes this

#32
Gfted1

    Forum Moderator

  • Moderators
  • 5892 posts
  • Location:Chicago, IL
  • Pillars of Eternity Backer
  • Kickstarter Backer

I assume the second meme is "you'll be homeless after affording to buy one", but what's the first one? RTX makes skin textures look shiny?



#33
Zoraptor

    Arch-Mage

  • Members
  • 2644 posts
  • Pillars of Eternity Backer
  • Kickstarter Backer
  • Deadfire Backer
  • Fig Backer

Which one actually looks more realistic, RTX on or RTX off?

 

That meme specifically refers, I presume, to the old bloom memes and applications from games like Morrowind, where anything vaguely pale or reflective glowed fluorescently and looked comical rather than improving the game's graphics. Like most new graphics tech, and probably ray tracing, bloom was hugely overused and looked extremely 'fake' when first introduced, because it was being used for marketing and as a buzz phrase, not because it was actually useful. Happens all the time too; sometimes the tech ends up being useful in the end (after making every game person ten-ish years ago look like they used Vaseline for skincare, per-pixel shading* is a lot better), and sometimes it kind of fades away. There is a metric asteriskton of stuff that either is or was primarily annoying/unrealistic though: bloom, HDR, motion blur, SSAO, god rays, lens flare, tessellation (also added a lot to tank AMD's performance...), per-pixel shading, etc.

 

*Might have also been normal mapping there? My memory is a bit sketchy, but I remember every person in BioShock-era games looking like they were sculpted from plasticine then painted with wet transparent lacquer.
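For anyone curious what the "everything pale glows" era actually was under the hood, a naive bloom pass is just a bright-pass threshold, a blur, and an additive composite back over the frame. A rough NumPy/SciPy sketch; the threshold and strength values are arbitrary, and cranking them is exactly how you get the Morrowind look:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def naive_bloom(image, threshold=0.8, blur_sigma=8.0, strength=1.5):
    """Crude bloom: keep only bright pixels, blur them, add the halo back.
    Lowering `threshold` or raising `strength` makes anything vaguely pale glow."""
    bright = np.where(image > threshold, image, 0.0)   # bright-pass filter
    halo = gaussian_filter(bright, sigma=blur_sigma)   # spread the bright regions
    return np.clip(image + strength * halo, 0.0, 1.0)  # composite over the frame

# Tiny example: a dark frame with one bright patch gets a soft glow around it.
frame = np.zeros((64, 64))
frame[28:36, 28:36] = 1.0
print(naive_bloom(frame)[20, 32])   # non-zero: the glow has spread beyond the patch
```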


Edited by Zoraptor, 24 August 2018 - 12:57 PM.

  • Gfted1 likes this

#34
LadyCrimson

    Obsidian VIP

  • Members
  • 9061 posts
  • Location:Candyland
  • Pillars of Eternity Gold Backer
  • Kickstarter Backer

I don't understand the response to Real-Time Ray Tracing.

Probably because more people care about fps and gameplay than they do "ultra realistic" lighting/shadow effects or whatever.

 

There's also a certain point where many consider game-graphic capability "good enough": for looks, for immersion, for function. It's not that they can't admire more graphical advances, but that they don't care about paying for it or having so much industry/financial focus be put on it vs. actual gameplay/stories and so on.

 

This is not the first time that Nvidia has hyped some new tech to sell new cards that ended up not being all that noticeable, or even being something you would want to turn off.

Also, this.



#35
Keyrock

    Obsidian Order Rodent Tamer

  • Members
  • 7462 posts
  • Location:The Queen City
  • Steam:Keyrock
  • PSN Portable ID:Unfrozen_Keyrock
  • Pillars of Eternity Backer

My outrage... outrage is probably too strong a word, distaste is a better word... My distaste for Nvidia RTX has little to do with RTX itself. I mean, I do a bit of an eyeroll when people, Nvidia and others alike, try to pass RTX off as actually just ray tracing. Just to be clear, RTX is not actually ray tracing in the purest sense; it's a hybrid of ray tracing and rasterization. So there is that bit of stretching the truth, but that doesn't really put me off that much, because the technology, hybrid as it may be, is still quite impressive in that it is able to produce results very similar to pure ray tracing at a fraction of the computing power.
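Very roughly, the hybrid split looks like this in outline: rasterization still resolves what's visible each frame, and rays only get traced for the handful of effects that want them. The function bodies below are placeholders, not anyone's actual pipeline, just the shape of the "ray tracing + rasterization" idea:

```python
# Placeholder sketch of a hybrid renderer. Not a real engine: the point is only
# that primary visibility is rasterized and rays are traced per effect on top.

def rasterize_gbuffer(scene, camera):
    # Ordinary raster pass: per-pixel depth, normals, material IDs.
    return {"depth": ..., "normals": ..., "materials": ...}

def trace_rays(gbuffer, scene, effect):
    # Rays launched from the rasterized surfaces, only for the chosen effect
    # (e.g. "shadows", "reflections", "ambient_occlusion").
    return {effect: ...}

def shade(gbuffer, ray_results):
    # Conventional shading with the traced results mixed in.
    return "final frame"

def render_frame(scene, camera, rt_effects=("shadows", "reflections")):
    gbuffer = rasterize_gbuffer(scene, camera)
    ray_results = [trace_rays(gbuffer, scene, effect) for effect in rt_effects]
    return shade(gbuffer, ray_results)

print(render_frame(scene=None, camera=None))
```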

 

What puts me off are the prices; they are just ridiculous, new technology or not. Realistically, you can expect very few games in the next couple of years that take advantage of RTX, and fewer still that are playable at even just 1080p at max settings with RTX on with any single card, so for most people the whole RTX thing is kind of a non-issue. What matters to most people are performance gains over Pascal, and we just don't know yet. Nvidia themselves will tell you things like 6X the performance and 35% to 125% gains over Pascal, but I never trust the manufacturer (whether it's Nvidia, AMD, or Intel) to give me realistic numbers, because they usually cite fringe results and benchmark things in extremely specific environments that will yield the most favorable results possible, and these often do not mirror typical real-world situations.

 

The bottom line is that we just don't know yet how these cards perform. We can speculate from core counts, clock speeds, and memory bandwidth, but until we get reliable independent benchmarks, we just don't know. It's going to take a pretty huge leap in performance over Pascal to justify the price tags in my eyes.
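On the "speculate from core counts and clock speeds" point, the usual back-of-the-envelope number is FP32 TFLOPS = shader count x boost clock x 2 ops per clock. Using the published reference boost clocks (a paper figure only; it says nothing about how games will actually run):

```python
# Back-of-the-envelope FP32 throughput: cores x boost clock (MHz) x 2 ops/clock.
# Reference boost clocks; Founders Edition cards are clocked a little higher.
cards = {
    "GTX 1080":    (2560, 1733),   # (CUDA cores, boost MHz)
    "GTX 1080 Ti": (3584, 1582),
    "RTX 2080":    (2944, 1710),
    "RTX 2080 Ti": (4352, 1545),
}

for name, (cores, boost_mhz) in cards.items():
    tflops = cores * boost_mhz * 2 / 1_000_000
    print(f"{name}: ~{tflops:.1f} TFLOPS FP32")
```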


  • Azdeus likes this

#36
Arumite

    (0) Nub

  • Initiates
  • 1 posts

My distaste for Nvidia RTX has little to do with RTX itself. [...] It's going to take a pretty huge leap in performance over Pascal to justify the price tags in my eyes.

I'm pretty turned off by this as well, and will be passing on this generation of cards: 1. because I just recently purchased a 1080 Ti, 2. I don't care about the ray tracing hype. I want my games to be stable and not have the fps drop down to 10-20 fps erratically because a developer couldn't account for what effects would bottleneck a piece of hardware and when. I have issues with enough games today using experimental tech that end up performing terribly *cough* *cough* Pillars of Eternity II *cough*.

 

I've spent probably well over $4000 on my system (closer to $6k including monitors), so money isn't an issue for me, but there has to be a point where you call BS, and $1200 for that video card is "my" point.

 

Nvidia holds a conference/release party and Jensen spends an hour to an hour and a half giving people/press a borderline college course on ray tracing and the history of computer graphics. He has to paint this picture with that information because without it, if he just showed us the new overpriced cards, well, no one would give a flying bird butt about it and would move on with their day.

 

So, new overhyped, overpriced cards from NVIDIA are being released. They have this new lighting tech (that developers would have to actually implement in their games for it to be useful), meaning only the games they showed at the end of that "lecture" will have this "feature". How many of those games were you actually interested in? For me it was only Metro. And who's to say that the tech is actually implemented properly in that game? They demoed like one scene from each of the three games, and to be honest, it did not blow my mind. That being said, it will take years for developers to make new games showcasing this tech, and probably longer still for them to become proficient enough to implement it "properly".

 

And these cards also have this deep learning core tech. He didn't really explain how this is implemented on the card. Sure, he "lectured" on what deep learning means for PC graphics. But how does a deep learning process cultivated by some supercomputer somewhere benefit the card that I have installed in my PC? Is there a separate set of drivers just for that chip on the card? Will an internet connection be required for this chip to get deep learning AI updates from the cloud or from NVIDIA servers? Will it be a small once-a-week/month update, or a large file update? Or will the AI profile for a game require a constant streaming internet connection? Seems suspect.

 

Also, Jensen kept stressing that they need a new way to measure performance because of this card; he said it several times throughout the lecture. Probably because if you took a standard game today (one that doesn't have some special experimental tech like ray tracing and DLSS), this card would most likely not "perform" like a $1200 card.

 

In summary, I think NVIDIA is just cash grabbing with this card, because honestly they have no competition. I already feel like they cash grabbed a bit with the 1080 series by not mentioning how abysmal the SLI support is for them. Really disappointed in the state of gaming and gaming hardware at the moment. Hopefully I'm wrong...


  • Azdeus likes this

#37
Azdeus

    (6) Magician

  • Members
  • 702 posts
  • Location:Sweden. The ass-end of nowhere, on the right hand side.

My favourite bit about the presentation they did was the slides comparing the 2080 to the 1080 in 4K. No wonder the 2080 is faster with its 8GB GDDR6 (448GB/s) vs 8GB GDDR5 (320GB/s), so I wonder what the chart would look like compared to a 1080 Ti (11GB, 484GB/s)...
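(Those bandwidth figures fall straight out of bus width times per-pin data rate; the memory specs below are the published ones for each card:)

```python
# Memory bandwidth in GB/s = (bus width in bits / 8) x per-pin data rate in Gbps.
cards = {
    "GTX 1080":    (256, 10.0),   # (bus width bits, Gbps per pin) -> 320 GB/s
    "GTX 1080 Ti": (352, 11.0),   # -> 484 GB/s
    "RTX 2080":    (256, 14.0),   # -> 448 GB/s
}

for name, (bus_bits, gbps) in cards.items():
    print(f"{name}: {bus_bits / 8 * gbps:.0f} GB/s")
```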

Also, some of the benchmarks were done with HDR, which hits the 1080's performance quite a bit, on average 10%.

 

I wouldn't worry about the RTX tech not being used though; just think about how many games have GameWorks in them. It's probably going to be everywhere, and the devs are probably going to spend less energy on traditional lighting, quite possibly at nVidia's insistence.



#38
Bokishi

    Graphics Lord

  • Members
  • 6446 posts
  • Location:Hutt Space
Cyberpunk 2077 uses ray tracing. But by the time it releases I'll probably be on the RTX 4080 =]

1535467578etgqfq7sbp_1_1_l.jpg

Edited by Bokishi, 28 August 2018 - 10:33 AM.


#39
Gorgon

    Forum Moderator

  • Moderators
  • 4739 posts

HD though, and barely managing. They are not going to optimise their way out of the performance hit in this generation of cards. Wouldn't you rather play in 4K?

 

I mean, assuming you don't want to buy a card that costs as much as a completely new build. 



#40
AwesomeOcelot

    (9) Sorcerer

  • Members
  • 1318 posts
  • Pillars of Eternity Silver Backer
  • Kickstarter Backer
  • Deadfire Silver Backer
  • Fig Backer

I think ray tracing has the potential to be far more influential to gameplay than most of the advances over the last 10 years. Realistic lighting that's not baked in would be pretty important to stealth games. You could have reflections and shadows in ways not possible before.

Real-time ray tracing means a lot less time is dedicated to baking in ray tracing and shaders for edge cases. It should free up development time and allow for far more dynamic and fluid development, as you can change a map without having to redo all the assets; you don't even have to involve the artists to change geometry, and you won't be able to screw up the lighting.

 

Procedural maps could be much different in a real-time ray-traced world, if things can be changed without so much being baked in.
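The un-baked shadows point is easiest to see with the simplest ray-traced query there is: from a surface point, fire one ray at the light and check whether anything sits in the way. A toy sketch with sphere occluders; nothing engine-specific, just the geometry:

```python
import math

def ray_sphere_t(origin, direction, center, radius):
    """Distance along a normalized ray to the first sphere hit, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def in_shadow(point, light_pos, occluders):
    """Shadow ray: shadowed if any occluder lies between the point and the light."""
    to_light = [l - p for l, p in zip(light_pos, point)]
    dist = math.sqrt(sum(v * v for v in to_light))
    direction = [v / dist for v in to_light]
    for center, radius in occluders:
        t = ray_sphere_t(point, direction, center, radius)
        if t is not None and t < dist:
            return True
    return False

# Move the occluder and the shadow simply follows; nothing was baked, nothing to redo.
light = (0.0, 10.0, 0.0)
occluders = [((0.0, 5.0, 0.0), 1.0)]                  # one sphere between light and origin
print(in_shadow((0.0, 0.0, 0.0), light, occluders))   # True  (shadowed)
print(in_shadow((5.0, 0.0, 0.0), light, occluders))   # False (lit)
```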

 

I don't think we know the performance hit of real-time ray tracing because no developer has a final release, or the release drivers. The BF5 demo apparently wasn't even using important features of Turing, partly because they were using the Titan V for development.

 

Price has far more to do with the oversupply of Pascal GPUs due to mining and the lack of competition from AMD. I think people's negative attitude is mainly due to this, and it has nothing to do with the new technology in these cards.


Edited by AwesomeOcelot, 31 August 2018 - 03:13 PM.




