My outrage... outrage is probably too strong a word; distaste is a better word. My distaste for Nvidia RTX has little to do with RTX itself. I do a bit of an eyeroll when people, Nvidia and others alike, try to pass RTX off as actual ray tracing. Just to be clear, RTX is not ray tracing in the purest sense; it's a hybrid of ray tracing and rasterization. So there is that bit of stretching the truth, but that doesn't really put me off that much, because the technology, hybrid as it may be, is still quite impressive: it can produce results very similar to pure ray tracing at a fraction of the computing power.
What puts me off are the prices; they are just ridiculous, new technology or not. Realistically, you can expect very few games in the next couple of years to take advantage of RTX, and fewer still to be playable at even 1080p with max settings and RTX on with any single card. So for most people the whole RTX thing is kind of a non-issue; what matters to most people is the performance gain over Pascal, and we just don't know that yet. Nvidia themselves will tell you things like 6X the performance and 35% to 125% gains over Pascal, but I never trust the manufacturer (whether it's Nvidia, AMD, or Intel) to give me realistic numbers, because they usually cite fringe results and benchmark things in extremely specific environments that will yield the most favorable results possible, and these often do not mirror typical real-world situations.
The bottom line is that we just don't know yet how these cards perform. We can speculate from core counts, clock speeds, and memory bandwidth, but until we get reliable independent benchmarks, we just don't know. It's going to take a pretty huge leap in performance over Pascal to justify the price tags in my eyes.
I'm pretty turned off by this as well, and will be passing on this generation of cards: 1) because I just recently purchased a 1080 Ti, and 2) because I don't care about the ray tracing hype. I want my games to be stable, not erratically dropping to 10-20 fps because a developer couldn't account for which effects would bottleneck a piece of hardware and when. I have issues with enough games today that use experimental tech and end up performing terribly. *Cough* Pillars of Eternity II *cough*.
I've spent probably well over $4,000 on my system (closer to $6k including monitors), so money isn't an issue for me, but there has to be a point where you call BS, and $1,200 for that video card is *my* point.
Nvidia holds a conference/release party, and Jensen spends an hour to an hour and a half giving people and the press a borderline college course on ray tracing and the history of computer graphics. He has to paint that picture, because without it, when he showed us the new overpriced cards, no one would give a flying bird butt; they'd just move on with their day.
So, new overhyped, overpriced cards from Nvidia are being released. They have this new lighting tech, which developers would have to actually implement in their games for it to be useful, meaning only the games they showed at the end of that "lecture" will have this "feature". How many of those games were you actually interested in? For me it was only Metro. And who's to say the tech is actually implemented properly in that game? They demoed about one scene from each of the three games, and to be honest, it did not blow my mind. That being said, it will take years for developers to make new games showcasing this tech, and probably longer still for them to become proficient enough to implement it "properly".
And these cards also have this deep learning core tech. He didn't really explain how it's implemented on the card. Sure, he "lectured" on what deep learning means for PC graphics. But how does a deep learning process cultivated by some supercomputer somewhere benefit the card installed in my PC? Is there a separate set of drivers just for that chip on the card? Will an internet connection be required for the chip to get deep learning AI updates from the cloud or from Nvidia's servers? Will it be a small once-a-month or once-a-week update, or a large file update, or will the AI profile for a game require a constant streaming internet connection? Seems suspect.
Also, Jensen kept stressing that we need a new way to measure performance because of this card; he said it several times throughout the lecture. Probably because if you took a standard game today (one that doesn't have some special experimental tech like ray tracing or DLSS), this card would most likely not "perform" like a $1,200 card.
In summary, I think Nvidia is just cash grabbing with this card, because honestly they have no competition. I already feel like they cash grabbed a bit with the 1080 series by not mentioning how abysmal the SLI support is for those cards. Really disappointed in the state of gaming and gaming hardware at the moment. Hopefully I'm wrong...