Azdeus Posted October 27, 2020 (edited) 23 minutes ago, Zoraptor said: Synthetics don't mean nothing*, but certainly have to be taken with a large grain of salt. You'd have to question whether a benchmark developed for- essentially- a nVidia branded technology is going to reflect reality or whether it has the RT equivalent of 64x tessellation applied. *The classic example is probably Zen2, where the increased cache completely invalidated some benchmarks because the benchmark would literally run from cache. Radeon 7 was a debadged/ non certified Instinct card, so it did incredibly well at benchmarks that relied on compute- or memory bandwidth since it had 4 stacks of HBM2/ ~1TB of bandwidth. Yeah, you're right. And in this case we supposedly have infinity cache that we can't account for and a dismal memory bandwidth instead. Edited October 27, 2020 by Azdeus Civilization, in fact, grows more and more maudlin and hysterical; especially under democracy it tends to degenerate into a mere combat of crazes; the whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, most of them imaginary. - H.L. Mencken
LadyCrimson Posted October 27, 2020 (edited) 2 hours ago, AwesomeOcelot said: 3070 reviews are out. A little less than what Nvidia suggested, but still close to the 2080 Ti at 4K, perhaps memory bandwidth limiting 1440p and under. RTX performance with a bit of an increase over the 2080 Ti, so it performs within a few percentage points of it in RTX enabled games, slightly under it in games without RTX. The 2080 Ti is factory overclocked as well. AIB 3070's are almost certainly going to be slightly better overall than the 2080 Ti. Maybe the 4080ti will be able to hold 120 in 4k in even most AAA games of that future, and I'll finally get some higher hz monitors/TVs and see what the fuss is about (there was no way I was going to first get "used" to higher hz/fps if it still doesn't work for 4k, hah). Btw, all the "120hz" tv's ... are they really 120hz? Because tech specs may say it, but dig deeper and often they are actually "native 60" with 120 being in some True Motion/Game Boost mode or something. I have no idea what that means in terms of tech, outside of the fact I can't stand things like True Motion on TVs... Edited October 27, 2020 by LadyCrimson “Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts
Keyrock Posted October 27, 2020 1 minute ago, LadyCrimson said: Btw, all the "120hz" tv's ... are they really 120hz? Because tech specs may say it, but dig deeper and often they are actually "native 60" with 120 being in some True Motion/Game Boost mode or something. I have no idea what that means or how it affects stuff, outside of the fact I can't stand things like True Motion on TVs... Only HDMI 2.1 capable TVs can do 4K@120Hz without magic tricks. At the moment that means: LG C9 LG CX Sony X900H (XH90 in some regions) Samsung Q70T Samsung Q80T Samsung Q90T There may be more, but those are all of the ones I know of. RFK Jr 2024 "Any organization created out of fear must create fear to survive." - Bill Hicks
AwesomeOcelot Posted October 27, 2020 11 minutes ago, LadyCrimson said: Maybe the 4080ti will be able to hold 120 in 4k in even most AAA games of that future, and I'll finally get some higher hz monitors/TV's and see what the fuss is about (there was no way I was going to first get "used" to higher hz/fps if it still doesn't work for 4k, hah). Btw, all the "120hz" tv's ... are they really 120hz? Because tech specs may say it, but dig deeper and often are actually "native 60" with 120 being in some True Motion/Game Boost mode or something. I have no idea what that means in terms of tech, outside of the fact I can't stand things like True Motion on TV's... My target with my 3080 is going to be 4K 120hz with VRR/gsync, whether that's with DLSS, dynamic resolution, or lowering quality settings. Digital Foundry has great videos on how to set up games to get max visual fidelity with the least performance impact. Sometimes game settings take a lot of performance but don't do much for visuals, especially in motion. TVs often advertise refresh rates that just repeat frames: my father had a 360Hz plasma over 10 years ago, but it could only accept frames at 60hz. It was incredibly smooth for the time though. Response times, black to white, grey to grey, also factor into how good a monitor is in motion. OLED and plasma are a different class in some regards, way better. To get refresh rates of 120hz at 4K, TVs need HDMI 2.1, which my TV, the LG CX, has.
LadyCrimson Posted October 27, 2020 Ah, ok. So something like the Q90 has a 240 "magic trick" mode instead. Got it, thanks.
AwesomeOcelot Posted October 27, 2020 (edited) 6 minutes ago, LadyCrimson said: Ah, ok. So something like the Q90 has a 240 "magic trick" mode instead. Got it, thanks. It's too complicated for me to explain because I'm not an expert, but the way video works, and how motion blur can be handled, means that while it's not "true" 240hz, it's still better than 120hz. In games, where frame to frame is not predictable in the same way, you don't want the TV to use the 240hz mode, as it will actually harm response. All the experts say to turn processing off on TVs for gaming; usually that's what the "game" modes do, so that response is improved. Edited October 27, 2020 by AwesomeOcelot
ComradeYellow Posted October 27, 2020 Most experienced people in high fps gaming say that 144hz/fps is the sweet spot and those who target too extreme one way or another (60hz or 240+hz) are just foolz considering current hardware.
Keyrock Posted October 27, 2020 You can do 1080p@240Hz natively with HDMI 2.1, but with 4K you will definitely need magic tricks for 240Hz.
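For context on why HDMI 2.1 keeps coming up: a rough back-of-the-envelope on uncompressed link bandwidth. This is a sketch that assumes 8-bit RGB and ignores blanking intervals and DSC compression, so real cable requirements differ somewhat, but it shows why 4K@120 is out of reach for HDMI 2.0 (~18 Gbps) and fine for HDMI 2.1 (48 Gbps):

```python
def bandwidth_gbps(width, height, hz, bits_per_channel=8, channels=3):
    """Uncompressed video bandwidth in Gbps (active pixels only)."""
    bits_per_frame = width * height * bits_per_channel * channels
    return bits_per_frame * hz / 1e9

# 4K @ 120 Hz, 8-bit RGB: ~24 Gbps, over HDMI 2.0 but well within HDMI 2.1
print(round(bandwidth_gbps(3840, 2160, 120), 1))
# 1080p @ 240 Hz, 8-bit RGB: ~12 Gbps, hence 240Hz is doable at 1080p
print(round(bandwidth_gbps(1920, 1080, 240), 1))
```

10-bit HDR pushes those numbers up by another 25%, which is why higher refresh HDR modes sometimes fall back to chroma subsampling.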
AwesomeOcelot Posted October 27, 2020 Diminishing returns, it's the same with 720p vs 1080p vs 2160p vs 4320p, 60hz vs 120hz vs 240hz vs 360hz, polygon counts, rays, anti-aliasing. My target refresh was 60/70hz for 20 years. Then it was 144hz, now it's 120hz. The jump from 60hz to 120hz is a lot bigger than 120hz to 240hz. The changes earlier on are way more noticeable, and going from budget to mid range is usually way bigger than from mid range to high end. Enthusiasts pay a premium for being early adopters and for low-volume products. That's why my monitor is a TV: I would prefer a monitor, but TVs are better now anyway, and monitors don't sell anywhere near as many units, so they are way more expensive.
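The diminishing returns point is easy to put in numbers: each refresh rate doubling saves half as many milliseconds per frame as the previous one. A quick sketch:

```python
def frame_time_ms(hz):
    """Time one refresh cycle takes, in milliseconds."""
    return 1000 / hz

# Milliseconds saved per frame at each jump shrink fast:
for lo, hi in [(30, 60), (60, 120), (120, 240), (240, 360)]:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo} -> {hi} Hz saves {saved:.2f} ms per frame")
```

60 to 120 Hz cuts about 8.3 ms off every frame; 240 to 360 Hz cuts only about 1.4 ms, which is why the early jumps feel so much bigger.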
LadyCrimson Posted October 27, 2020 (edited) 19 minutes ago, AwesomeOcelot said: It's too complicated for me to explain because I'm not an expert, No worries, even if you were, I wouldn't understand it. What Keyrock said is good enough, I trust y'all being more knowledgeable than myself. I'm still perfectly fine with 60hz as long as the 60fps is solid. But it's harder to find low hz quality monitors for example and I figure at some point I'm going to have to switch (yes I know I could run 120 at 60, just saying). I mean, I can *get* higher than 60 fps in games but outside of being an fps overhead I gather it doesn't mean anything vs 100+fps on an actual higher hz monitor. I'm dubious my old eyes would even notice a serious difference but one day I'll find out I guess. Edited October 27, 2020 by LadyCrimson
AwesomeOcelot Posted October 27, 2020 19 minutes ago, LadyCrimson said: No worries, even if you were, I wouldn't understand it. What Keyrock said is good enough, I trust y'all being more knowledgeable than myself. I'm still perfectly fine with 60hz as long as the 60fps is solid. But it's harder to find low hz quality monitors for example and I figure at some point I'm going to have to switch (yes I know I could run 120 at 60, just saying). I mean, I can *get* higher than 60 fps in games but outside of being an fps overhead I gather it doesn't mean anything vs 100+fps on an actual higher hz monitor. I'm dubious my old eyes would even notice a serious difference but one day I'll find out I guess. It depends what games and whether you have Vsync/VRR (GSync/Freesync). When I used to play competitive FPS I played at 300fps on a 70hz monitor. Having more FPS can affect the input response time, e.g. from click to boom headshot, and the display response time, e.g. from click to seeing a flash. While the 60hz display is only outputting frames at that rate, a higher frame rate can reduce lag if a newer frame lands before the frame is pushed. For the same reason you can get frame tearing, where 2 different frames are displayed at once. People might not be able to see the difference between 60hz and 120hz the same way they do 30hz and 60hz, but I think most people can "feel" the difference. They definitely feel the difference in VR.
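As a toy illustration of why rendering well above the refresh rate still helps (this is a deliberate simplification, not a model of a real display pipeline): with vsync off, the frame the display scans out is, on average, about half a render interval old, so a higher render rate means fresher frames even on a slow panel.

```python
def avg_frame_age_ms(fps):
    """Average age of the newest frame when the display grabs it,
    under the simplifying assumption of evenly spaced frames."""
    return (1000 / fps) / 2

# Rendering at 300 fps on a 70 Hz panel vs. rendering at the panel rate:
print(f"70 fps:  {avg_frame_age_ms(70):.1f} ms average frame age")
print(f"300 fps: {avg_frame_age_ms(300):.1f} ms average frame age")
```

Roughly 7 ms vs. under 2 ms of frame age, which matches the "300fps on a 70hz monitor" experience above: the panel still shows 70 frames a second, but each one is much more recent.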
Azdeus Posted October 27, 2020 For those of you who are interested in a bit of insight into the 30-series supply, here's a webpage from Proshop, a company that sells to Scandinavia, Germany, Poland and Austria https://www.proshop.se/RTX-30series-overview
AwesomeOcelot Posted October 27, 2020 Proshop. RTX 3080. Received = 534, customer orders = 3,786. Scan (UK). RTX 3080. Received = 302, customer orders = 3,393. Someone on reddit commented that Nvidia has said that the 3070 has 4 times the supply of the 3080. While the demand is obviously going to be higher for the 3070, at least more people will be able to get cards.
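Using the figures quoted above, the fulfillment rates work out to roughly one order in ten filled (trivial arithmetic on the posted numbers):

```python
# (cards received, customer orders) per shop, from the post above
supply = {"Proshop": (534, 3786), "Scan (UK)": (302, 3393)}

for shop, (received, ordered) in supply.items():
    pct = 100 * received / ordered
    print(f"{shop}: {received}/{ordered} = {pct:.0f}% of orders filled")
```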
Zoraptor Posted October 27, 2020 (edited) Here it's an estimated 6-8 weeks from order to receipt for a 3080/90, and that's with a hefty additional cost above US MSRP/currency conversion/GST depressing demand. I suspect that's actually not bad at all, compared to some places. An analysis of the Ampere launch and what went wrong would be an interesting read. Lots of people blame Samsung, but ultimately the fault has to lie with nVidia and most likely Jensen Huang himself. Not only because the TSMC/Samsung decision was ultimately his, there have also been other questionable decisions. Edited October 27, 2020 by Zoraptor
Keyrock Posted October 27, 2020 In fairness to Nvidia and Samsung, this is far from a standard launch, given the pandemic. Component sales have been at holiday season levels since about April and have not let up. The demand this year is just at a ludicrous level and this is further exacerbated by shutdowns that temporarily halted production in many places.
Azdeus Posted October 27, 2020 42 minutes ago, AwesomeOcelot said: Proshop. RTX 3080. Received = 534, customer orders = 3,786. And 357 incoming cards
Sarex Posted October 27, 2020 (edited) 2 hours ago, ComradeMaster said: Most experienced people in high fps gaming say that 144hz/fps is the sweet spot and those who target too extreme one way or another (60hz or 240+hz) are just foolz considering current hardware. Most FPS pros I've seen are on 240hz/fps and hungry for more. Edited October 27, 2020 by Sarex "because they filled mommy with enough mythic power to become a demi-god" - KP
ComradeYellow Posted October 27, 2020 36 minutes ago, Sarex said: Most FPS pros I've seen are on 240hz/fps and hungry for more. That's basically my point, clowns to the left of me want super high resolution at the expense of fps and jokers to the right want way too much Hz, resolution be damned.
AwesomeOcelot Posted October 28, 2020 360hz displays definitely show things that 240hz and 120hz don't, that's proven by high speed cameras. So pros will be interested in them. Although reaction-wise, non-pros shouldn't need to worry too much about going above 144hz; they're not making that reaction shot. Even non-pros can get an advantage by seeing a peek or someone run by a partially open door. If I was still playing competitive FPS I'd be looking at 240/360hz. For casual gamers, higher refresh rates just "feel" better, but 240/360 is a high frame target for a lot of games at reasonably enjoyable graphical settings. For pros, resolution is not a big issue in a lot of games. I remember playing at 800x600 in CS, which gave you an advantage. Pros are looking for pixel changes, and it's easier to see them at a lower resolution. They are doing many things through memory anyway. For casuals, I'd argue the difference between 1440p and 2160p isn't great in motion, and things like ray tracing are worth stepping down for. Things like dynamic resolution and DLSS are making it clear that you don't need straight 4K, you only need segments of it in certain areas and situations.
Keyrock Posted October 29, 2020 If you're Nvidia, when do you release the 3070 Ti SUPER and the 3080 SUPER? You know Jensen has those in the inside pocket of his leather jacket. Do you release them on 11/17 to **** all over the RX 6000 launch or is that too soon and you risk pissing off the people that already bought a 3070 or 3080?
ComradeYellow Posted October 29, 2020 2 minutes ago, Keyrock said: If you're Nvidia, when do you release the 3070 Ti SUPER and the 3080 SUPER? You know Jensen has those in the inside pocket of his leather jacket. Do you release them on 11/17 to **** all over the RX 6000 launch or is that too soon and you risk pissing off the people that already bought a 3070 or 3080? This is a valid point. As someone "who's Nvidia" I guess I should be a Chad and wait until next year and see what they release. That's what I did with the 2080 Super so I guess I'll repeat that process. Most games still run more than admirably on Turing, especially with the latest drivers and DLSS upgrades. It's a win-win really. If you wait, you get a 'Super'; if they don't release a 'Super', you can get a better manufactured 3080, perhaps cheaper on top of it.
Zoraptor Posted October 29, 2020 (edited) If I had to guess, 3070Ti (GA102) sometime early next year; 3080Ti... not sure we'll get one. A price drop on the 3090 is more likely, imo; if there is a 3080Ti I'd suspect it would have the full bus and 12GB VRAM as its main selling point. Assuming, from the leaked slides, that Ti is the preferred naming scheme this gen. A 3070Ti as a rescue bin for failed GA102 makes sense, especially if the rumours of yield problems are true. An extra 2GB of RAM and some speed over the 3070 give it selling points, and that approach adds stock to the obviously depleted GA102 pool. A 3080Ti however would just spread that pool around even more shallowly, and while there is a big gap in price from 3080 to 3090 the performance gap is small. My suspicion is that SUPER cards, if they arrive (and it would be in around a year's time), will be reserved for a node switch to TSMC for the GA102 based product line (with the GA102 Samsung line probably being retired). That's if they don't simply decide to rename the whole GA102 suite to the 4000 or 3XX0 series if they switch nodes. Edited October 29, 2020 by Zoraptor
ComradeYellow Posted October 29, 2020 17 minutes ago, Zoraptor said: If I had to guess, 3070Ti (GA102) sometime early next year Perfect release in conjunction with Rocket Lake, if true. 250W TBP (superclocked Rocket Lake) + 250W TBP (likely) for the 3070Ti has a nice ring, dunnit? And who knows when the Prosiphon Elite will arrive, considering it's constantly delayed.
AwesomeOcelot Posted October 29, 2020 You release the SUPER/Ti versions in the summer. Nvidia probably knows a lot more about AMD than the general public. RT took about a year to get going. It took 2 years for Nvidia to get DLSS right. VRS hasn't had a good implementation with the promised performance improvements. AMD spent 6 months catching up to Nvidia on VRR; they're not going to have these features up to Nvidia's level at launch. I don't think AMD will ever have a DLSS equivalent this generation; they don't have tensor cores, and they haven't spent the years of development on it. Ray tracing in console games is very limited because the console hardware is mid range. There are 3 RT games at launch for AMD, the 6000 series is Turing-level RT, and that's not going to look good against RTX. People are angry with Nvidia; they must not have tried to get a good AMD GPU in the last 10 years. When reviews come out, and a month after launch, the 6000 series is going to look expensive and supply limited. It's going to be like Turing where it looks slow and all the features are "still in development".
Zoraptor Posted November 5, 2020 Supposedly the 20GB 3080Ti is now reinstated as a forthcoming product. Rumoured, rumoured cancelled, now rumoured reinstated; it's Schrödinger's GPU and you'll only know whether it exists when it turns up. (Still say that a 12GB version with the full bus makes more sense, but I guess having more VRAM than RX6_00 is more important as a marketing point)