
On 1/31/2020 at 1:59 PM, ComradeMaster said:

 

You're also gonna need a high refresh rate, whilst there's no 32" 4k 100+hz monitor on the market yet

4K 100+ Hz monitors have been around for a short while, albeit with some caveats. The biggest hurdle is that existing HDMI and DisplayPort technologies don't have the bandwidth to handle that much information at once. Several solutions already exist to address this problem until connections that can handle that much bandwidth hit the market:

1) Using multiple simultaneous connections. It's the simplest solution, though it initially came with all sorts of issues. Those have mostly been ironed out, but this approach currently makes certain technologies, like HDR, impossible.

2) Chroma subsampling. It works over a single connection and allows HDR and such, but it comes at the price of a slight degradation in image quality.

3) Display Stream Compression. This is the newest and best solution. There is no loss in image quality, it operates over a single connection, and it doesn't prohibit HDR and such. The one drawback is minuscule input lag, but we're talking about something along the lines of 1/10th of a millisecond, which is imperceptible to even the most pro of pro gamers, let alone a schlub like me. As with any brand new technology, it's quite expensive right now. This 43" monitor is pretty much exactly what I would want (maybe a tad smaller, 38-40" ish would suit my needs perfectly, but I can live with 43"), but it's $1500. Hopefully by fall they'll have 4K DSC monitors that are a bit cheaper.
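To put rough numbers on the bandwidth problem, here's a quick back-of-the-envelope sketch. The ~7% blanking overhead is an approximation of reduced-blanking timings, the link figures are the effective data rates of HDMI 2.0 (14.4 Gbps) and DisplayPort 1.4 HBR3 (25.92 Gbps) after 8b/10b encoding, and ~3:1 is DSC's typical visually-lossless target:

```python
def video_gbps(w, h, hz, bits_per_component, blanking=1.07):
    # 3 color components per pixel, plus ~7% blanking overhead
    return w * h * hz * 3 * bits_per_component * blanking / 1e9

links = {"HDMI 2.0": 14.4, "DP 1.4 HBR3": 25.92}  # effective Gbps after 8b/10b

for bpc in (8, 10):  # 8-bit SDR vs 10-bit HDR
    need = video_gbps(3840, 2160, 120, bpc)
    print(f"4K120 {bpc}-bit RGB needs ~{need:.1f} Gbps "
          f"(~{need / 3:.1f} Gbps with ~3:1 DSC)")
    for name, cap in links.items():
        print(f"  fits {name}: {need <= cap}")
```

Under these assumptions, 10-bit HDR at 4K120 overshoots even DP 1.4, which is exactly why options 2 and 3 exist.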

Edited by Keyrock

rowsdower_sig.jpg.0f13980282a9229af0f1609eb6dee060.jpg
I wonder if there is beer on the sun


^ When medium/large 4k screens/tv's at 120hz are in the $500-800 range, that's when I'll care about high hz. It'll also be when I upgrade the gpu. Or build yet another PC if I must (hopefully won't have to tho, since I'm not a huge action or competitive gamer).

Ideally a 43"-46" "4k" as a center screen and two 30-32" 1440 on the sides. 

Until then, I'm still perfectly happy with 60hz monitors/TV. My eyes are just fine with it since I've never bothered with higher yet and I'm not going to upgrade (and then maybe grow unhappy with 60hz like how going bigger screen made small screens "unacceptable" to me...) until I'm sure I can do/achieve what I want.

Edit: I'm guessing it'll be the 40xx nvidia line  by then...

Edited by LadyCrimson
“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts
  • 2 months later...

3080 Ti.

When they say 50%-70% faster than the 2080 Ti I'm assuming they mean with RTX enabled, but it's still a significant upgrade.

 


"America would be unrecognizable if it had ordered the separation of corporation and state like it orders separation of church and state."


To be clear, what was announced here was not gaming cards. Now, will Ampere gaming cards eventually be made and sold? Most likely, but it's uncertain when. I guess it depends on how competitive Big Navi winds up being. Right now, Nvidia has little motivation to put out Ampere gaming cards, since AMD has nothing that can compete in the high-end.

Edited by Keyrock


Even if AMD could compete on raw processing, Nvidia's software and features like DLSS 2.0 and RTX are so far ahead. AMD's only hope is that Nvidia decides to expand into other segments and forgets about the PC space for 6 years, but what kind of company would... do... th... 🤭

The consoles were able to get so much out of mediocre hardware, AMD hardware, but that software effort was never put into PC games. Consoles had dynamic shading and resolution, checkerboard rendering, and some pretty clever tricks that would have made really good AMD features.


Yes, I'm still boggled as to why PC gamers wouldn't go AMD, when their hardware is being used in both next-gen consoles. It just seems like a no-brainer from a consumerist perspective.


Big Navi is confirmed to have HW ray tracing, as to what form it will take and how it will stack up against RTX... :shrugz:

As for DLSS, the implementation on the RTX 2xxx cards is pretty trash, but DLSS 2.0 is supposedly much better. No clue if AMD has an upscaling solution to counter. Honestly, I have doubts I'd make much, if any, use of DLSS, but it's cool tech.

I think the best case scenario for consumers is Big Navi releasing with the top card slightly beating 2080 Ti and at an attractive price point (similar story down the line to mid and low end segments). That would force Jensen's hand to release 3xxx sooner rather than later and keep the prices from being completely preposterous.

Edited by Keyrock


DLSS was always software. The hardware it runs on is shared by DLSS 1.0, 1.9 (the unofficial name for Control's implementation), and 2.0. There were statements about supercomputers and AI, but that was never implemented. Developers struggled to implement it in engines that were already well into development, like RTX ray tracing but even worse. At this point it's effectively free TAA, much better than the standard kind, and it actually boosts frame rates. It will probably get to a point where everybody should have some form of DLSS enabled, even if the base resolution is quite high and the quality setting is at its highest.
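For a sense of what "base resolution" means in practice, here's a small sketch using the per-axis scale factors commonly reported for DLSS 2.0's modes (roughly 1/1.5 for Quality, 1/1.72 for Balanced, 1/2 for Performance; treat these as approximations):

```python
# Commonly reported DLSS 2.0 per-axis scale factors (approximate)
modes = {"Quality": 1 / 1.5, "Balanced": 1 / 1.72, "Performance": 1 / 2}

out_w, out_h = 3840, 2160  # 4K output
for mode, s in modes.items():
    w, h = round(out_w * s), round(out_h * s)
    saved = 1 - s * s  # fraction of pixels that never get shaded
    print(f"{mode}: renders ~{w}x{h}, shades ~{saved:.0%} fewer pixels")
```

The frame-rate boost comes from shading half to three-quarters fewer pixels, with the upscaler reconstructing the rest.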

Edited by AwesomeOcelot
6 hours ago, Keyrock said:

As for DLSS, the implementation on the RTX 2xxx cards is pretty trash, but DLSS 2.0 is supposedly much better. No clue if AMD has an upscaling solution to counter. Honestly, I have doubts I'd make much, if any, use of DLSS, but it's cool tech.

DLSS is garbage, I never use it or ray tracing, though the memory bandwidth that RTX uses is supposed to be enhanced significantly for the 3*** series, thereby improving FPS significantly, so perhaps 144Hz+ gamers can start to dabble in it then, w/o DLSS.

Any feature that forces you to lower the render resolution means you're doing it wrong.

Edited by ComradeMaster

  • 1 month later...

Pulled this off the 'What is to come' thread since it probably fits better here*...

On 1/4/2020 at 3:16 PM, Zoraptor said:

Eh, pretty strong rumour is that nVidia's 7nm is not announced due to them committing to Samsung 7nm instead of TSMC and Samsung's 7nm process being about as broken as Intel's 10nm- so nVidia cannot get top tier chips off it, TSMC's 7nm is by all accounts fully booked until their next plant comes online and Apple finishes up their orders, so even if nVidia wanted to do a 7nm release they cannot. And given Jensen's abysmal interpersonal skills who knows if he's burnt the bridges to TSMC along with the bridges to Intel, MS and Sony. Not that it really matters so far as nVidia is concerned, at the moment.

That situation is getting closer to confirmed now- consumer 3000 series very likely to be on Samsung '8nm', only pro and above level cards at TSMC. Ultimate source is kopite7kimi on twitter, who is usually reliable, and there are a fair few supporting articles out now as well (eg Igorslab). Power draws may be a near throwback to Thermi as a result, which I guess will at least give a good reason to buy 1000W+ PSUs again.

*That first quoted sentence has more run ons than a rugby match in Dunedin during O week.

  • 1 month later...

There is a countdown posted on Nvidia's Twitter account and, with a bit of math, it ends on Aug 31. Nothing confirmed yet, but this very likely means we'll be getting the announcement of the imminent launch of the RTX 3000 cards (or whatever numbers they use) from leather jacket enthusiast Jensen Huang on either Aug 31 or Sep 1.

Ampere has been one of the most, if not THE most "leaked" Team Green product lines, to the point that I suspect Nvidia may have been purposely "leaking" information themselves, in some manner of pseudo-clandestine marketing scheme. In any case, the "leaks" have been repeatedly pointing to a September launch, so this all lines up perfectly.

Edited by Keyrock


Yes, the word "leaked" doesn't mean anything anymore; it didn't take companies long to see the market benefits of "leaking" information.

6 hours ago, Keyrock said:

There is a countdown posted on Nvidia's Twitter account and, with a bit of math, it ends on Aug 31. Nothing confirmed yet, but this very likely means we'll be getting the announcement of the imminent launch of the RTX 3000 cards (or whatever numbers they use) from leather jacket enthusiast Jensen Huang on either Aug 31 or Sep 1.

Ampere has been one of the most, if not THE most "leaked" Team Green product lines, to the point that I suspect Nvidia may have been purposely "leaking" information themselves, in some manner of pseudo-clandestine marketing scheme. In any case, the "leaks" have been repeatedly pointing to a September launch, so this all lines up perfectly.

 


So, theoretically this is the RTX 3090 Founder's Edition:

[leaked photos of the card and its massive triple-slot cooler]

Holy ****balls, look at the size of that monstrosity! Good luck fitting that in a mini-ITX. :lol:

The price tag floating around the rumor mill is a STAGGERING $1400. Granted, it supposedly packs 5248 CUDA cores, which is a significant increase from the 2080 Ti which it is replacing (?). Still, I dearly hope either that price is wrong or Big Navi is surprisingly good when it arrives (preferably both), if for no other reason than to knock Nvidia's prices down. I have no doubt that this card is a beast, but that price tag is painful to look at, and it follows that the 3080, 3070, and 3060 will have similarly inflated prices.
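For reference, the shipping 2080 Ti has 4352 CUDA cores, so the rumored count works out to roughly a 21% bump:

```python
# Rumored 3090 core count vs. the 2080 Ti's 4352 CUDA cores
increase = 5248 / 4352 - 1
print(f"~{increase:.1%} more CUDA cores")  # → ~20.6% more CUDA cores
```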


At first my brain couldn't process how that fan would even move useful airflow. Then I saw the triple slot bracket.

That rumoured price point is over four times the highest price I've ever paid for a video card. I'm resigned to the fact that I'm going to exceed that figure with the next one I buy, but I figure maybe by about 50%, not 400%. :ninja:

L I E S T R O N G
L I V E W R O N G


The 2190* is meant to be a Titan replacement rather than a 2080 replacement, so the size difference is at least a bit misleading. Also, 24GB of RAM and 350+W, allegedly.

*Betting that all nVidia's 21 years of geforce guff is leading up to yet another confusing and inconsistent renaming scheme.

7 hours ago, Keyrock said:

So, theoretically this is the RTX 3090 Founder's Edition:

 

 

Holy ****balls, look at the size of that monstrosity! Good luck fitting that in a mini-ITX. :lol:

The price tag floating around the rumor mill is a STAGGERING $1400. Granted, it supposedly packs 5248 CUDA cores, which is a significant increase from the 2080 Ti which it is replacing (?). Still, I dearly hope either that price is wrong or Big Navi is surprisingly good when it arrives (preferably both), if for no other reason than to knock Nvidia's prices down. I have no doubt that this card is a beast, but that price tag is painful to look at, and it follows that the 3080, 3070, and 3060 will have similarly inflated prices.

I gave up hope on Big Navi being worth anything once I realized they're waiting for nVidia's move so they can set their prices afterwards; I doubt they'll stand a chance when it comes to performance.

Civilization, in fact, grows more and more maudlin and hysterical; especially under democracy it tends to degenerate into a mere combat of crazes; the whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, most of them imaginary. - H.L. Mencken

2 hours ago, Azdeus said:

I gave up hope on Big Navi being worth anything once I realized they're waiting for nVidia's move so they can set their prices afterwards; I doubt they'll stand a chance when it comes to performance.

I highly doubt Big Navi will have anything to challenge the 3090 or whatever it will be called. I'm hoping they have worthy challengers to the 3080 and 3070 that are competitively priced. I have no intention of spending $1400 on a video card.

I think we'll be waiting for at least RDNA3 for AMD to compete at the highest end again. I'd be quite happy to be wrong.

38 minutes ago, Keyrock said:

I highly doubt Big Navi will have anything to challenge the 3090 or whatever it will be called. I'm hoping they have worthy challengers to the 3080 and 3070 that are competitively priced. I have no intention of spending $1400 on a video card.

I think we'll be waiting for at least RDNA3 for AMD to compete at the highest end again. I'd be quite happy to be wrong.

The thing is, if they actually had anything really competitive to show, they'd have shown it by now, I think. I doubt they, yet again, have anything to properly challenge the "3080" equivalent. The only thing I'm fairly confident about when it comes to their new cards is that I doubt they will need a proprietary ****ing Molex connector. 🤬


I think AMD have a pretty decent chance of being very competitive this gen, at least if the rumours are true.

1) node wise nVidia has a bit of a sidegrade for the consumer line, and the top cards are supposed to be the highest wattage consumer cards ever. The node improvement is always expressed as a performance gain at the same power draw, or an efficiency gain at same performance. In this case performance may be up, but so is power.

2) in contrast the nextbox specs have its total system wattage less than a 5700XT despite the extra 12 CUs plus an 8 core CPU and other components being included; and with comparable sustained frequency. That is, at least, a big gain in efficiency; which can presumably be traded for more performance in a desktop card.

3) an RDNA1 Big Navi would have been very competitive with Turing. The 5700XT was only a mid-range 40CU offering after all, and while 80CUs would not have doubled performance, a 2080Ti was only ~50% faster. An 80CU Big Navi1 could realistically have been faster than a 2080Ti (with a 350+W power draw...)

4) bringing the titan down into the consumer line from prosumer suggests nVidia think they need an overkill option

5) most of the concrete metrics for ampere I've seen are specifically for pro cards whose chips were fabbed at TSMC, not Samsung.

6) if the timelines are right nVidia is releasing consumer ampere near literally next week while AMD still has a couple of months to go (after the consoles' release, one suspects, since MSony funded RDNA2). A bit of a difference in hype and leakage levels is to be expected, and AMD probably wants to draw a bit of a line under the Raja-esque overpromising.
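Point 3's arithmetic can be sketched in a few lines. The figures are assumptions for illustration: the 40CU 5700XT normalized to 1.0, the 2080 Ti at ~1.5x, and a range of hypothetical CU-scaling efficiencies, since doubling CUs never doubles performance:

```python
# Would a hypothetical 80CU RDNA1 card beat a 2080 Ti? (illustrative numbers)
perf_40cu = 1.0  # 5700XT, normalized
target = 1.5     # 2080 Ti, ~50% faster than a 5700XT

for eff in (1.0, 0.9, 0.8, 0.7):  # how well performance scales with CUs
    perf_80cu = perf_40cu * (80 / 40) * eff
    verdict = "beats" if perf_80cu > target else "trails"
    print(f"80CU at {eff:.0%} scaling: {perf_80cu:.2f} ({verdict} 2080 Ti)")
```

Even at a fairly pessimistic 80% scaling, the hypothetical card comes out ahead, which is the crux of the argument.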

31 minutes ago, Zoraptor said:

I think AMD have a pretty decent chance of being very competitive this gen, at least if the rumours are true.

1) node wise nVidia has a bit of a sidegrade for the consumer line, and the top cards are supposed to be the highest wattage consumer cards ever. The node improvement is always expressed as a performance gain at the same power draw, or an efficiency gain at same performance. In this case performance may be up, but so is power.

2) in contrast the nextbox specs have its total system wattage less than a 5700XT despite the extra 12 CUs plus an 8 core CPU and other components being included; and with comparable sustained frequency. That is, at least, a big gain in efficiency; which can presumably be traded for more performance in a desktop card.

3) an RDNA1 Big Navi would have been very competitive with Turing. The 5700XT was only a mid-range 40CU offering after all, and while 80CUs would not have doubled performance, a 2080Ti was only ~50% faster. An 80CU Big Navi1 could realistically have been faster than a 2080Ti (with a 350+W power draw...)

4) bringing the titan down into the consumer line from prosumer suggests nVidia think they need an overkill option

5) most of the concrete metrics for ampere I've seen are specifically for pro cards whose chips were fabbed at TSMC, not Samsung.

6) if the timelines are right nVidia is releasing consumer ampere near literally next week while AMD still has a couple of months to go (after the consoles' release, one suspects, since MSony funded RDNA2). A bit of a difference in hype and leakage levels is to be expected, and AMD probably wants to draw a bit of a line under the Raja-esque overpromising.

0; I really hate the new quote functions, it bugs out for me all the time, I'd prefer to separate it all, but thank you for preemptively using sequenced numbers 😄

1; Yeah, they'll be power hogs, hence the proprietary Molex connector people have talked about, but do enthusiasts really care? I know I don't really care, unless there's an equally fast card at less power.

2: I don't know how true it is, but I've seen people claim that the nextbox GPU isn't the same GPU as the RDNA2 that'll end up on desktop, but a slightly more advanced version than what we are getting. I don't really believe it, but at the same time, AMD has no reason to give a **** about desktop since their mindshare is pretty abysmal compared to nVidia. Anyway, the same people have been saying that the chips we're getting aren't going to be as good, but we'll see.

3: Yet they didn't go there, and that's what they can do with a much more advanced node, which isn't really impressive. And while 80CUs would perhaps have more raw performance, if you, say, count TFLOPS, that doesn't necessarily translate to good behavior in games. My Vega 64 is about 10% slower than a 2080ti when it comes to TFLOPS, but when it comes to gaming the 2080ti demolishes the Vega, for several reasons, among them the higher clock speed. I think Sony is onto something there with clocking the living **** out of their RDNA2 chips and having fewer of them.

4: Might be right, but I see it more as them wanting to punt AMD in the gentleman's parts to show them who's boss; I doubt they're even remotely worried about the competition that AMD can put up. I think it's more likely that they are trying some strategy to be able to increase prices yet more.

5: This I honestly haven't kept enough tabs on to know anything.

6: There is a difference between hype and leakage levels, true, but beyond Lisa saying we will see RDNA2 in 2020 we don't have any official facts about it. I don't doubt they will lose a lot of sales to nVidia just based on the fact that people will have access to nVidia's cards for several months while knowing absolutely nothing about RDNA2.


The fact that Lisa Su hasn't been hyping Big Navi doesn't worry me, since AMD is in bed with, as @Zoraptor put it, MSony, and wouldn't want to step on the console launches. Leather jacket enthusiast Jensen Huang, on the other hand, would like nothing more than to take a big steaming **** all over the launches of the PS5 and xXxBOX Series xXx.

I'm not too worried about Big Navi having something to counter the 3090. It would be nice, but I'm not shopping in that segment of the market. Ideally, a 3080 would be ~10% faster than a 2080 Ti and a 3070 nearly on par with it. If Jensen gets to set the price with basically no competition, then I fear we may be looking at something like $900 and $700 respectively. Hopefully AMD is competitive enough to drive those prices down to where whichever team color I choose, they at least have the common decency to apply a bit of lube before they violate my wallet.


The fact that they are in bed with MSony is what worries me; consoles have entirely different goals for their GPUs than what we have for desktops. I don't have a power/heat constraint like consoles do. I don't want an nVidia GPU, because I really dislike the company and their policies, but I'll gladly hold Jensen's hands so he can drop a big fat turd on consoles :p

The biggest problem isn't the 3090, I'd like a card to smash it and I'd pay for it as well, but if AMD doesn't try to compete at all, well, the 3080 will be ludicrously expensive. Again.

A heaping handful of lube will be needed...


By and large the difference between a console and desktop GPU is similar to that between a desktop and laptop version of a GPU- lower clocks, better efficiency; and with the proviso that the console 'laptop' has a highly customised SoC type set up. Even much maligned Vega is extremely efficient, when run in APUs or undervolted/ underclocked and may actually be the best selling AMD arch ever due to being used so long in APUs. The problem with Vega was that it was a dual purpose arch with professional/ compute/ science uses and emphasised raw compute power over effective gaming power. In the end that was the result of not having much money, and (correctly) betting on Zen.

The elephant in the room may well be texture compression. Much of nVidia's advantage comes from getting way better effective bandwidth from the same memory. If AMD has cracked that problem it may well be game on.

Quote

1; Yeah, they'll be power hogs, hence the proprietary Molex connector people have talked about, but do enthusiasts really care? I know I don't really care, unless there's an equally fast card at less power.

People will care if they're running a 250W Intel CPU and 350+W nVidia. 'Money no object' people won't care, but they never care. My Vega64 will run fine on a 550W PSU even with power limits off, but then I also have a 65W CPU only. A fair few people will not be at all used to the old days of having to meticulously check not just wattage but 12V amperage and connector numbers.
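To put numbers on that: a rough 12V budget under assumed (not measured) wattages for the parts mentioned:

```python
# All wattages are illustrative assumptions
gpu_w, cpu_w, rest_w = 350, 250, 75  # rumored GPU, hot Intel CPU, fans/drives
headroom = 1.2                       # ~20% margin for load transients

total_w = (gpu_w + cpu_w + rest_w) * headroom
print(f"~{total_w:.0f} W sustained, ~{total_w / 12:.0f} A on the 12V rail")
```

Under these assumptions a nominal 750W PSU with a weak 12V rail would be marginal, which is the point about checking amperage, not just wattage.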

On the more meta level, I suspect a lot of people who are shocked and appalled at AMD's energy inefficiency will suddenly be of the opinion that power draw is irrelevant if nVidia really does go full Thermi this gen.

Quote

2: I don't know how true it is, but I've seen people claim that the nextbox gpu isn't the same gpu as RDNA2 that'll end up on desktop, but a yet slightly more advanced version than what we are getting.

No doubt it will be different, same as current gen console GPUs weren't literally rebadged 78xx and 570. They're not going to have the same memory configuration and direct access to the CPU, but then they never have. I'd be skeptical of them being architecturally nerfed though.

Quote

3: Yet they didn't go there, and that's what they can do with a much more advanced node, not really impressive.

I'd agree if you said that about RadeonVII, but...

The corollary being that with a switch to a more advanced, albeit not as much more advanced, node nVidia is claiming ~50% improvement (OK, more for RTX to be fair), but at ~50% extra power draw. On the power/performance balance that is no improvement at all except for tensor stuff. In contrast the 5700XT is just below 1080Ti/2070S level, but Vega64 was ~1080 level; plus the 5700XT has 40CU vs 64 and is 180W vs 210W. That's a big improvement CU-for-CU (more than 60%) and a decent (~15%) improvement in efficiency as well.

The big question being why there wasn't at least a 60CU Navi1 card- probably capacity issues at TSMC, and unlike with the RadeonVII Navi Instinct cards are going to be on a different arch with CDNA, so there aren't/ won't be non certified versions to sell to consumers.

(For everything: open question as to how it plays out on release and how accurate the leaks are, of course)


Yeah, one of the problems with nVidia's offerings is the RTX bit; it's kind of hard to factor that in when comparing to someone that doesn't have it. I wonder how much extra power it draws.

nVidia doesn't only have better texture compression than AMD; DLSS 2.0 is bound to help a lot as well.

I don't doubt that capacity was why there was no Big Navi1, especially when you factor in yield. Going from 251 to 380ish mm2 would affect it quite a bit.
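The yield argument can be sketched with the classic Poisson yield model, Y = exp(-A·D0). The defect density below is an assumed illustrative value, not a published figure for TSMC 7nm:

```python
import math

D0 = 0.1  # assumed defects per cm^2
for die_mm2 in (251, 380):
    yield_frac = math.exp(-(die_mm2 / 100) * D0)      # Poisson yield model
    gross = int(math.pi * 150 ** 2 / die_mm2)         # rough dies per 300mm wafer
    good = int(gross * yield_frac)
    print(f"{die_mm2} mm^2: ~{yield_frac:.0%} yield, ~{good} good dies/wafer")
```

Bigger dies lose twice: fewer candidates per wafer, and a higher chance that each one catches a defect.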


Supposedly, xXxBOX Series xXx has hardware ray tracing and some manner of super sampling. It stands to reason that PS5 and Big Navi will feature these also, given that they are all RDNA2.

