
Posted

I have utterly no idea what GPU thingies I had before a Voodoo3, outside of on-board and some no-name cards. I'm pretty sure I still have the Voodoo in a box in some closet. I still have a GeForce 6200 in a drawer for some reason.

We're such geeks.

 

I might've tried a single AMD card once, ages ago. Always been Nvidia, though. Hubby's had a few of the cheaper-end AMDs and I could never stand their control panels, any version. Yes, I'm that shallow.

Posted

FUUUUUuuuuUUUU.....  I want it.   :sorcerer:
I put it at YouTube's "8K" on my 4K TV and oh man, even with YT's lousy compression I was drooling. Can't imagine how good it must look in person. It even looked okay dropped to 1440p.

I could sit one foot away from the screen! Should I do it? Hubby won't mind, right?  It's only money and you can't take it with you! Sigh ... well, maybe when the 4090 comes...

 

 

Posted (edited)

The likelihood is that most people's 4K monitors look as good as that TV in terms of resolution, as the PPI is going to be roughly equivalent. It is very cool that a GPU can actually run 8K, and you can now game on an 80-120" TV without sacrificing resolution. 2kliksphilip tried running 8K on a 3080, but the 10GB VRAM meant most new games broke; you definitely need 16GB minimum for 8K. Of course, at 4K the 3090 is only 10-15% faster than a 3080 at double the price.
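
To put numbers on that PPI point, here's a quick sanity check (a minimal Python sketch; the monitor and TV sizes are illustrative assumptions, not figures from this thread):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Illustrative sizes (assumptions, not from the thread):
print(f'32" 4K monitor: {ppi(3840, 2160, 32):.0f} PPI')  # ~138
print(f'65" 8K TV:      {ppi(7680, 4320, 65):.0f} PPI')  # ~136
print(f'88" 8K TV:      {ppi(7680, 4320, 88):.0f} PPI')  # ~100
```

So a mid-size 8K TV really does land in the same density ballpark as a typical 4K monitor, while the biggest panels fall noticeably below it.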

My Index came today; I haven't decided whether to test it out before I build my PC. I'm waiting for Zen 3, so I could be waiting until November. I'm sure I'll wait for it before I play Half-Life: Alyx.

Edited by AwesomeOcelot
Posted
42 minutes ago, AwesomeOcelot said:

The likelihood is that most people's 4K monitors look as good as that TV in terms of resolution, as the PPI is going to be roughly equivalent.

But I don't want to game (or watch video/TV for that matter) on a 27" or even 30" monitor. It's way too small. :*

To me these days, 4K PC monitors are pointless. I'm not into VR (yet) but I do like giant, massive screens. I'm also one of those who likes to sit really, really close to said screens, hence it ideally still has to look really good up close. No sitting 10 feet away for me.

So it ain't the resolution so much ... it's the size I want. Super size and super clarity. Gimme. ... but not for another 3-5 years, I'm sure. Still not worth it today except as a tease....

Posted

My 48" LG OLED is almost the size of my desk already. A 100" TV would be as wide as the wall my desk is against. Who needs VR when you can have a holodeck?

I genuinely don't notice the difference most of the time between 1440p and 4K when gaming. I'm fine with dynamic resolution because in action, dropping resolution isn't going to make a difference for me. I'm more of an FPS snob. Every decade I double my FPS, and then I literally cannot play at lower FPS; I usually end up not finishing games and going back to high-FPS games.

Posted

I've managed to get a Founders Edition 3080, or at least I got through checkout and have a confirmation email. All it took was over 100 attempts and F5 camping for just under a week, almost 160 hours.

Posted

The estimated arrival for my ASUS one has gone from Nov 4th, to the 2nd, to Jan 2021. Stores aren't taking inventory into physical locations, so it's all digital reservations; get in the queue, basically.



Posted

In my country stores don't even have it up for sale, let alone out of stock...


"because they filled mommy with enough mythic power to become a demi-god" - KP

Posted

Hopefully people get this mad buying frenzy out of their system and an RTX 3080 or Radeon 6900XT is actually available for me to buy in a couple of months.


Posted (edited)

GN is throwing shade on Nvidia's 8K marketing, as the performance isn't great. Not many games are actually playable: 1 out of their 6-game suite.

The EVGA 3090 in the UK is 2142 USD, probably more than my entire build with a 3080 will be. No one should buy this card for gaming.

Edited by AwesomeOcelot
Posted (edited)

At this point I still consider both 8K and ray tracing pipe dreams. My entire focus is 4K rasterization-only performance. Ray tracing is doable, especially at reduced resolution/settings, but it's something I'll probably enable once for a half hour to admire how much shinier the shiny is, then promptly shut it off and never use it again, because I'd rather play games at 4K 100+ FPS at medium to high shiny than 40-50 FPS at maximum shiny. (How's that for a run-on sentence?)

To that end, the 3080 has the performance I'm looking for, but I want to see what AMD is bringing to the table. I don't expect the Radeon 6900XT, if that is indeed the name of the Big Navi flagship, to have as good ray tracing performance as the 3080, but I don't care about that. If it can match the 3080 in rasterization and undercut Nvidia's pricing, that would be the dream (lower power draw would be icing on the cake). I guess I'll find out in about a month.

Edited by Keyrock


Posted

I only went with 4K on my new TVs because they stopped making sub-4K ones. It'll be the same with monitors for me, and I don't expect that for another 5-10 years at least. Having to replace three screens instead of just one is a big disincentive for upgrading unless it's a proper generational upgrade, and that means moving away from LCD. ;)


Posted

Dang, was holding out for the 3090, but the new reviews are a bit of a letdown. DLSS 8K is not 8K resolution; it's most likely 5K with AI sharpening. One benefit I see, though, is the 24GB VRAM being able to crank games to 16K+ without crashing, for large screenshots without stitching things together.
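
Rough math on why the 24GB helps at those screenshot resolutions (a sketch assuming a plain RGBA8 color buffer at 4 bytes per pixel; real rendering stacks depth, G-buffers, and post-processing targets on top):

```python
def buffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one RGBA8 color buffer at the given resolution, in MiB."""
    return width * height * bytes_per_pixel / 2**20

for name, (w, h) in {"4K": (3840, 2160), "8K": (7680, 4320), "16K": (15360, 8640)}.items():
    print(f"{name}: {buffer_mib(w, h):,.0f} MiB per color buffer")

# 16K works out to ~506 MiB for a single buffer; add depth, G-buffers and
# post-processing targets and the total balloons, hence the 24GB headroom.
```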

Posted

The 3080 will suit my 1440p needs and then some. Actually, the biggest issue here isn't the card; it's the fact that some gaming engines currently on the market still suck at utilizing high refresh rates. The Unreal Engine comes to mind here.

Posted
9 hours ago, Keyrock said:

At this point I still consider both 8K and ray tracing pipe dreams. My entire focus is 4K rasterization-only performance. Ray tracing is doable, especially at reduced resolution/settings, but it's something I'll probably enable once for a half hour to admire how much shinier the shiny is, then promptly shut it off and never use it again, because I'd rather play games at 4K 100+ FPS at medium to high shiny than 40-50 FPS at maximum shiny. (How's that for a run-on sentence?)

To that end, the 3080 has the performance I'm looking for, but I want to see what AMD is bringing to the table. I don't expect the Radeon 6900XT, if that is indeed the name of the Big Navi flagship, to have as good ray tracing performance as the 3080, but I don't care about that. If it can match the 3080 in rasterization and undercut Nvidia's pricing, that would be the dream (lower power draw would be icing on the cake). I guess I'll find out in about a month.

The 3080 can pull 100 FPS in Metro Exodus with RTX enabled and maxed settings; I find that acceptable, really. Control at 1440p is 55 FPS, acceptable with some adaptive sync, and with DLSS you get 90 average. DLSS really is remarkable in version 2.0; the image quality is quite good.

I'm interested to see what AMD can produce in ray tracing performance. I don't expect anything, but if it can at least match Nvidia's offering I will cancel my order or get rid of the card some way. Even if it doubles rasterization performance and comes with a free handy, I'm not interested one bit.

 


Posted (edited)

I can't see any way AMD will match RTX in raw ray tracing performance; the approaches are too different. But the dedicated hardware that will give Nvidia its performance advantage there also comes with its own costs, and AMD's approach ought to be far more flexible.

11 hours ago, AwesomeOcelot said:

The EVGA 3090 in the UK is 2142 USD, probably more than my entire build with a 3080 will be. No one should buy this card for gaming.

3600 NZD here for a Strix 3090, which is about 2200 USD, and our GST is lower than the UK's VAT too.

It really needed a bit more of a performance lift over the 3080 if they were going to call it a 3090. Expectations for a mainstream (albeit top-end) card's price/performance and segmentation are different from the expectations for a prosumer one like the Titan. Even the extra VRAM isn't that much of a selling point if there really is a forthcoming 20GB 3080.

Edited by Zoraptor
Posted (edited)
42 minutes ago, Zoraptor said:

I can't see any way AMD will match RTX in raw ray tracing performance; the approaches are too different. But the dedicated hardware that will give Nvidia its performance advantage there also comes with its own costs, and AMD's approach ought to be far more flexible.

3600 NZD here for a Strix 3090, which is about 2200 USD, and our GST is lower than the UK's VAT too.

It really needed a bit more of a performance lift over the 3080 if they were going to call it a 3090. Expectations for a mainstream (albeit top-end) card's price/performance and segmentation are different from the expectations for a prosumer one like the Titan. Even the extra VRAM isn't that much of a selling point if there really is a forthcoming 20GB 3080.

I don't think they will either, but I'm not going to entirely rule out the theoretical possibility. I'm very certain that I'm going to keep the 3080.

The cheapest EVGA 3090 I could find at the shop where I ordered the 3080 is 2250 USD, or 1805 USD without VAT. A bit ludicrous.

Edited by Azdeus


Posted

The thing with AMD here is that they have both their GPUs and CPUs inside both major consoles and will make a ton of money off that alone, so there's really no reason to expect anything grandiose from them in the PC GPU market. I'm expecting decently priced mid-to-high-grade cards for the midrange gamer, and that's well within the context of AMD's history.

Posted

I don't know about a ton of money from the consoles, since they make very little profit on a per-unit basis there, though 20 million consoles would add up nicely. They certainly get a lot of extra benefits from being the console supplier, though: a lot of their research and development costs for RDNA are being paid for by MSony, and it's no accident that the TeraScale -> GCN -> RDNA transitions happened around the console generation switches. And they can try to leverage their console standardisation into the Windows sphere for things like generic DXR vs RTX.

Per the last page, I'm expecting the 6900XT to land pretty close to the 3080 in raster performance, as I cannot see any reason for the power/performance balance of the 3080 otherwise. I've seen a lot of people claiming it'll only compete with the 3070, but even a 60-CU last-gen 5800XT card would do that unless the scaling was truly awful (see the sketch below). AMD has certainly done a good job of keeping the lid on leaks, though.
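
As a back-of-the-envelope check on that scaling claim (a sketch; linear CU scaling and the 40-CU 5700 XT baseline are simplifying assumptions, and real scaling is sub-linear):

```python
# Naive linear CU scaling from a 40-CU RDNA1 5700 XT baseline.
# (Assumption: raster performance scales with CU count at fixed clocks,
# which is optimistic; real scaling is worse.)
BASE_CUS = 40  # Radeon RX 5700 XT

for cus in (60, 80):
    print(f"{cus} CUs -> ~{cus / BASE_CUS:.2f}x a 5700 XT")
# 60 CUs already lands around 1.5x, i.e. roughly the 3070's claimed
# 2080 Ti-class raster tier, which is why "only competes with the 3070"
# seems pessimistic for the actual flagship.
```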

(100CU Big Unit Gamma Cassiopeiae 6969XTX with 16 GB HBM2e and 400+W power draw... you heard it here first)

 

Posted (edited)
9 hours ago, Zoraptor said:

(100CU Big Unit Gamma Cassiopeiae 6969XTX with 16 GB HBM2e and 400+W power draw... you heard it here first)

 

For a workstation Titan-class card, I could see that. HBM2 for a gaming card is a mistake IMHO, a mistake AMD has already made previously and one Lisa Su would likely not repeat. HBM is just too expensive for an increase in bandwidth that's not meaningful enough (for gaming) at this point to justify the extra cost. For compute, on the other hand, HBM is great.

Edited by Keyrock


Posted (edited)
23 hours ago, Bokishi said:

Dang, was holding out for the 3090, but the new reviews are a bit of a letdown. DLSS 8K is not 8K resolution; it's most likely 5K with AI sharpening. One benefit I see, though, is the 24GB VRAM being able to crank games to 16K+ without crashing, for large screenshots without stitching things together.

It is equivalent to 5K with AI upscaling. Which is fine: definitely an acceptable use case for an 8K TV, as it's better than 5K, but it's not true 8K, so it shouldn't be marketed as 8K. I actually think that true resolutions aren't that important; dynamic res, checkerboarding, and DLSS aren't going to be noticeably worse in a lot of real-world gaming scenarios. Digital Foundry talked about max res being a "canvas" that games don't have to strictly abide by to get the best overall experience, sacrificing res for post-processing or frame rate when required.
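
For the pixel counts involved (a sketch; the 1/3-per-axis internal resolution for DLSS Ultra Performance is the commonly cited figure for the 8K mode, and the 5K comparison is by perceived detail, not render resolution):

```python
# Pixel counts behind the "DLSS 8K" discussion. Assumption: DLSS
# Ultra Performance renders at 1/3 of the output resolution per axis,
# so an 8K output is shaded internally at 2560x1440.
def megapixels(w: int, h: int) -> float:
    return w * h / 1e6

print(f"8K output:        {megapixels(7680, 4320):.1f} MP")  # ~33.2
print(f"5K native:        {megapixels(5120, 2880):.1f} MP")  # ~14.7
print(f"DLSS UP internal: {megapixels(2560, 1440):.1f} MP")  # ~3.7

# The GPU shades about 1/9 of true 8K's pixels; the "looks like 5K" claim
# is about perceived detail after upscaling, not the render resolution.
```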

I have my doubts that 80-100" screens are going to be an acceptable experience for PC gaming (i.e. desk with mouse and keyboard); 48" is on the edge of acceptable for me. It's definitely not a good general PC experience either: 48" is a bit too big, and 32", or ultrawide, is probably the ideal max. As a console replacement (i.e. couch with gamepad) it should be OK, but there aren't many games I'd choose to play that way personally.

Edited by AwesomeOcelot
Posted

^ I sit 2-2.5 feet from a 43"+ TV (on my desk) while gaming and I still think that's too small. Not for watching movies (I sit further back for movies), but for gaming. If it was 80-100" of course I'd sit farther away, but probably not even 5-6 feet. I like my peripheral vision to sorta be "blocked out" as much as possible by the screen when gaming, if that makes sense.

But I do use controllers a lot more these days if I want to sit further back. Or a cordless KB/mouse on a board across my lap. I don't play too many "twitchy" games that need precision/timing either.

Hubby has an expensive 55" 4K he's using for work (instead of 4 monitors like he was doing). It sits on his desk, and when I sit in front of it for a while, it doesn't seem too big at all. :lol:

Posted

^Titans!  Yeah, 27" 1440p at 144-165Hz is perfect for me. It's more about smoothness than precision for me, but yeah, a lot of hardcore FPS/sports game players seem to dig super high refresh rates. I consider myself somewhere in the middle, but with all this 8K business going around I'm starting to feel tinier by the day. :lol:

Posted (edited)

^ I remember when I thought a 19" 4:3 CRT monitor was huuuuge. 27" is a good desktop size for the majority of people and it's the best all-around task size imo too.

"Smoothness" is something I may notice if I think about/look for it really hard, but not to the point where I obsess over it.  Doesn't bother my eyes/vision at all I guess, as long as it still feels smooth enough and isn't jerky or blurry to a point of constant noticing. etc. Hence why a solid 60fps is still just fine to me. I have enough "issue" with wanting clarity/resolution/size and monitor color accuracy, if I also "had to have" high hz/fps I'd go insane.

Edited by LadyCrimson
