
AwesomeOcelot

Members
  • Posts

    1486
  • Joined

  • Last visited

Reputation

439 Excellent

About AwesomeOcelot

  • Rank
    (10) Necromancer

Badges

  • Pillars of Eternity Backer Badge
  • Pillars of Eternity Kickstarter Badge
  • Deadfire Backer Badge
  • Deadfire Fig Backer

Recent Profile Visitors

1820 profile views
  1. Yep, I'm told the cheaper Oculus is great hardware for the price, but I really don't want to invest in something from Facebook. Of course, Google isn't much better. Valve likes to abandon things as well, but there's enough content right now that my Index should be worth it.
  2. I must have waited over 6 months to buy a GPU when miners were buying them all up. I did not expect to have to wait over a month for a $549 CPU. I am about 400th in the queue with expected fulfilment 22nd December.
  3. If you don't want RT or DLSS, then the 6900XT is your best bet for 4K rasterization, but AMD will have to follow Nvidia eventually. Going forward, native 4K is not as important as RT. Checkerboarding, dynamic resolution, variable rate shading, DLSS, and RT quality settings all sacrifice native resolution where developers think you won't notice during gameplay (a rough sketch of the dynamic-resolution idea is after this list). Since Volta, Nvidia has dedicated a lot of silicon and R&D to RT. There will also come a time when developers will have the rasterized version of the game for low-end hardware, and the ray-traced version that will look stunning. I wouldn't be surprised to see DLSS keep improving.
  4. Yeah, Linus said the Pro Siphon doesn't work better than an NH-D15 on Intel, which is what I have waiting for my 5900x if it arrives. Hopefully the Pro Siphon 2 will have copper and more condenser cores. It's not going to do much better on Ryzen unless you're stressing all cores like in Cinebench; that's what it's designed for, to beat an AIO in that situation. It's a shame it's loud; I hope they do a 140mm fan version.
  5. DLSS 2.0 is a straight upgrade over TAA, but it does have some of the same blurriness. Native has aliasing, and somehow DLSS managed to add more detail than native in some shots, especially foliage and some of the distant objects. It's definitely not straightforward anymore which one is "best", as native doesn't beat DLSS everywhere in image quality. Performance is where the argument is won, because in a lot of situations you aren't running these games at the quality settings you want without DLSS.
  6. I stopped watching Linus and unsubscribed because of those thumbnails and titles. Linus: "AMD did not disappoint me". Also Linus: "I can't recommend them at these prices" vs RTX because of streaming, RT, DLSS, etc.
  7. For me it comes down to what you're buying a high-end GPU for, and that is certainly AAA games from 2020 onwards. If you're in the market for a 3070+/6800+, you're going to be playing games that have RT and DLSS. If you're not playing these games, then your use case becomes more niche, which is fine, and having the 6000 series as an option is great for everyone. There are about 30 games with DLSS/RT now where the 3080 is far ahead, so it's not really a choice. It could easily be 3 times as many in the next 2 years. For most people it can't even be a question: even if they're only going to be playing a quarter of the available RT games, the 3080 is the much higher performance GPU. The 6800 XT isn't even that much faster than the 3080 in rasterization, so the difference won't put any doubt into people's minds, and being able to play pre-2020 games at higher frame rates is not going to rank high.
  8. The 3080 destroys the 6800XT even more than I expected. It's not even a choice for anyone interested in single player AAA games released in the last year and going into the future. There's a use case for multiplayer games, and for people only interested in specific genres, e.g. flight sims. Of course the supply issues are going to be even worse with AMD; Zen 3 and consoles are competing for the same fab capacity, and they will take priority.
  9. I managed to order a 5900x 6 minutes after release, even after a site crash kicked me out of checkout the first time, but I'm just under 800 in the queue out of about 1600 people. Who knows when they'll get to me; it's meant to be not as bad as the 3080 launch, but it's going to take me longer to get one.
  10. I watched a video recently, maybe a week or 2 ago, where someone had to set things up manually to get an AMD/Nvidia GPU working with a Raspberry Pi on Linux.
  11. It's funny how they had that just lying on the table. Is there any reason why they couldn't implement this on Turing?
  12. 2,698 Zen 3 orders over the stock they had on Scan, with almost 60% of them being for the 5900x (quick arithmetic on that after this list).
  13. Technically we're in a period of cheap DRAM prices if you're only looking at capacity, but high performance has always been ridiculous. A 3600 CL16 kit is the sweet spot; go higher and expect to pay double the price for not a lot of performance uplift (the latency arithmetic is spelled out after this list). DDR5 should be coming, and the transition period is always super expensive, but in 4 years prices should have come down again.
  14. Mega expensive to double your capacity just to double the number of sticks. Also, 2-stick kits tend to be more expensive because the chips are denser and the demand is higher due to overclockers favouring them. I would also predict that demand and supply will change in the short term. 4-stick kits, especially on the high end, are rarer. I see now that the "stock" results are the 3200 CL14 x4. I should really stop skipping the intro, but I think their charts would be more readable if they just stuck with the same format on the labels.
  15. The theory now is that it's interleaving causing the increase in performance (a toy illustration is after this list). They didn't show low-latency RAM in the charts, only CL16/18. Will it make low latency a bit pointless? 4-stick kits tend to be cheaper, so this might change the demand. It has probably caught a lot of people out, as the advice used to be the opposite. Also, overclockers prefer 2 sticks because you're more likely to win the silicon lottery. It definitely won't affect every situation, but 5% is ludicrous considering how expensive high-bandwidth or low-latency RAM is.
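
On the dynamic-resolution point in post 3: here is a minimal sketch of how a dynamic resolution controller trades internal render resolution for frame time. The frame-time budget, step size, and scale limits are assumptions for illustration, not values from any particular engine.

    # Minimal sketch of a dynamic resolution controller (illustrative values only).
    TARGET_FRAME_MS = 16.7                 # assumed 60 fps budget
    MIN_SCALE, MAX_SCALE = 0.5, 1.0        # fraction of native resolution per axis

    def adjust_render_scale(current_scale, last_frame_ms):
        """Drop the internal render resolution when the GPU misses the frame
        budget, and recover toward native when there is headroom."""
        if last_frame_ms > TARGET_FRAME_MS * 1.05:    # over budget
            current_scale -= 0.05
        elif last_frame_ms < TARGET_FRAME_MS * 0.90:  # comfortable headroom
            current_scale += 0.05
        return min(MAX_SCALE, max(MIN_SCALE, current_scale))

    # Example: a 22 ms frame at 4K drops the internal resolution one step.
    scale = adjust_render_scale(1.0, 22.0)
    print(round(3840 * scale), "x", round(2160 * scale))  # 3648 x 2052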
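
On the Scan numbers in post 12: both figures come straight from the posts above; the arithmetic just makes the 5900x share explicit, and it lines up with the "about 1600 people" queue mentioned in post 9.

    zen3_backorders = 2698      # orders over stock at Scan (post 12)
    share_5900x = 0.60          # "almost 60%" of them for the 5900x
    print(round(zen3_backorders * share_5900x))   # ~1619, close to "about 1600 people"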
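
On the 3600 CL16 "sweet spot" in post 13: a quick way to compare kits is first-word latency, i.e. CAS latency divided by the memory clock (which is half the transfer rate for DDR). The kits listed here are just common retail examples for comparison, not figures from the post.

    def first_word_latency_ns(transfer_rate_mts, cas_latency):
        memory_clock_mhz = transfer_rate_mts / 2        # DDR: two transfers per clock
        return cas_latency / memory_clock_mhz * 1000    # cycles -> nanoseconds

    for rate, cl in [(3200, 14), (3600, 16), (3600, 14), (4000, 18)]:
        print(f"DDR4-{rate} CL{cl}: {first_word_latency_ns(rate, cl):.2f} ns")
    # DDR4-3200 CL14: 8.75 ns
    # DDR4-3600 CL16: 8.89 ns
    # DDR4-3600 CL14: 7.78 ns
    # DDR4-4000 CL18: 9.00 ns

All of these land within about a nanosecond of each other, which is the point of the post: paying double for faster or tighter kits buys very little actual latency.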
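
On the interleaving theory in post 15: this is a toy illustration of why spreading consecutive cache lines across more ranks can help, not a model of an actual memory controller.

    CACHE_LINE = 64   # bytes

    def rank_for_address(addr, num_ranks):
        # Toy mapping: consecutive cache lines rotate across the available ranks.
        return (addr // CACHE_LINE) % num_ranks

    addresses = [i * CACHE_LINE for i in range(8)]   # 8 consecutive cache lines
    print("2 ranks:", [rank_for_address(a, 2) for a in addresses])
    print("4 ranks:", [rank_for_address(a, 4) for a in addresses])
    # 2 ranks: [0, 1, 0, 1, 0, 1, 0, 1]
    # 4 ranks: [0, 1, 2, 3, 0, 1, 2, 3]

With more ranks in rotation, back-to-back accesses are less likely to queue on the same rank, which is one plausible reason the 4-stick configurations benchmark a few percent faster.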