AwesomeOcelot

Members
  • Posts: 1486
  • Joined
  • Last visited

Everything posted by AwesomeOcelot

  1. Yep I'm told the cheaper Oculus is great hardware for the price, but I really don't want to invest in something from Facebook. Of course Google isn't much better. Valve likes to abandon things as well, but there's enough content right now that my Index should be worth it.
  2. I must have waited over 6 months to buy a GPU when miners were buying them all up. I did not expect to have to wait over a month for a $549 CPU. I am about 400th in the queue with expected fulfilment 22nd December.
  3. If you don't want RT or DLSS, then the 6900XT is your best bet for 4K rasterization, but AMD will have to follow Nvidia eventually. Going forward, native 4K is not as important as RT. Checkerboarding, dynamic resolution, variable rate shading, DLSS, and RT quality settings all sacrifice native resolution where developers think you won't notice in gameplay. Since Volta, Nvidia has dedicated a lot of silicon and R&D to RT. There will also come a time when developers will ship a rasterized version of the game for low-end hardware and a ray-traced version that will look stunning. I wouldn't be surprised to see DLSS keep improving.
  4. Yeah, Linus said the Pro Siphon doesn't work better than an NH-D15 on an Intel chip. I have an NH-D15 waiting for my 5900x if it arrives. Hopefully the Pro Siphon 2 will have copper and more condenser cores. It's not going to do much better on Ryzen unless you're stressing all cores like in Cinebench; that's what it's designed for, to beat an AIO in that situation. It's a shame it's loud; I hope they do a 140mm fan version.
  5. DLSS 2.0 is a straight upgrade over TAA, but it does have some of the same blurriness. Native has aliasing, and somehow DLSS did manage to add more detail than native in some shots, especially foliage and some of the distant objects. It's definitely not straightforward anymore which one is "best", as native doesn't beat DLSS everywhere in image quality. Performance is where the argument is won, because in a lot of situations you aren't running these games at the quality settings you want without DLSS.
  6. I stopped watching Linus and unsubscribed because of those thumbnails and titles. Linus: "AMD did not disappoint me". Also Linus: "I can't recommend them at these prices" vs RTX because of streaming, RT, DLSS, etc.
  7. For me it comes down to what you're buying a high-end GPU for, and that certainly is AAA games from 2020 onwards. If you're in the market for a 3070+/6800+, you're going to be playing games that have RT and DLSS. If you're not playing these games, then your use case becomes more niche, which is fine, and having the 6000 series as an option is great for everyone. There are about 30 games with DLSS/RT now where the 3080 is far ahead, not really a choice. It could easily be 3 times as many in the next 2 years. For most people it can't even be a question: even if they're only going to be playing a quarter of the available RT games, the 3080 is the much higher performance GPU. The 6800 XT isn't even that much faster than the 3080 in rasterization; the difference won't put any doubt into people's minds, and being able to play pre-2020 games at higher frame rates is not going to rank high.
  8. 3080 destroys a 6800XT more than I even expected. It's not even a choice for anyone interested in single player AAA games released in the last year, and going into the future. There's a use case for multiplayer games, and people only interested in specific genres e.g. flight sims. Of course the supply issues are going to be even worse with AMD, Zen 3 and consoles are competing for the same fab capacity, and they will take priority.
  9. I managed to order a 5900x 6 minutes after release, even after a site crash kicked me out of checkout the first time, but I'm just under 800 in the queue out of about 1600 people. Who knows when they'll get to me; it's meant to be not as bad as the 3080 launch, but it's going to take me longer to get one.
  10. I watched a video recently, maybe a week or 2 ago, where someone had to configure things manually to get an AMD/Nvidia GPU working with a Raspberry Pi on Linux.
  11. It's funny how they had that just lying on the table. Is there any reason why they couldn't implement this on Turing?
  12. 2,698 Zen 3 orders over the stock they had on Scan, almost 60% of them for the 5900x.
  13. Technically we're in a period of cheap DRAM prices if you're only looking for capacity, but high performance has always been ridiculous. A 3600 CL16 kit is the sweet spot; if you go higher, expect to pay double the price for not a lot of performance uplift (there's a quick latency comparison sketch after this list). DDR5 should be coming, and the transition period is always super expensive, but in 4 years prices should have come down again.
  14. Mega expensive to double capacity just to double the number of sticks. Also, 2-stick kits tend to be more expensive because the chips are denser and the demand is higher due to overclockers favouring them. I would also predict that demand and supply will change in the short term. 4-stick kits, especially on the high end, are rarer. I see now, the "stock" results are the 3200 CL14 x4. I should really stop skipping the intro, but I think their charts would be more readable if they just stuck with the same format on the labels.
  15. The theory now is that it's interleaving causing the increase in performance (see the toy interleaving sketch after this list). They didn't show low-latency RAM in the charts, only CL16/18. Will it make low latency a bit pointless? 4-stick kits tend to be cheaper, so this might change the demand. It's probably caught a lot of people out, as the advice was the opposite. Also, overclockers prefer 2 sticks because you're more likely to win the silicon lottery. It definitely won't affect every situation, but 5% is ludicrous considering how expensive high-bandwidth or low-latency RAM is.
  16. I don't understand how there's so little YouTube content on Zen 3. I think this year might have broken YouTubers, what a schedule. GN has just released a video on a performance increase from simply doubling the number of RAM sticks. I remember when the advice was 2 sticks for dual channel, 4 sticks for quad channel. Not anymore, apparently.
  17. I haven't seen a benchmark where the 5800x performs better because it's a single CCD, at least not one that overcomes the L3 advantage of the R9s. Even its advantage in base and boost frequency against the R5 5600x doesn't justify the cost. GN said it was priced to upsell, as it's not as good price/performance as the R5 5600x in games, or the R9 5900x in productivity.
  18. Scan has sent an update that the ETA on my order is the 20th, suggesting that stock ran out within 1 minute. I expected this tbh, especially with the higher-end chips. Hopefully I'm early enough on the pre-order list to get it before December. Probably through the FX brand in 2007, but AMD's product line-up and leads were so complicated during that time. Single-core Athlon 64s were beating the early X2s in gaming, then the FX line was beating the X2 later on. How many sockets, architectures, and brands did AMD need from 2003-2008? I'm so glad they've simplified their line-up with the Ryzen era, and they even decided not to mess up the numbering again by skipping to 5000 numbering for Zen 3.
  19. That's my point. Core for core, I don't think Rocket Lake outright beats Zen 3. It's not a given that the fastest Rocket Lake with 8 cores beats the 5800x. Even if it does, does it beat the 5800x with SAM? Less likely. Does it beat the 5800x on price with a $50 price drop? No. Does it beat the 5800xt? I doubt it. The problem here is the 4-month gap. The reason Intel doesn't get hurt by the socket changes is that drop-in CPU upgrades are not a thing. People don't upgrade their CPU every 1 or 2 years. They stretch it out to 4-6 years, and it doesn't hurt anyone, especially in gaming. There's enough reason for an entire platform change in 4 years. People don't actually use the features on the top-end motherboards, and there are much better options in the $150-300 range, so it's a non-issue. Even on AMD, you do not see 300-series chipset Zen 2 builds, because the 400/500 chipsets are much better. The only people who care are people who pay $500-800 on a MoBo. Those people should either be rich enough that it doesn't matter, or should have changed their behaviour as the market changed and dropped down a tier in pricing.
  20. Does it matter whether Rocket Lake can catch up with Zen 3? They're coming in March. The 8-core/6-core probably won't outright beat the 5800x/5600x in gaming. A Radeon card will allow SAM, which will probably tip the scales in favour of Zen 3 anyway.
  21. I got one and it was fairly easy, although the website did make me go through checkout twice.
  22. *Stroking 3080* What are you talking about?
  23. I might buy any Zen 3 I can actually get, just to check it out. I have a feeling demand is going to be high.
  24. Not sure how accurate this is, but it's claimed the Chinese market doesn't buy iGPU-only laptops. Intel surely has some sort of plan and insight. The MX450 exists, and this segment has existed for a long time, so someone has to be buying these. This GPU can play Xbox One/PS4 generation games at above Xbox One/PS4 frame rates. The Nvidia Shield and Nintendo Switch exist; the Switch has portability and first-party titles as well, but still, people do play 3rd-party ports on it.
  25. Intel's benchmarks put the Xe MAX closer to the MX450, with better Time Spy but worse Fire Strike scores; most games you'd want to play on this are DX11, so the MX450 has a good lead in most of them. Double the performance of NVENC on a 2080 Super. Apparently it's not for the western market, mostly for China. Dedicated LPDDR4X VRAM, 128-bit, 68GB/s; that and cooling/power consumption are the reasons why this isn't just integrated graphics, as it's being paired with an iGPU of the same architecture. I don't follow the budget market closely, only when I'm buying something, and I was surprised to find AMD pretty much absent from this section of the market. You'd think consumers would pick the MX450 all day long unless they have no intention of gaming whatsoever, but then the use case for having a dedicated GPU is very slim. It's going to be a lot faster at video encoding than the competition, but then who is going to be editing videos on a budget laptop? My only thought is that Intel is going to be aggressive with pricing and trust its superior brand recognition and OEM partnerships to bully Nvidia out of this market.
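
Since post 13 leans on the "3600 CL16 is the sweet spot" claim, here's a minimal sketch of how kits usually get compared: first-word latency in nanoseconds is CAS latency divided by the data rate in MT/s, times 2000, and bandwidth scales with the data rate. The kit list and prices below are hypothetical placeholders, not real quotes.

```python
# Rough DDR4 kit comparison: first-word latency (ns) and per-channel bandwidth.
# latency_ns = CAS latency / data rate (MT/s) * 2000
# Kits and prices are hypothetical placeholders for illustration only.

kits = [
    # (name, data rate in MT/s, CAS latency, assumed price in GBP)
    ("DDR4-3200 CL16", 3200, 16, 70),
    ("DDR4-3600 CL16", 3600, 16, 85),
    ("DDR4-3600 CL14", 3600, 14, 150),
    ("DDR4-4000 CL16", 4000, 16, 170),
]

for name, rate, cl, price in kits:
    latency_ns = cl / rate * 2000        # time until the first word of a burst
    bandwidth_gbs = rate * 8 / 1000      # 64-bit channel: 8 bytes per transfer
    print(f"{name}: {latency_ns:.2f} ns, {bandwidth_gbs:.1f} GB/s per channel, ~£{price}")
```

On those (made-up) prices, 3600 CL16 lands within about 1 ns of the expensive bins for roughly half the money, which is the price/performance argument the post is making.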
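
For post 15, a toy model of why spreading accesses over more ranks can help: while one rank is opening a row, another can be bursting data onto the shared bus. The timings, the round-robin mapping, and the one-request-per-rank behaviour are all simplifying assumptions; a real DDR4 controller reorders requests and exploits row hits, which is why the measured gain is closer to the ~5% mentioned in the post than what this toy prints.

```python
# Toy model of rank interleaving. Each access waits for its rank to be free,
# spends ACTIVATE units opening a row, then TRANSFER units on the shared data bus.
# With more ranks, activations overlap with other ranks' transfers.
# All timings are arbitrary illustrative units, not real DDR4 parameters.

ACTIVATE = 4   # row activation cost per access
TRANSFER = 1   # data burst cost on the shared bus

def total_time(num_ranks: int, num_accesses: int) -> int:
    rank_free = [0] * num_ranks   # when each rank can start a new activation
    bus_free = 0                  # when the shared data bus is next available
    for i in range(num_accesses):
        rank = i % num_ranks                                   # round-robin mapping
        activate_start = rank_free[rank]                       # wait for the rank
        data_start = max(activate_start + ACTIVATE, bus_free)  # then wait for the bus
        bus_free = data_start + TRANSFER
        rank_free[rank] = bus_free                             # rank busy until burst ends
    return bus_free

for ranks in (1, 2, 4):
    print(f"{ranks} rank(s): {total_time(ranks, 64)} units for 64 accesses")
```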