
AwesomeOcelot


Everything posted by AwesomeOcelot

  1. It started before it was even out. Never mind that AAA games use 4-6GB now. People were panicking over not being able to run their gigantic veiny Skyrim mods. They need the 8K textures to get right up close to appreciate the detail.
  2. Since we're in a new generation of consoles with double the RAM, and potentially 12GB of VRAM, we will start seeing far more games with higher-resolution textures. DirectStorage/RTX IO might also change the way games are developed: high-res textures streaming in and out, not so much pop-in. I think the majority of developers didn't push VRAM usage higher because 10GB-plus graphics cards are probably around 2% of the market. If you take out LAN centres and laptops, it might be 5%. Almost all AAA developers had 2 other platforms to think about with 8GB of shared memory, which usually meant 6GB max for VRAM. TechRadar noted that they couldn't get a 2080 Ti to maintain 60fps at 4K max settings, but the 3080 can. I don't tend to play games under 60fps.
  3. That looks like how much VRAM the game allocates, not how much it uses. So 4.5GB is how much it's actually using, and 9.2GB is the allocation size. The game only allocates 7.1GB if you're on a 2080, which only has 8GB. Death Stranding is not the only game that allocates less VRAM than what the GPU has available. TechPowerUp has its VRAM usage at 4.9GB on a 2080 Ti.
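The allocation-vs-usage distinction above can be sketched as a toy model. This is purely illustrative (`ToyVRAMPool` is a made-up class, not real driver behavior); the 9.2GB/4.5GB figures are the ones from the post.

```python
# Toy sketch (hypothetical, NOT real driver behavior): overlays often
# report what a game has *reserved* from the driver, not what its
# resident assets actually occupy.
class ToyVRAMPool:
    def __init__(self, capacity_gb: float):
        self.capacity = capacity_gb
        self.allocated = 0.0  # reserved from the "driver"
        self.used = 0.0       # actually occupied by live assets

    def reserve(self, gb: float) -> None:
        # Games often reserve headroom, clamped to what the card has.
        self.allocated = min(self.capacity, self.allocated + gb)

    def load_assets(self, gb: float) -> None:
        self.used = min(self.allocated, self.used + gb)

# Figures from the post: 9.2GB allocated, 4.5GB in actual use.
card_11gb = ToyVRAMPool(11.0)   # e.g. a 2080 Ti
card_11gb.reserve(9.2)
card_11gb.load_assets(4.5)
print(card_11gb.allocated, card_11gb.used)  # 9.2 4.5

card_8gb = ToyVRAMPool(8.0)     # e.g. a 2080: the same request clamps
card_8gb.reserve(9.2)
print(card_8gb.allocated)       # 8.0
```

The point is only that the "allocated" number tracks the card's capacity, not the game's real working set, which is why the same game reports different numbers on different GPUs.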
  4. FFXV definitely doesn't need over 8GB to look the way it does; it's partly down to how the engine works, and also that it wasn't developed around such high-resolution textures. Some flight simulators also like to load a ton of textures into VRAM. Death Stranding is just a much better optimized game; it doesn't need more than 6GB. With any type of RAM, it's better to use all of it, but it's often a sign of poor development practice if there's little visible benefit. I wouldn't be surprised if FFXV is patched to compensate for the 3080's 10GB, as it is an Nvidia-sponsored title, with no visual difference for the player. For a long time there have been mods, especially for Bethesda engine games, that use an incredible amount of VRAM. They don't look good compared to modern AAA games, but due to the way the game is optimized, the high-resolution textures chew through VRAM. Modders don't have the development skills, or the access to the engine, to optimize their assets. Nvidia actually analyzed a lot of AAA games, including Metro Exodus, and found they only use 4-6GB of VRAM. Hopefully DirectStorage/RTX IO will also allow games to swap textures much faster, allowing games to use much higher-resolution textures with less VRAM. I hope they also develop more procedural and compression tech, because for me 65GB texture packs often aren't worth it for the marginal upgrade. The benefits of playing games on SSD outweigh higher-resolution textures. I have about 7TB of SSD storage, but it's not all dedicated to gaming, and I'm probably above average.
  5. I don't think Metro Exodus uses over 6GB of VRAM on 1440 ultrawide; at least it doesn't at 4K extreme in benchmarks. It might allocate over 6GB of VRAM, but that's usual, because a lot of games allocate 1 or 2GB above what they use. Watch Dogs Legion comes with the 3080, so when I finally get a Zen 3 and complete my build I'll find out whether it uses over 10GB, but I sincerely doubt it. Technically Doom Eternal is one of the few games that uses over 8GB, but people on 8GB GPUs wouldn't be able to tell you that, because it doesn't affect gameplay or fidelity. There will come a time when the top graphical settings in AAA games will need more than 10GB or 11GB, but that time is probably '22/'23. The reason for 10GB on the 3080, I'm assuming, is because the memory is expensive.
  6. I never believed there would be a 20GB version of the 3080. So far you rarely need over 6GB for 4K, and in the few games that go over 8GB the excess is so minimal, with no real graphical benefit, that devs could easily get it under 10GB. A 16GB GDDR6 3070 was possible; they've done 1/2GB, 2/4GB, and 3/6GB versions of cards before, but 10/11GB makes more sense in terms of use case.
  7. My LG CX is great as a monitor, and it might end up a lot cheaper than HDMI 2.1 monitors when those are released. How much 4K 120fps gaming is going to happen on consoles? Not much, I bet, and mostly in games where lower res will be fine. Last year I bought a Sony XF900 as a TV and I'm happy with it as a TV.
  8. Nothing says "we believe in this project" more than getting rid of all your writing staff. This looks like producers not liking the project, and the games being reworked to meet expectations. Publisher interference has always worked out so well for games; at least they didn't just cancel the project entirely. Definitely waiting for extensive reviews before buying.
  9. G502 Hero arrived yesterday. I like the shape, and they go on offer often enough that they're pretty cheap. My mouse history: MX510 (2004), G5 (2009), G500 (2011), G502 (2014), G502 (2018), G502 Hero (2020).
  10. A 4000MHz kit can be over twice as expensive as a 3200MHz kit; the performance gain is definitely not worth it. For my kit, the price difference is way over the difference between a 5800X and a 5600X, or a 5900X and a 5950X. When you're buying $700 MoBos and $300 kits of RAM, it's not about performance, it's more a sport or a hobby, like people who tune their cars to go on track days.
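The "not worth it" claim is just cost-per-frame arithmetic. The prices and fps figures below are purely hypothetical placeholders (not benchmarks), chosen only to show the shape of the math:

```python
# Hypothetical, illustrative numbers (NOT benchmarks): the rough
# cost-per-frame math behind "the performance gain is not worth it".
price_3200, price_4000 = 150.0, 320.0   # assumed kit prices in $
fps_3200, fps_4000 = 140.0, 145.0       # assumed ~3.6% gaming uplift

premium = price_4000 / price_3200                 # ~2.13x the cost
uplift_pct = (fps_4000 / fps_3200 - 1) * 100      # ~3.6% more fps
dollars_per_extra_fps = (price_4000 - price_3200) / (fps_4000 - fps_3200)

print(round(premium, 2), round(uplift_pct, 1), dollars_per_extra_fps)
```

Paying roughly double for a few percent more frames is the "hobby, not value" territory the post describes.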
  11. I have the 2014 version of Ripjaws; they look hilarious. I've never been bothered about what the inside of my case looks like. I've only ever had windows on my case because they didn't do a windowless version. For the last 10 years my PC has lived under my desk. Just cut a lot of plastic off the top of my SL600M to fit 200mm Noctua Chromax fans. Thinking about buying some polygon triangle vinyl and doing a Deus Ex Human Revolution pattern on the window.
  12. I've had good experiences with Noctua and with high-end audio equipment companies that appreciate that people have paid a premium and want to enjoy their products. I only went with EVGA because Corsair and Fractal were out of stock, and Seasonic was really expensive. My last 2 PSUs have been Corsair and they've been faultless; they're selling the cable for half the price, which is still pretty expensive, but not insanity.
  13. Just had a look for a 12-pin Nvidia cable for my EVGA PSU. EVGA PerFE 12 Cable: US only, out of stock, $40. So Nvidia's free adapter it is. I may get a longer adapter if I can find one, but there don't seem to be any available right now. It's things like this that will make me never buy EVGA again. A lot of companies will give you free cable replacements with proof of purchase on their top-end PSUs.
  14. In 6 years' time, 12 cores will be to 8 cores as 6 cores is to 4 cores today in gaming. In 6 years' time, a 5900X is not going to beast a 5800X with a new GPU at 8K, just like an 8700K doesn't beast a 7700K today with a 3080 at 4K. That would take something fundamentally shifting in game development, and my expectation for the mid-range and laptop CPU markets suggests that won't happen. The 5600X is a top-range chip at $300, and that's 6 cores going into 2021; a lot more people are going to own that CPU than the 5900X.
  15. I can't see games requiring more than 8 cores/16 threads in 6 years. The 5800X is a beast per core as well. It's also quite a bit more expensive than what I paid for my 5820K, and more than double what I paid for my 2500K. AMD didn't push the core count in 2020. There are also laptops to consider, which are technically the larger segment. There's also a fundamental limit to the benefit of more threads in workloads like games. Single-threaded workloads will always exist. Yes, games can and should be more multi-threaded, but there are always going to be diminishing returns.
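The diminishing-returns point can be made concrete with Amdahl's law. The 70% parallel fraction below is an assumption for illustration, not a measured figure for any game:

```python
# Amdahl's law: why extra cores hit diminishing returns once part of
# a game's per-frame work is inherently serial.
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical speedup when only `parallel_fraction` of the
    work can be spread across `cores`."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Assume (purely for illustration) 70% of frame time parallelizes:
for n in (4, 6, 8, 12, 16):
    print(f"{n} cores -> {amdahl_speedup(0.7, n):.2f}x")
# 4 -> 2.11x, 8 -> 2.58x, 16 -> 2.91x: each extra core buys less.
```

Going from 8 to 12 cores adds about 0.2x under this assumption, which is why a 5900X over a 5800X is hard to justify for gaming alone.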
  16. I got 6 years out of my 2500K and my 5820K; one was cheap, one had poor per-core performance. The CPU was never the bottleneck for my use case. When I was gaming at 1080p my FPS target was 144fps max, which I easily achieved. When I was gaming at 1440p my target was 120fps and the GPU was the bottleneck. Upgrading GPUs more frequently makes more sense; I went through an HD6870, GTX970, GTX1080, and RTX3080. I can't see a common gaming use case where someone with a 5800X is going to need an upgrade before 6 years.
  17. If someone's getting an Aorus Master, 5950X, and ProSiphon Elite, I hope they're doing some hardcore overclocking. That's some exotic, expensive hardware; I'd pay to see it running, like an animal in a zoo. 750W should be enough for a regular system with a 3080. I'm always concerned about efficiency and aging though, so I always go at least 200W above what I need. I actually only went with 1000W because of the PSU supply shortage; there were a few 850W/766W PSUs I wanted to get instead. What is going on with that? I would have thought it would have been more than sorted by now.
  18. I was under the impression none of the Zen 3 CPUs had CCX equivalents, since they have a unified L3. The 5900X will have 2 CCDs, which would technically be worse if it were Zen 2, but the new architecture makes it difficult to assume. In a game that's multi-threaded enough to take advantage of 8 cores, but not enough to take advantage of 12, the 5800X would beat the 5900X. There are games like this, that do poorly with 4 threads, great with 6, and make very little use of 8+; I think MS Flight Sim 2020 and GTA V have this type of behaviour, but it's pretty rare. There's also the fact that cores can be split into 2 threads, and the question of whether any game, currently or in the immediate future, actually prefers 8 cores over 6. I don't think there is a game where that would make a large difference. Also, the elephant in the room: at 4K, a game that is both CPU bottlenecked and benefits from 8, but not 6 or 12, cores is a bit of a unicorn. We'll be lucky if we get a game that sees any appreciable difference between any of the announced Zen 3 processors in the real world. How many frames are between the top Zen 2 and a 3600 at 1440p in most benchmarks?
  19. I pre-ordered the IceGiant ProSiphon Elite on 21st March, and it wasn't going to arrive until January. I had 3 emails about delays, with 3 shipping estimates: September, November, and December. ASUS are releasing a Crosshair VIII Dark Hero for the Zen 3 launch. Those boards are so expensive, though. The Aorus Master is faultless, but also overkill. The 3080 FE can hit 368W without mods. I would look for a 1200W PSU if you're going to push it, although in real-world loads it shouldn't top 900W; it should be around ~800W max.
  20. I didn't bother to overclock; my 5820K wasn't a good overclocker. I just looked the problem up, and apparently a Rampage V Extreme died and degraded Buildzoid's 5930K. Bleeding edge is fun, but being on a more mature platform has its advantages. I was on a 2500K with an ASUS P8Z68-V PRO/GEN3; it was an amazing overclocker, the platform was very solid, and the drivers were great. A Windows security update broke AI Suite one time: error message, wouldn't run. It took ages for ASUS to post an update, and when they did it must have been from another branch for other MoBos, because half the stuff doesn't work in it and it looks completely different. One of my ASUS routers bricked just after a year. My first ASUS external soundcard wouldn't start from cold, so if the power went out, you'd have to use a hair dryer or plug the USB data port into 5V power to increase the resistance for it to turn on. The second one fixed that bug, but the drivers sometimes refuse to switch input or randomly switch to an unconnected input. I had a long string of perfect products for a few years, but any loyalty I had for ASUS is long gone.
  21. I used PCPartPicker and it didn't have my TV, so I forgot to add it. Adding it now: Monitor: LG 48" CX OLED. There wasn't a lot of selection for X570 at reasonable prices with the features I wanted. I was looking at the ASUS TUF Gaming Plus, but it won't flash the BIOS to support Zen 3 without a CPU installed. I like the feature set and software of Gigabyte, my GTX 1080 is Gigabyte, and this board got the OK from some reviewers. When I was last on an AMD platform, in 2004, it was with a Gigabyte board with an Nvidia chipset. My last 3 MoBos have been ASUS, as have 2 of my last 3 graphics cards, 2 routers, and an external soundcard. I'm on an ASUS RAMPAGE V EXTREME now. The BIOS support pissed me off, especially when Intel pushed the security patches: "**** 3-gen-old boards, even if they were really expensive". Although, last time I was on Gigabyte, they didn't fix a major bug in their BIOS that Nvidia had pushed a fix to them for, because "**** 3 year old boards". I've bought ASRock and MSI too, and not had an issue with them. ASUS stuff is nice, it tends to look nice and have pretty software, but their peripherals have had some fatal flaws, and you pay the ASUS tax, +5-10% compared to everyone else.
  22. I have ordered my RAM and MoBo; the only thing left to do is order the CPU.
      CPU: AMD Ryzen 9 5900X
      CPU Cooler: Noctua NH-D15 82.5 CFM CPU Cooler
      Motherboard: Gigabyte X570 AORUS PRO WIFI ATX AM4 Motherboard
      Memory: Corsair Vengeance LPX Black 64GB 3600MHz DDR4 CAS 18-22-22-42
      Storage: Samsung 980 Pro 250 GB M.2-2280 NVMe Solid State Drive
      Storage: Gigabyte AORUS NVMe Gen4 1 TB M.2-2280 NVMe
      Storage: Sabrent Rocket Q 2 TB M.2-2280 NVMe
      Video Card: NVIDIA GeForce RTX 3080 10 GB Founders Edition
      Case: Cooler Master MasterCase SL600M
      Power Supply: EVGA SuperNOVA G1+ 1000 W
      Fans: Noctua NF-A20 PWM Chromax.Black.Swap x 2
  23. Pascal and RDNA 1 had raytracing. The only way that RDNA2 gets Turing performance without RT cores is if it's twice as fast, and even then people said RT performance on Turing without DLSS was not good enough. GPUs will get twice as fast as Turing eventually. It's not scepticism at this point; we've had the hardware for a while, and tensor cores aren't even an Nvidia thing. For RT you'd be banking on an AMD software breakthrough that would be leaps beyond what Nvidia did with Turing. Nvidia's RT isn't just hardware, it's an incredible work of software that runs better on specialized hardware, and it took a very long time to develop. We'll soon see, but looking at the consoles, it's not looking good for AMD; it doesn't look equivalent to Turing.
  24. I think it can get close to the 3080 in rasterization, but how important is that? I'm sceptical about the features: they did manage to catch up with VRR in 6 months, but I don't think they've invested the same time and money as Nvidia has in AI and RT. It is not going to have RT or DLSS equivalents. Every Radeon I've been interested in for about 10 years has had availability issues, and that's how long it's been since I've bought an AMD card. If it's good, good luck trying to get one before March.
  25. Buildzoid talking about prices and overclocking on the 5000 series.