Build Thread 2.0



If the fan noise on a GPU is bothering me too much, I just downclock the GPU so it doesn't get as hot and therefore need to push the fans so much. ...But uh, I don't usually need the GPU to work too hard anyways, and you just spent a few hundred dollars on a GPU, so you probably don't want to do that. :p
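For anyone who'd rather script that than dig through a control panel, here's a minimal sketch of the idea on an Nvidia card, assuming nvidia-smi is on the PATH and the driver and card support power limiting (the 150W figure is purely illustrative, not a recommendation):

```python
import subprocess

# Hedged sketch: cap the board power so the card runs cooler and the
# fans stay slower. Requires admin/root; the wattage is illustrative.
def set_power_limit(watts: int, gpu: int = 0) -> None:
    subprocess.run(["nvidia-smi", "-i", str(gpu), "-pl", str(watts)], check=True)

def current_limits(gpu: int = 0) -> str:
    # Read back the current/default/min/max enforceable power limits.
    out = subprocess.run(
        ["nvidia-smi", "-i", str(gpu),
         "--query-gpu=power.limit,power.default_limit,power.min_limit,power.max_limit",
         "--format=csv"],
        capture_output=True, text=True, check=True)
    return out.stdout

if __name__ == "__main__":
    print(current_limits())
    set_power_limit(150)  # illustrative value; stay within the reported min/max
```

Lowering the power limit makes the card downclock itself under load, which gets you the same cooler-and-quieter effect without touching clocks directly.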



I've grown to really like Nvidia's reference cooler (I think AMD uses more or less the same design for theirs), and I'm going to stick with it rather than getting a card with any sort of aftermarket cooler in the future (unless I put liquid cooling on a GPU myself). While Nvidia's reference cooler doesn't push quite as much air as some of the dual- or triple-fan coolers, it pushes the hot air out of the back of the case instead of blowing it into the case (generally toward the bottom).

 

I've never actually seen the fan on my Titan X go above 75% (I don't think it has even hit 70%), and that's during stress testing (Unigine Heaven) while overclocked at +100 MHz on the core and +500 MHz on the memory. I have no need to overclock the card currently, nor do I generally run an overclock; I just did it during the stress test to see how much overclocking headroom I have for future consideration. I know it's been said that in some cases a 980Ti can potentially be faster than a Titan X because it has greater overclocking thermal headroom, but either I won the silicon lottery, the Titan X has plenty of thermal headroom overall, or my case has really good airflow keeping the ambient temperature around the card low (likely a combination of the three), because I easily have at least 200 MHz of core clock and 600-700 MHz of memory clock I can push the card beyond stock speeds. That's completely irrelevant right now, because the Titan X easily handles anything at 1440p, but it's a nice bit of overclocking headroom to have in my back pocket in case something comes along that needs it.



What I probably should do is find a reference level of noise that I'm comfortable with and then compare it with the figures measured in various reviews. The Windforce cooler, along with the Sapphire Tri-X, gets nothing but praise in reviews for being just about the quietest factory solution available, far quieter than even the best blower. So the "problem" is likely simply that I'm a lot more sensitive to noise than most people are, and always have been: I remember going to some lengths to find the famous Japanese Panaflo case fans more than a decade ago, back when video cards barely had a cooler worth mentioning.

 

AMD's reference blower is probably the worst cooler out there since the notorious nVidia FX5800. It's just that they've stopped shipping cards with it now: the Fury air coolers are exclusively custom vendor ones, so the blower's last hurrah was with the 290 series release back in 2013. For what it's worth, nVidia's blowers are probably as good as blowers can get, but with the 980Ti/Titan X the design is really starting to show its limits: they run noticeably hotter and louder than factory custom coolers. However, blowers will always have that niche of being the better solution for multi-GPU setups, because of the aforementioned exhaust effect. Recirculating air internally from one GPU is fine; the case fans can handle that more efficiently and silently than a blower.

 

Ultimately the problem lies with the ATX specification, which is an anachronism in this day and age but persists due to inertia. CPUs sip less than 100W of power these days, yet we have massive clearance for massive coolers and good airflow around them. Meanwhile video cards suck down two to three times as much power, but are placed in a cramped position with minimal airflow, and with the heatsinks upside-down. Hopeless these days, but understandable in the context that when the ATX design was finalised, cards didn't even have a heatsink at all. I recall the original nVidia TNT was the first card I saw with a heatsink, and the TNT2 the first I saw with a fan. Not sure when 3dfx started putting heatsinks on theirs; pretty sure the original Voodoo was just a bare chip.



Afraid not. :)



If the fan noise on a GPU is bothering me too much, I just downclock the GPU so it doesn't get as hot and therefore need to push the fans so much. ...But uh, I don't usually need the GPU to work too hard anyways, and you just spent a few hundred dollars on a GPU, so you probably don't want to do that. :p

 

That's why the STRIX cards are great: when they're idling, the fans don't spin.

"because they filled mommy with enough mythic power to become a demi-god" - KP


Ultimately the problem lies with the ATX specification, which is an anachronism in this day and age but persists due to inertia. CPUs sip less than 100W of power these days, yet we have massive clearance for massive coolers and good airflow around them. Meanwhile video cards suck down two to three times as much power, but are placed in a cramped position with minimal airflow, and with the heatsinks upside-down. Hopeless these days, but understandable in the context that when the ATX design was finalised, cards didn't even have a heatsink at all. I recall the original nVidia TNT was the first card I saw with a heatsink, and the TNT2 the first I saw with a fan. Not sure when 3dfx started putting heatsinks on theirs; pretty sure the original Voodoo was just a bare chip.

Yeah, if I had an aftermarket GPU cooler it would force me into a less-than-ideal airflow configuration in my case: I would either have to let the hot air do a full lap around the case before exiting through the top and back, or completely reconfigure my setup to intake through the top and back and exhaust through the front and side. I'd rather not do that, since the 240mm radiator for my liquid cooler is in the front (the only place in my case where I could fit and mount it without major surgery on the case itself), and I'd rather have fans pulling cool outside air through the radiator that cools the liquid for my CPU than have warm air from inside the case blowing out through it. All these issues could easily be avoided if the GPU just faced the other direction and blew the hot air toward the top of the case instead of the bottom. Unfortunately, the ATX configuration doesn't allow for that, and nobody builds GPUs to face that direction.

 

With the blower on my GPU I avoid all those issues: nice cool air comes in from the front, cools the radiator, then gets drawn up and toward the back of the case where it exits, while the side vent blows in cool air that gets pulled up around the GPU and then out the top and back of the case. I have a nice jetstream of air moving in from the front and side of my case, merging in the middle, and blowing out the top and back, with no airflows opposing each other.



AMD's reference blower is probably the worst cooler out there since the notorious nVidia FX5800. It's just that they've stopped shipping cards with it now: the Fury air coolers are exclusively custom vendor ones, so the blower's last hurrah was with the 290 series release back in 2013. For what it's worth, nVidia's blowers are probably as good as blowers can get, but with the 980Ti/Titan X the design is really starting to show its limits: they run noticeably hotter and louder than factory custom coolers. However, blowers will always have that niche of being the better solution for multi-GPU setups, because of the aforementioned exhaust effect. Recirculating air internally from one GPU is fine; the case fans can handle that more efficiently and silently than a blower.

 

Ultimately the problem lies with the ATX specification, which is an anachronism in this day and age but persists due to inertia. CPUs sip less than 100W of power these days, yet we have massive clearance for massive coolers and good airflow around them. Meanwhile video cards suck down two to three times as much power, but are placed in a cramped position with minimal airflow, and with the heatsinks upside-down. Hopeless these days, but understandable in the context that when the ATX design was finalised, cards didn't even have a heatsink at all. I recall the original nVidia TNT was the first card I saw with a heatsink, and the TNT2 the first I saw with a fan. Not sure when 3dfx started putting heatsinks on theirs; pretty sure the original Voodoo was just a bare chip.

Agreed on the AMD reference coolers. Notable exceptions: the 295X2 and the Fury X, as well as the lower-tier cards. My trusty old 4770 with its humble reference cooler was really quiet :)

Regarding the ATX specification: I really wonder why no vendor designs cards "upside down", with the hot parts and their coolers on top (the only exceptions I know of being some rare passively cooled ones).

 

Silly question: can 2D applications make use of superfluous video RAM, or is it just there to act as a buffer, ensuring that as much data as possible is pumped through the PCIe x16 bus?

 

 

In other words, does it serve any purpose under low load?

It does not serve any purpose under low load, as long as your monitor's resolution fits in the frame buffer, and that takes up a laughably small amount of memory. However, 2D applications as well as non-graphical applications can make use of the video RAM and the GPU's capabilities: GPU-accelerated filters for movies, Photoshop filters, 2D games, browser rendering, general GUI rendering in modern OSs*, or password-cracking software ;) and other so-called GPGPU (general-purpose GPU) applications. These are often found in high-end supercomputer simulation workloads, but more and more in desktop applications as well.
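To put a number on "laughably small", here's the back-of-the-envelope arithmetic, assuming a standard 32-bit (4 bytes per pixel) frame buffer:

```python
# Frame buffer size at 4 bytes per pixel (32-bit colour).
def framebuffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel / 2**20

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: {framebuffer_mib(w, h):.1f} MiB per buffer")
# 1080p: 7.9 MiB, 1440p: 14.1 MiB, 4K: 31.6 MiB -- even triple-buffered 4K
# uses under 100 MiB of a multi-gigabyte card.
```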

 

*: how to correctly spell the plural of an acronym?



Humanoid, I got a 290X card as well. The noise doesn't bother me too much, but I decided to screw around with it a bit anyway. I added one extra case fan and reduced the voltage on the card (I could actually reduce it quite a bit without it becoming unstable in games), and that brought the temperatures down quite a bit. It meant I could relax the fan curve, and now the noise is much more manageable.

 

So yeah, screwing around with the voltage might be worth a try for bringing the temps down a bit.
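For what it's worth, a fan curve is just a piecewise mapping from temperature to fan duty, so "relaxing" it means shifting the points down or to the right. A minimal sketch of the interpolation involved, using made-up example points rather than any card's real defaults:

```python
# Piecewise-linear fan curve: temperature (deg C) -> fan duty (%).
# Undervolting lowers load temps, letting you move points like these
# down/right for the same cooling at lower noise.
CURVE = [(40, 20), (60, 35), (75, 55), (90, 100)]  # illustrative points only

def fan_duty(temp_c: float, curve=CURVE) -> float:
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # Linear interpolation between the two neighbouring points.
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]

print(fan_duty(70))  # ~48.3% with this example curve
```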



Cool news: I can run GTA V at full graphic settings with my new video card... it just does it 8)

"Abashed the devil stood and felt how awful goodness is and saw Virtue in her shape how lovely: and pined his loss”

John Milton 

"We don't stop playing because we grow old; we grow old because we stop playing.” -  George Bernard Shaw

"What counts in life is not the mere fact that we have lived. It is what difference we have made to the lives of others that will determine the significance of the life we lead" - Nelson Mandela

 

 


I find that people have funny definitions of what "maximum settings" means when it comes to talking about their own rigs. Does "full graphic settings" include at least 8x, possibly 16x antialiasing*, no shortcut settings enabled in AMD's GPU control panel, etc.? :p

 

*...assuming GTA V even comes with antialiasing settings; it wouldn't particularly surprise me if it didn't, given the type of game it is: open world, with many complex, mobile objects whose movements are constantly being calculated.



Unfortunately it's a common marketing trick for the board vendors to add more VRAM to a GPU that can't really take advantage of it, to try to fool people into thinking more is better. The 5800 series came with 1GB of VRAM standard and that was all it really needed; the 2GB version was largely pointless. Likewise, the GTX 960 comes with 2GB as standard, and excepting edge cases, the 2GB version will be as fast as the 4GB version. You're paying more because it's a non-standard version of the card. How much is the 2GB version?

 

The 2GB GTX 960 is about $40 cheaper than the same card with 4GB.


I'm going to go with 'OSs', having for some strange reason spent a few minutes looking it up. 'OSS' is another acronym, so you only have the lower case as a distinction. You could use an apostrophe, 'OS's', but that might confuse some people. Would that make the possessive plural 'OS's''?

 

*resisting the urge to go further down the rabbit hole*



I find that people have funny definitions of what "maximum settings" means when it comes to talking about their own rigs. Does "full graphic settings" include at least 8x, possibly 16x antialiasing*, no shortcut settings enabled in AMD's GPU control panel, etc.? :p

 

*...assuming GTA V even comes with antialiasing settings; it wouldn't particularly surprise me if it didn't, given the type of game it is: open world, with many complex, mobile objects whose movements are constantly being calculated.

 

So it means that in the GTA V context you enable everything in the game's graphics and advanced graphics settings... not the AMD control panel.

"Abashed the devil stood and felt how awful goodness is and saw Virtue in her shape how lovely: and pined his loss”

John Milton 

"We don't stop playing because we grow old; we grow old because we stop playing.” -  George Bernard Shaw

"What counts in life is not the mere fact that we have lived. It is what difference we have made to the lives of others that will determine the significance of the life we lead" - Nelson Mandela

 

 


I'm going to go with 'OSs', having for some strange reason spent a few minutes looking it up. 'OSS' is another acronym, so you only have the lower case as a distinction. You could use an apostrophe, 'OS's', but that might confuse some people. Would that make the possessive plural 'OS's''?

 

*resisting the urge to go further down the rabbit hole*

 

OSes, like kisses.



 

If the fan noise on a GPU is bothering me too much, I just downclock the GPU so it doesn't get as hot and therefore need to push the fans so much. ...But uh, I don't usually need the GPU to work too hard anyways, and you just spent a few hundred dollars on a GPU, so you probably don't want to do that. :p

 

That's why the STRIX cards are great: when they're idling, the fans don't spin.

 

 

Realistically though, any decent video card will be essentially inaudible at idle; it's load noise that's the concern.

 

Humanoid, I got a 290X card as well. The noise doesn't bother me too much, but I decided to screw around with it a bit anyway. I added one extra case fan and reduced the voltage on the card (I could actually reduce it quite a bit without it becoming unstable in games), and that brought the temperatures down quite a bit. It meant I could relax the fan curve, and now the noise is much more manageable.

 

So yeah, screwing around with the voltage might be worth a try for bringing the temps down a bit.

 

Yeah, undervolting is an option. The case is an Antec P182 configured for somewhat lower heat dissipation: one intake, one exhaust, middle drive bay removed, and the top exhaust mount sealed with foam. That said, my Define R5 arrived with my video card; it's what I'll be using for my Skylake build, so we'll see how much of a difference its padded panels make. It's unfortunate that I don't like headphones ("condoms for the ears"), because in theory they'd solve my problems instantly and relatively cheaply. (I've been eyeing a pair of Philips Fidelio X2s, my fourth attempt at finding headphones I like.)

 

I didn't really need the performance except in The Witcher 3, so I have no real need to extract maximum performance out of my system. God, that game's cost me several hundred dollars now: $100 for the game and expansions, another $200 for the CE (Xbone version, because reasons), $400 for a new video card, and $2 for AA batteries for my gamepad.



How about fitting a custom cooler? It's a bit of a pain to have to void the warranty straight out of the box, but I'll choose less noise every time.

 

I wish they would sell them with a custom cooler already fitted, or maybe manufacturers should always be forced to include fan level and noise specs. I was considering one of these a while ago.

http://www.wacom.com/en-us/products/pen-displays/cintiq-companion-2

 

Despite costing as much as a new system, give or take, it turns out they only come with a standard small fast-spinning CPU fan and no mobile GFX unit, and are extremely noisy. For such a slick high-end design, it's for 'graphics professionals', apparently. If Apple were producing something like this, you can bet they would have thought a bit more about the consumer experience before letting it pass quality assurance.

 

When, oh when, is Wacom going to get some serious competition so that ordinary mortals will be able to buy a device like this?



As mentioned, an Arctic Cooling Accelero Xtreme 4 is an option, but at $80 a rather costly one. And the experience of installing the Xtreme 7970 on my 7950 wasn't exactly smooth sailing: the thermal adhesive that came with it for the RAM wasn't very adhesive, and the fans, while very quiet, have proved somewhat unreliable. I'll consider it depending on what kind of improvement in the noise signature I can get from simply moving the card into my new system.

 

I've had aftermarket coolers installed on most of my cards over the last decade: the 7950, the 5850 before it (though only for the latter half of its lifespan), the 8800GT, and the 7900GT (which died). The only exception in that bunch is the X1950XT, which I only had for a few months; it was a terrible buy, and the HIS IceQ cooler it came with started making terrible noises within that time.

 


 

The Skylake release has kind of crept up on me; I didn't realise the -K models were out within a couple of weeks, so I should probably start thinking about specific parts soon. Not sure which way I'm going with the RAM: I imagine most if not all launch boards will be for DDR4, but without meaningful performance improvements, keeping my DDR3 is definitely an attractive option. I'll port three of my SSDs over (the non-system ones) and probably try an M.2 SATA drive for a new system drive. It's tempting to just go ahead and buy a 1TB SSD though; something like a Crucial BX100 is around the $400 mark now, the same price I paid for a 120GB drive in 2010.

 

I already have the Define R4 case, EVGA Supernova G2 PSU and Scythe Mugen 4 cooler waiting, so all I really need to get is the CPU, motherboard, one or two SSDs, potentially some RAM, and maybe a Blu-ray drive just because. I'll try a Logitech MX Anywhere 2 mouse in store when it's released here, and I'm also eyeing a set of Philips Fidelio X2 headphones.

 

Since it's a clean install, I also figure I may as well try Windows 10, as a free upgrade from the two installs I have left on my Windows 7 Family Pack; if I don't like it, it's no big problem to just start over.



Realistically though, any decent video card will be essentially inaudible at idle; it's load noise that's the concern.

 

If you're building a quiet system, that's not really true. Then again, most people would go for a custom water loop in that case. Anyway, the STRIX cards are pretty good even under load.

"because they filled mommy with enough mythic power to become a demi-god" - KP


Cool news: I can run GTA V at full graphic settings with my new video card... it just does it 8)

Oh, I can't max the graphic settings; there are some advanced graphics setting options that push my card over its 4GB of memory.

 

But it still looks good.

"Abashed the devil stood and felt how awful goodness is and saw Virtue in her shape how lovely: and pined his loss”

John Milton 

"We don't stop playing because we grow old; we grow old because we stop playing.” -  George Bernard Shaw

"What counts in life is not the mere fact that we have lived. It is what difference we have made to the lives of others that will determine the significance of the life we lead" - Nelson Mandela

 

 


I ran into an old problem where, in idle states (with AMD PowerPlay the card clocks down to 300MHz on the core and 150MHz on the memory, from 1000/1250MHz respectively), the VRAM clocks down too low and causes corruption (and possible black-screening) in 2D applications like a web browser. Fortunately I'm familiar with the problem and the solution, which is using a third-party program like ASUS GPU Tweak or MSI Afterburner to raise the 2D clocks, but it's frustrating that there's no such built-in functionality in AMD's software.

 

The problem apparently doesn't manifest on all systems, even with the same card: at least part of it is down to arcane interactions with the motherboard rather than a specific fault with the card. So I'm hopeful that it's just a peculiarity of my 5-year-old i5-750 platform and that it'll go away without the workaround on my Skylake build.

 


 

And now for something completely different: I've still got an old Sandy Bridge system gathering dust on my shelf. I'll go try out Windows 10 on it now that it's released and there are no major issues being reported.


