
Posted
6 hours ago, Wormerine said:

Is this single monstrosity really all I need to connect it?

Go for two separate cables instead; it'll be more stable that way. Putting multiple video card connectors on one cable is really a compromise for people who run multiple video cards, which isn't something that happens much these days.


L I E S T R O N G
L I V E W R O N G

Posted (edited)
22 hours ago, Wormerine said:

My graphics card has 16 pins in total (8 + 8). Is this single monstrosity really all I need to connect it? It feels kind of odd to plug in all four heads while a single 8-pin header goes into the PSU.

My 2080ti has a single "cord" that, on the GPU end, separates into two pin plugs that go into the card.  Both are plugged into the card, with the other end in the PSU.  It works just fine/it's intended?
gpu-plugs.jpg

The old 980ti had the same thing, except the secondary plug was smaller/had fewer pins at the time.  So I'm not sure what is meant by Humanoid's "go for two cables".  Going by your picture I'd expect that both plugs would need to be plugged into the GPU (eg, there isn't one left dangling unconnected).  But again, I may be misunderstanding/confused re: context here and, per usual if that's the case, just ignore me....

Edited by LadyCrimson
“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts
Posted (edited)
15 minutes ago, LadyCrimson said:

So I'm not sure what is meant by Humanoid's "go for two cables".

Well, apparently you can use two cables (connect each 8-pin GPU plug to its own PCIe power slot on the PSU) to provide it with enough power, especially with overclocking. That's what I did today following @Humanoid's advice.

xd7Ec6J.jpg

(It's super ugly with those spare headers dangling, but I will take care of it tomorrow once I plug in the remaining drives and tidy up some cables. A nice custom 16-pin cable does look very enticing right now.)

That said:

pDODSSx.jpg

Behold! It works. 

The graphic card though... really?

 

Edited by Wormerine
Posted
6 minutes ago, Wormerine said:

Well, apparently you can use two cables (connect each 8-pin GPU plug to its own PCIe power slot on the PSU) to provide it with enough power, especially with overclocking.

Ah, ok.  I never bother with stuff like that and just use what the package gives me.  It always works fine, so I don't think about it beyond that (and I don't even know how to manually OC/have never tried).

That animated thing - that's on the card? How ridiculous.  If the software has any way to turn that off (like you can turn off the RGB), I'd do it.  But I suppose some may like the bling/find it funny etc.  Whatever.  :lol:

Posted
58 minutes ago, LadyCrimson said:

That animated thing - that's on the card? How ridiculous.  If the software has any way to turn that off (like you can turn off the RGB), I'd do it.

Yup, its own LCD screen. You have this useless animated thing, but you can also set it up to show GPU info (temperature, load, power draw, frequency, etc.), which is what I currently have it on. You can also upload your own picture or gif. VANITY!

Yeah, I will be turning it all off, especially as the case will be on my left side. Otherwise, I might have kept the GPU info on.

Posted
7 hours ago, kirottu said:

Can you play Doom on the LCD screen?

Uuuuu.... new Doom is on my "to play" list. Maybe I can run the Doom remake on one screen, and the original Doom on the graphics card.

Posted
1 hour ago, Wormerine said:

Uuuuu.... new Doom is on my "to play" list. Maybe I can run the Doom remake on one screen, and the original Doom on the graphics card.

If you do them both at the same time, I'm pretty sure that qualifies as a threesome.


This post is not to be enjoyed, discussed, or referenced on company time.

Posted
On 1/14/2021 at 10:52 PM, LadyCrimson said:

My 2080ti has a single "cord" that, on the GPU end, separates into two pin plugs that go into the card.  Both are plugged into the card, with the other end in the PSU.  It works just fine/it's intended?

Just came across this video (ha! YouTube will probably be recommending me PC building vids for a while!). In case the time stamp doesn't work, it's 17:40. Seems like a single cable will most likely be ok, but it will draw more power than it was built for.

 

Posted (edited)

I dunno ... I don't see how that benefits an average schmuck like me who doesn't OC or anything like that. I doubt most people like me have ever used two separate cables on any GPU, and they've all "been just fine for 5-10 years", eg, the life of using the thing.  It might be the most "optimal" in an "OCD" tech kind of way, but it isn't really that important to the average person over the average use/lifetime of the product.  Hubby's an electrician type (as well as IT/network and all that other stuff); I asked him, and he doesn't give a fart/worry about something like this, so I won't either.

I'm not saying one shouldn't do it, mind, everyone's got their things, but I'm just not going to care enough to bother because it doesn't trigger any of my personal "ocd" tendencies and I've never had a problem from not doing it. :) 

Edited by LadyCrimson
Posted

Daisy chaining is usually fine; it's just best practice not to daisy chain if you don't have to. No point changing it, though, if there's no problem. If you want a recent example of when it was a fairly big problem: a lot of the instability people were having with 5700XTs was not actually due to AMD's drivers, but due to daisy chaining the two 8-pins on the same cable.
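As a rough sanity check on why this matters, here's a quick sketch. The ratings below are my assumptions rather than anything from this thread (an 8-pin PCIe connector is commonly cited as nominally rated for 150 W, with the slot itself supplying up to 75 W), and the 320 W card is hypothetical:

```python
# Rough per-cable load estimate for PCIe power delivery.
# Assumed ratings (not from this thread; worth verifying): an 8-pin
# PCIe connector is nominally rated for 150 W, and the PCIe slot
# itself supplies up to 75 W.
PCIE_8PIN_RATING_W = 150
SLOT_W = 75

def cable_load_w(card_draw_w: float, cables: int) -> float:
    """Watts each physical cable carries after the slot's share."""
    return max(card_draw_w - SLOT_W, 0) / cables

# A hypothetical ~320 W card with two 8-pin inputs:
daisy_chained = cable_load_w(320, cables=1)  # both plugs fed by one cable
separate = cable_load_w(320, cables=2)       # one cable per plug

print(daisy_chained)  # 245.0 -> one cable carries well over a single plug's 150 W
print(separate)       # 122.5 -> each cable stays within one plug's rating
```

Same total power either way; two cables just halve the current each one has to carry.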

Posted (edited)

On Thursday I ordered a 5800X from Overclockers UK, and impressively it arrived today, with a public holiday in between as well. It's more than I had planned to spend on a CPU, but 5600X stock has completely dried up, and OCUK don't ship that overseas anyway (nor anything above the 5800X either). I could justify it because OCUK does not charge the 10% tax on behalf of the Australian government, and therefore the price ends up being a rather good $610AUD. This compares quite well to the MSRP of the 5600X which is $469, and the local 5800X at $699 (with street prices trending at $749).

So with that done, I now have everything except the video card ready for my new build. Truthfully though, I can't be bothered building it right now if I have to transfer my existing 290X over, so they'll sit on the shelf while I decide what to do. It figures that just as I've made peace with the idea of spending $1200 on a 6800XT, the street price goes up to nearly $1400. So I'm pretty stuck at the moment.

 

EDIT: Screw it, ordered a Sapphire 6800 Nitro for $1110 AUD. Don't want to age myself any further by overthinking it.

Edited by Humanoid


Posted

With a 6800 Nitro you should be able to overclock it to near or even with stock 6800 XT performance.

🇺🇸RFK Jr 2024🇺🇸

"Any organization created out of fear must create fear to survive." - Bill Hicks

Posted
20 minutes ago, Keyrock said:

With a 6800 Nitro you should be able to overclock it to near or even with stock 6800 XT performance.

I did find it strange that the same store restocked the entire Sapphire range at the same time, yet only put a $10 difference between the Pulse and the Nitro. I probably would not have bitten on the Pulse at that price, just like I turned down a Powercolor Red Dragon for the same price last week.

That said, overclocking probably isn't going to be a thing for me with this system, at least not for now. Possibly the opposite, actually: I'm strongly considering running the CPU in permanent Eco mode. I'm resisting the heavy-airflow trend and sticking the thing in a Define 7 with just the stock fans, so I'd rather not pump too much heat into it. I'll probably leave the video card at stock though, unlike the 290X, which I run with a power-limit underclock.
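For anyone weighing the same trade-off: on AM4, Eco Mode works by swapping in the package power limits of a lower-TDP part. A quick sketch with the commonly cited figures for a 105 W chip like the 5800X (these numbers are my assumption, not from this thread; double-check them in Ryzen Master):

```python
# Commonly cited AMD package power limits (assumptions, not from this
# thread; verify in Ryzen Master). Eco Mode on a 105 W-TDP chip like
# the 5800X applies roughly the 65 W-TDP limits instead.
LIMITS = {
    "default (105 W TDP)": {"PPT_W": 142, "TDC_A": 95, "EDC_A": 140},
    "Eco Mode (65 W TDP)": {"PPT_W": 88,  "TDC_A": 60, "EDC_A": 90},
}

ppt_default = LIMITS["default (105 W TDP)"]["PPT_W"]
ppt_eco = LIMITS["Eco Mode (65 W TDP)"]["PPT_W"]
saving = 1 - ppt_eco / ppt_default

print(f"Eco Mode cuts peak package power by about {saving:.0%}")
```

Lower PPT means less heat dumped into the case at the cost of some all-core boost, which fits a low-airflow Define 7 setup.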

Finalised specs (unless something turns out to be DOA):

Ryzen 7 5800X
Noctua U14S
MSI B550 Gaming Edge WiFi
4x8GB Crucial Ballistix 3600 CL16 (black, non-RGB)
Sapphire RX 6800 Nitro+
1TB WD Black SN750
Fractal Design Define 7, white, no window
750W Bitfenix Whisper M

Reusing my 2TB Crucial MX500 and 1TB Sandisk Ultra II SSDs. Weighing up whether to bother with my two 256GB SSDs. Keeping my Blu-ray combo drive too.


Posted
So this project made a liar out of me: I said I wasn't going to do the build just yet, but I was going to start today by just updating the BIOS as preparation. Ended up going all the way to a bootable system, just for my peace of mind.
 
Overall it's one of the stranger things I've done: plonked in my old 8800GT, which has one non-working fan on its Accelero S1 cooler. Had to cable-tie the fan wire out of the way and plug it into the motherboard, but it turns out that was the broken fan, so no matter. Reckon this'll be a bottleneck?
 
XL1Jfdk.jpg
 
Slight hitch in that I plugged the thing into an old TV, because that's what's in the dining room where I did the build. The BIOS screen came out all corrupted whenever I selected an option, and I worried for a moment that something was seriously wrong.
 
WSSTB5V.png
 
However, plugging it into an actual monitor seems to work properly, so crisis averted. Booted into Peppermint from a USB stick to test, and no problems there either. I'm not going to install Windows until my video card arrives next week, so that's it for now, I guess.


Posted

I think I have an 8600 GTS lying around somewhere as well...should really find a way to dispose of the old electronics I have.


How I have existed fills me with horror. For I have failed in everything - spelling, arithmetic, riding, tennis, golf; dancing, singing, acting; wife, mistress, whore, friend. Even cooking. And I do not excuse myself with the usual escape of 'not trying'. I tried with all my heart.

In my dreams, I am not crippled. In my dreams, I dance.

Posted

The card arrived, so the build is now properly functional. Still have to move the old SSDs and optical drive over, but it's effectively done, having slowly accumulated these parts since October.

God, modern video cards dwarf absolutely everything. Makes this full-sized ATX board look like microATX, and the case interior looks tiny.

FSkhBEE.jpg

 

All of it, including the RGB, is completely irrelevant though, because...

KmyagCL.png

 

Yes it's staying in the dining room for now while I install Windows, make sure everything works correctly, and transfer the old bits.



Posted

I'm finally home and have everything in hand. I'm too tired to do the build proper tonight, so for now my goal is to take the cooler off the GPU and replace it with the waterblock.

WdKPFZG.jpg

After a good night's sleep, I'll start the process of putting the desktop together.



Posted
On 1/31/2021 at 4:05 PM, Humanoid said:

God, modern video cards dwarf absolutely everything.

I remember wondering if the motherboard (on its side) would be strong enough to support the 2080ti. It was so heavy vs. the 980ti.

I'm waiting for when gpu's weigh 20 pounds and we'll need four straps coming off of it that'll attach to loop rings inside of our cases to hold it in place.

Posted
4 hours ago, LadyCrimson said:

I'm waiting for when gpu's weigh 20 pounds and we'll need four straps coming off of it that'll attach to loop rings inside of our cases to hold it in place.

Time for us to have "motherGPUs" which bolt onto the case, then the CPU and other functions can live on a daughterboard that plugs into the GPU.



Posted

Graphics cards keep getting bigger, meanwhile storage...

j0zoney.jpg

Kitchen shears for scale.

After slowly and carefully removing the cooler: WARNING: GRAPHIC FULL FRONTAL PCB NUDITY

XuDkGKS.jpg

Applying the thermal compound to the GPU was scary since I'm using Thermal Grizzly Conductonaut liquid metal, which obviously conducts electricity.

aJhiomt.jpg



Posted

Did you spread the compound or just put a pea and press it?

"because they filled mommy with enough mythic power to become a demi-god" - KP

Posted (edited)
29 minutes ago, Sarex said:

Did you spread the compound or just put a pea and press it?

I spread it.

Edit: There are fairy tales about what the best application is (e.g. single grain of rice, 5 dots, etc.). The point of the compound is to fill in the microscopic imperfections in the processor and the cold plate contacting it. Spreading ensures full coverage. If you use a non-conducting paste or grease then you can just slather it on there and if you put on too much it will just push out of the sides when you tighten down the cooler/waterblock. With liquid metal you have to be more careful to not put on too much, since it can short out your board if a bunch gets squeezed out onto the PCB.

Edited by Keyrock


Posted
37 minutes ago, Keyrock said:

With liquid metal you have to be more careful to not put on too much, since it can short out your board if a bunch gets squeezed out onto the PCB.

That is why I asked; I don't know if I would have had the nerve to do a pea-sized blob in the middle, for fear of it spreading too much.

37 minutes ago, Keyrock said:

I spread it.

Edit: There are fairy tales about what the best application is (e.g. single grain of rice, 5 dots, etc.). The point of the compound is to fill in the microscopic imperfections in the processor and the cold plate contacting it. Spreading ensures full coverage.

The issue with spreading is that you can get air bubbles, and that is not good for thermals.

 


"because they filled mommy with enough mythic power to become a demi-god" - KP
