
Build Thread


Gorgon


All of the generic PSUs I have had over the years have blown up. The best one was a 500W Codegen unit that came with a case I purchased in 2001. That thing lasted almost 10 years, I think.

 

I've never had a brand name PSU blow up.


Guest Slinky

I don't have first-hand experience with them, but I haven't seen anything but praise and recommendations for the Samsung EVO series. Their toolbox software is also great, from what I've heard.

 

Check the price difference against the Pro version before you buy; I think it has a longer warranty.

Edited by Slinky

The price difference between the EVO and the Pro is very substantial, much too substantial for my tastes. The EVO is about 50 cents per GB, while the Pro is about 80 cents per GB.

🇺🇸RFK Jr 2024🇺🇸

"Any organization created out of fear must create fear to survive." - Bill Hicks


Some people get a little nervous about the EVO because it uses TLC NAND rather than the MLC NAND conventional in most drives. TLC is a cost-cutting measure that theoretically has less endurance than MLC, but Samsung's controller is smartly programmed and makes up for most of the memory type's shortcomings. Endurance here means the estimated life of the SSD under an approximate consumer-oriented load, typically 10-20GB of writes per day (already a very conservative assumption). At that kind of load, the EVO might last 20 years instead of a competitor's 30, so in practice it tends not to be a big deal.
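
To put rough numbers on it, here's a back-of-the-envelope sketch. The P/E cycle rating and write amplification factor below are illustrative guesses, not Samsung's official figures:

[code]
# Back-of-envelope SSD endurance estimate -- all figures are illustrative assumptions
capacity_gb = 250            # e.g. the 840 EVO 250GB
pe_cycles = 1000             # assumed P/E cycle rating for TLC NAND (MLC is often ~3000)
write_amplification = 1.5    # assumed controller write amplification
daily_writes_gb = 20         # heavy end of a consumer workload

total_host_writes_gb = capacity_gb * pe_cycles / write_amplification
years = total_host_writes_gb / daily_writes_gb / 365
print(f"~{years:.0f} years of writes")   # ~23 years at 20GB/day
[/code]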

 

Personally I'm comfortable with the EVO, and I'd pick it over the majority of alternatives. The ones I can comfortably say are better are the 840 Pro as mentioned, the Seagate 600, and the Sandisk Extreme II. I can't find good deals on the former two, but I see Amazon has the Sandisk for $287, which is a great price for arguably the second-best consumer SSD available. Compared to the EVO at $249, I reckon I'd take the Sandisk, which is marginally better in just about every metric. (If you can get the EVO cheaper than that, it obviously tilts the balance back in its favour.)

 

 

P.S. I have both an Extreme II and an 840 EVO in my system, no complaints about either. (Also a Crucial m4, but that's comparatively ancient.)

 

P.P.S. Samsung has an excellent software package called SSD Magician, which is handy and a point in its favour: it allows easy health checks and firmware updates on their SSDs. I wouldn't say it's make-or-break personally, it's more of an ease-of-use thing: my Crucial drive required me to boot from a CD to update its firmware, for instance, while for Samsung it's just a couple of clicks in a desktop application. The Sandisk hasn't required an update so I'm not sure about it, but it looks like they have an updater application available for their other SSD models.

 

P.P.P.S. I should also note that an SSD's longevity scales almost linearly with its capacity. Lazily quoting Anandtech's estimates for the EVO: at a heavy usage of 50GiB/day, the 120GB model would expire in 4 years, the 250GB in 8 years, and the 500GB in 16 years.
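
The same sketch as above, swept over capacity at that 50GiB/day load, lands in roughly the same ballpark as those Anandtech numbers:

[code]
# Longevity scales with capacity: same assumed TLC figures as the sketch above
for capacity_gb in (120, 250, 500):
    years = capacity_gb * 1000 / 1.5 / 50 / 365   # P/E cycles / WAF / GB-per-day / days
    print(f"{capacity_gb}GB -> ~{years:.0f} years")
# 120GB -> ~4 years, 250GB -> ~9 years, 500GB -> ~18 years
[/code]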

Edited by Humanoid

L I E S T R O N G
L I V E W R O N G


As far as SSDs go, I'm thinking of getting a Samsung 840 EVO 500GB.  Thoughts?

 

All of Samsung's SSDs are rock solid. I personally own an OCZ Vertex 4 128GB, and while I've had no trouble with it, I still regret not shelling out a little more for the Samsung 840 Pro (128GB).

 

 

P.P.P.S. I should also note that an SSD's longevity scales almost linearly with its capacity. Lazily quoting Anandtech's estimates for the EVO: at a heavy usage of 50GiB/day, the 120GB model would expire in 4 years, the 250GB in 8 years, and the 500GB in 16 years.

 

Performance too.

Edited by Sarex

"because they filled mommy with enough mythic power to become a demi-god" - KP


I'm thinking of getting this video card. With the sale price and the rebate, it comes out to about $660 (plus Uncle Sam's cut). Factory overclocked; not a big deal, but a plus, as I'd rather not mess with that if I don't have to, since it's a bit more complicated (tedious) under Linux. Comes with a free copy of Watch Dogs, but that's true of any 780Ti, 780, 770, or 760 right now.

Edited by Keyrock



Open-air coolers, which most custom designs are, might be a little quieter in solo operation, but because they exhaust hot air inside the case they are less suitable for SLI. So depending on your likelihood of going SLI in the future, decide whether you want to pick up a model with the stock blower cooler instead.




EVGA, Palit, and Gigabyte have the best versions of this card, it seems.

 

If you are interested in reviews here are the links:

http://www.guru3d.com/articles_pages/evga_geforce_gtx_780_ti_sc_acx_superclock_review,1.html

http://www.guru3d.com/articles_pages/palit_geforce_gtx_780_ti_jetstream_review,1.html

http://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_780_ti_windforce_3x_review,1.html

 

 

Open-air coolers, which most custom designs are, might be a little quieter in solo operation, but because they exhaust hot air inside the case they are less suitable for SLI. So depending on your likelihood of going SLI in the future, decide whether you want to pick up a model with the stock blower cooler instead.

 

A good point to bear in mind: the stock coolers take air in at the tip of the card and blow it out the vents at the back of the case, while custom coolers take air from the face of the card and blow it out inside the case.

Edited by Sarex

"because they filled mommy with enough mythic power to become a demi-god" - KP


The SLI point is a good one, and I just can't answer that one at the moment.  It will come down to whether there will be anything in the future to push my system enough to warrant a second card.  Also, at that point, will it be more economical to sell my current card and get a GTX 980 or whatever instead?  Basically, I'm not going to worry about it too much, beyond getting a motherboard with at least 2 PCIe 3.0 slots.  I'll likely get that EVGA 780Ti I linked to, and if I do SLI it in the future, and temperatures become a problem, maybe I can engineer some kind of angled sleeve over it to direct the air away from the other card, space permitting?  I'll cross that bridge when I get to it.

Edited by Keyrock



The SLI point is a good one, and I just can't answer that one at the moment.  It will come down to whether there will be anything in the future to push my system enough to warrant a second card.  Also, at that point, will it be more economical to sell my current card and get a GTX 980 or whatever instead?  Basically, I'm not going to worry about it too much, beyond getting a motherboard with at least 2 PCIe 3.0 slots.  I'll likely get that EVGA 780Ti I linked to, and if I do SLI it in the future, and temperatures become a problem, maybe I can engineer some kind of angled sleeve over it to direct the air away from the other card, space permitting?  I'll cross that bridge when I get to it.

 

The problem is that if the cards are too close, one will block the intake of the other; there is nothing to be done about that.

 

Example: (photo attachment: IMG_6189.jpg)


"because they filled mommy with enough mythic power to become a demi-god" - KP


The ASRock mobo that is my current front-runner has 2 PCIe x1 slots between the first (closest to the CPU) PCIe 3.0 slot and the second. With the EVGA 780Ti being a relatively "slim" model that takes up 2 slots (some aftermarket coolers stick out further, making the card take up almost 3 entire slots), that would give me a full slot's width of empty space between the 2 cards if I were to SLI with that mobo and those cards.

 

(image attachment: Z97 Extreme4(m).jpg)

Edited by Keyrock



It's not just the obstruction, but often plain old convection plus radiant heat from the backside of the bottom card. In a multi-card configuration it's always the top card that's going to be problematic; it can be significantly hotter than the bottom card.

 

In theory it might be a workable compromise to have an open-air cooler up top and a blower model at the bottom. The blower at the bottom recycles less hot air into the case, letting the top cooler work better as it intakes somewhat cooler air. At any rate it should be better than the reverse: forcing hot air from a bottom open-air cooler into a top-positioned blower would probably be bad, as the blower likely runs at higher temperatures to begin with.

 


 

Did a little bit of work on my NAS project yesterday; the NAS box itself is assembled. The cabling is a bit of a mess until I finish the drive loadout and switch to SATA power splitters instead of a bunch of separate cable runs, etc.

 

 

 

Gigabyte FM2A88XM-D3H

AMD A4-4000

Silverstone Argon AR-01, fan swapped to 120mm Nexus

1x8GB Kingston DDR3L

Fractal Design Define R4 Black Pearl

2x 4TB WD Red

Nas4Free embedded install on an old crappy 4GB USB stick I found lying around the house. Might update to a good stick in future, but my Sandisk Extreme USB 3.0 sticks have important stuff on them.

 

 

 

Tripped up by the simplest of things, though. We take Gigabit Ethernet for granted these days, so it slipped my mind that my ancient Netgear modem-router is only 100Mbit, probably not enough for streaming. So, uh, time to go shopping again.

 

- TP-Link Archer D7 ADSL2+ Wireless AC modem-router

- Asus PCE-AC68 wireless PCI-E network adapter

 

My Thinkpad T440s has wireless AC support, and I want to give everything the best possible chance, so I splurged a bit. Over $100 for a consumer-level network adapter is daylight robbery, but it seems to be literally the only option available. There are technically wireless AC USB adapters available for about half the price, but they have internal antennae and are rated at only AC1200 (867Mbit AC + 300Mbit N), versus the AC1900 of the Asus (the maximum spec: 1300Mbit AC + 600Mbit N). The router itself is only AC1750 though, being rated at 1300Mbit AC and 450Mbit N.
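
For anyone wondering where those ACxxxx class numbers come from, they're (roughly) just the sum of each band's maximum link rate, rounded:

[code]
# The ACxxxx class number is roughly the sum of the per-band maximum link rates
classes = {
    "AC1200": 867 + 300,    # = 1167, rounded up to 1200
    "AC1750": 1300 + 450,   # = 1750
    "AC1900": 1300 + 600,   # = 1900
}
for name, total_mbit in classes.items():
    print(f"{name}: {total_mbit} Mbit combined across both bands")
[/code]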

 

If all this is confusing: what it means is that these things are dual-band, operating at the old wireless-N speed on the 2.4GHz band or at the new wireless-AC speed on the 5GHz band. However, a consequence of the higher frequency is that wireless-AC has much more range dropoff, being notably more sensitive to obstructions such as walls than previous wireless standards. But yeah, as I said, I wanted to give it the best possible chance, so I'll try that mode first, then fall back if it doesn't provide sufficient speed to stream my raw Blu-ray rips. Plan: test wireless-AC > test wireless-N > test Ethernet-over-powerline (I have a couple of 500Mbit EoP adapters, but I suspect the wiring in my house will heavily restrict them) > long ugly cable run (bought a 20m cable; god I hope I don't have to use it).
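
As a rough sanity check on whether each option can feed those rips: the 48Mbit figure is the Blu-ray spec's peak mux rate, and the efficiency factors are my own guesses, so treat this as a sketch rather than a benchmark.

[code]
# Can each link sustain a raw Blu-ray stream? (efficiency factors are guesses)
bluray_peak_mbit = 48    # max Blu-ray mux rate; most discs average well under this

links_mbit = {
    "Fast Ethernet (100Mbit)":    100 * 0.94,   # wired links get close to line rate
    "Wireless N (450Mbit link)":  450 * 0.40,   # wireless throughput often <50% of link rate
    "Wireless AC (1300Mbit link)": 1300 * 0.40,
    "Gigabit Ethernet":           1000 * 0.94,
}
for name, usable in links_mbit.items():
    verdict = "comfortable" if usable > 2 * bluray_peak_mbit else "marginal"
    print(f"{name}: ~{usable:.0f} Mbit/s usable -> {verdict}")
[/code]

Which is why the 100Mbit router is the weak link: ~94 Mbit/s usable leaves no headroom over a 48Mbit peak.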

 

If I wasn't living in a rental I'd be able to bypass this all and just get in-wall Ethernet cabling. :(

 

 

P.S. Also threw in a generic ASMedia-chip dual SATA 3.0 controller card, to test compatibility with Nas4Free for the future, when I inevitably run out of native SATA ports.

 

P.P.S. Since the NAS is running off a USB stick now, it means the basic Sandisk Ultra Plus SSD I was considering using in it is now spare. Might as well toss it into my desktop. Or more accurately, toss that 120GB drive into my planned HTPC instead, and take the 250GB model (also an Ultra Plus) planned for the HTPC into my desktop. Terabyte SSD storage in my desktop, ahoy!

Edited by Humanoid



If I wanted to power a display on the moon I could see from my house, then yes, I would SLI a pair of 780Ti's. Other than that, doesn't really make sense. Sure looks cool inside the case, though. 

 

Can we do the monitor next ... this vicarious build is pretty fun.

All Stop. On Screen.


Let's see: either one Dell 32" 4K screen... or, for the same price, *four* 32" BenQ BL3200PT QHD AHVA screens. Or more realistically, buy three of them and $500 worth of ice cream. Actually, hold off on the ice cream and buy a colorimeter instead to calibrate them, as the out-of-the-box settings are apparently very suboptimal.

 

Seriously though, it's a fantastically well-positioned product for an unheard-of asking price. Heck, it's almost half the price of a 30" Dell Ultrasharp U3014. It's not the best at any one thing in particular, but the hard-to-please folks at TFT Central are sufficiently impressed.

Edited by Humanoid



If I wanted to power a display on the moon I could see from my house, then yes, I would SLI a pair of 780Ti's. Other than that, doesn't really make sense. Sure looks cool inside the case, though. 

 

Can we do the monitor next ... this vicarious build is pretty fun.

I'm still very much undecided on monitors. I know I want a 27" 1440p monitor; beyond that... *shrugs* 1440p at 27" is about the right pixel density for my tastes. I'm not worried about text being too small, especially since in many cases I can simply adjust the DPI. Obviously, in many games I might not be able to adjust the font size, but my eyes are good enough to read small text... as long as I have my glasses on. I'll want it to have at least one HDMI input (just one should be fine), in case I get a WiiU in the future. (When they put out a Metroid, and you gotta believe they will eventually, I'll likely break down and get one, because Metroid.)
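
For reference, the quick pixel-density math behind that preference:

[code]
import math

# Pixel density of a 27" panel at 1440p vs 1080p
for w, h in ((2560, 1440), (1920, 1080)):
    ppi = math.hypot(w, h) / 27          # diagonal in pixels / diagonal in inches
    print(f"{w}x{h} at 27in: ~{ppi:.0f} PPI")
# 2560x1440 -> ~109 PPI, 1920x1080 -> ~82 PPI
[/code]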

I'm thinking something along the lines of this. The contrast ratio is pretty good. I haven't heard too many reports of dead pixel problems on this one, though I may splurge and spend an extra couple of bucks for the perfect-pixel model. From what I've read, this model is pretty good in the (lack of) ghosting department, with no problems scaling down to 1080p or 720p. The biggest concern with it seems to be backlight bleeding, though that's kind of a crapshoot with IPS monitors in general.

 

Edit: Found it on eBay for $340 with free shipping. It's got both dual-link DVI-D and HDMI 1.4a, which, if I'm not mistaken, can both do 1440p at 60Hz, correct?

Edited by Keyrock



Honestly, all these no-name Korean monitors are basically the same crapshoot. It's down to either AH-IPS or AHVA. I prefer VA, so I wound up going with this. There are several no-name "brands" that are basically identical, so I went with the one I liked best from an aesthetic point of view. All these Korean monitors use essentially the same panels as the name brands. When the manufacturer that sells to BenQ, Asus, Samsung, and so on gets a batch returned because it didn't meet the customer's standards (too high a defect rate or whatever; they generally accept or reject an entire batch), they sell that batch off to one of these no-name Korean companies. So what I can expect is a good panel, and a roll of the dice on the quality of the housing and the assembly. If they did a half-assed job putting it together (relatively likely), you can usually take it apart yourself with a screwdriver and put it back together properly. It's generally a simple job; you don't need to be a rocket scientist, you just need some patience.

 

Basically, it's a roll of the dice. There is going to be some backlight bleeding, that's expected; the question is how much. Hopefully I don't get a bunch of dead pixels and horrific backlight bleeding, because RMAing it won't be fun. I'll let you folks know how it turns out. Wish me luck.

Edited by Keyrock



Good luck. Korean should be just fine.

 

Some entertaining and/or honest translations: "Please check display port of graphic card for compatibility with Monitor, Because monitor be unable to support all display ports of your PC. Copyright © All Rights Reserved."



Good luck. Korean should be just fine.

 

Some entertaining and/or honest translations: "Please check display port of graphic card for compatibility with Monitor, Because monitor be unable to support all display ports of your PC. Copyright © All Rights Reserved."

The 780Ti comes with dual-link DVI-D, DVI-I, DisplayPort, and HDMI outputs.

 

The monitor comes with dual-link DVI-D, HDMI 1.4a, and D-Sub inputs. Both the dual-link DVI-D and HDMI 1.4a will allow me to run 1440p at 60Hz. I'll use the DVI-D to leave the HDMI open for potentially hooking up a WiiU in the future, if I ever get one (read: when Nintendo releases a proper Metroid for the WiiU).
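
A quick sanity check on that bandwidth claim, assuming roughly 10% blanking overhead (typical of reduced-blanking timings, so a sketch rather than an exact timing calculation):

[code]
# Does dual-link DVI have the pixel clock for 2560x1440 @ 60Hz?
w, h, hz = 2560, 1440, 60
blanking = 1.10                            # assumed ~10% blanking overhead
needed_mhz = w * h * hz * blanking / 1e6   # ~243 MHz
dual_link_dvi_mhz = 2 * 165                # two TMDS links at 165 MHz each
hdmi14_mhz = 340                           # HDMI 1.3+/1.4 TMDS clock ceiling
print(f"needed: ~{needed_mhz:.0f} MHz pixel clock")
print(f"dual-link DVI: {dual_link_dvi_mhz} MHz; HDMI 1.4: {hdmi14_mhz} MHz")
[/code]

Both interfaces clear the ~243MHz requirement with room to spare.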

 

As an aside, why do monitors still include a D-Sub input?  Who still uses D-Sub?

Edited by Keyrock



I bought this one:

 

http://www.amazon.com/AOC-Q2770PQU-27-Inch-2560x1440-Resolution/dp/B00HY7PBF6/

 

Not the cheapest, not the most expensive. I basically bought it because it was the only 2560x1440 monitor that supported MHL at the time. But boy, have I been impressed. If I didn't know better, I would have thought this was a professional monitor. Perfect colours (out of the box), no backlight bleeding, almost perfect backlight uniformity (slightly weaker at the edges, if you look carefully), fantastic viewing angles, and it's fast. No ghosting whatsoever.

 

I'm still not sure if AOC have upped their game or if I've been lucky.

Swedes, go to Spel2 for the latest game reviews in Swedish!


VGA connectors are still used a lot in business because there are a lot of older projectors and the like still in use. Most business laptops worth their salt will still have D-Sub alongside mini-DP or mini-HDMI. Conversely, laptops more than a few years old will probably have no video output other than VGA.

 

But yeah, HDMI (at least in its more recent revisions) supports 1440p; my U2711 screens don't take 1440p over HDMI, however, because of some hardware limitation. The point is that every element in the chain has to support it: the video card, the cable, and the monitor. That said, if the HDMI input is only going to be used for the WiiU then it's irrelevant anyway, since the WiiU is strictly 1080p.



