
Build Thread


Gorgon


Closer to Pillars of Eternity, with a name like SoulTaker, from that gallery website.

 

As for the delay, I would have thought the news would have blown up by now... but I'm not really seeing it anywhere. Imagine all the third parties that need to know now, like mobo makers and whatnot. I'd wait a week or two and get some concrete info, but I'm Team Green, so I know less about the vagaries of the Blue side.

All Stop. On Screen.



I was Team Green for a very long time.  Phenom II 940 BE was my last AMD processor.  I still remember the glory days of my Opteron 160 overclocked by 800 MHz on stock cooling (and never breaking a sweat), absolutely demolishing any Intel processor of that time in the same price area code, even somewhat more expensive chips.  Sadly, AMD simply has nothing to offer me in the segment I'm looking to buy right now.

 

Anyway, we'll find out soon either way.  I have a plan and a backup plan.  One way or another I'm buying a CPU the first week of June.  It will either be an i7-4790K or a Xeon E3-1230 v3.  I will be playing games on my SteamOS Monster Machine the second week of June; that is written in stone.


🇺🇸RFK Jr 2024🇺🇸

"Any organization created out of fear must create fear to survive." - Bill Hicks


It's so sad Bulldozer wound up being such a colossal failure.  AMD has even lost much of their edge in the integrated GPU department, with the HD 4600 and Iris Pro closing the gap considerably.  Honestly, if HSA doesn't wind up being the game changer AMD has been touting it as, and if the architecture that succeeds Excavator (by all reports the last of the Bulldozer-based architectures) doesn't wind up being waaaay better, AMD may die as a CPU company and essentially become ATI again, only putting out GPUs.  That would really suck.  Intel would be gouging our wallets and dragging their feet in the desktop segment even worse than they are now.  *sigh*  I miss the days when Athlon 64s were mopping the floor with the Pentium 4 space heaters and forcing Intel to scramble, innovate, and drop their prices to be competitive.  Those were the days.

I share your sentiment. It also causes my inner fanboy to stir... ;)

 

Vishera is still "fast enough", especially outside of games, but "fast enough" isn't a very compelling selling point, sadly; people even prefer Pentiums, i3s and other feature-crippled chips thanks to the i7 halo. I'm perfectly happy with mine, because in demanding contemporary games it's still my GPU that limits the options, and in older games (which I play most of the time) it doesn't matter. When programming business applications, the Vishera sits somewhere between i5 and i7 territory performance-wise, and it behaves more predictably than an i7 (Hyper-Threading). So Vishera's performance is not an issue for me personally.

 

While I am a fan of efficient, modern technology, and Vishera's perf/W is not good, when deciding between AMD and Intel the following arguments are enough for me to prefer the underdog: David vs. Goliath, monopoly, business practices and - less and less important at AMD, sadly - company principles. I'd prefer an 8-core Excavator on FM2+ over Vishera; it would provide a very solid, modern platform. Oh, and in 20 nm, pretty please... :/

 

Regarding the iGPU: the Iris Pro is rare, expensive, and uses brute force over know-how to achieve its performance, made possible by Intel's ever-present process node advantage. I'm not convinced it is a real contender to AMD's and nVidia's offerings, given Intel's long history of GPU failures. Haswell has the first Intel-produced iGPUs that, apparently, can play back HD content without sync issues. And maybe, just maybe (I haven't tested one yet), they can correctly detect the resolution of displays/projectors connected via VGA now, because Ivy Bridge couldn't: one of the laptops I use at work has an Ivy Bridge i5, and every time I start it up on an external display (hot-plugging has roughly a 0% chance of working), it is a gamble whether that slouch will offer the full resolution, or even the correct aspect ratio, on the external display. And it can power two monitors at most (internal + 1, or VGA + DP), while my trusty Llano workhorse from the same manufacturer has no problem driving multiple displays or recognizing a VGA display's properties.

So, if Intel's Haswell GPU feature set has indeed caught up with the two better IHVs, it is a first, and they have years of proving themselves ahead before they can be considered a serious contender in the GPU space, feature- and technology-wise.

Edited by samm

Citizen of a country with a racist, hypocritical majority


 


 

You make good points, but I'm not enough of an AMD fan to buy a clearly inferior "good enough" product in lieu of a clearly superior one.  Sure, there are some workloads where AMD's 8 physical cores (even if they are 4 modules, each sharing an FPU between its 2 cores) beat an i5's 4 cores, and even an i7's 4 physical and 8 logical cores, but those real-world situations are few and far between.  AMD's cores are so far behind Intel's in IPC that they need to get into the 5+ GHz region to compete with Intel's 3.5-4 GHz cores, and that's at 220 W (yikes!) vs. 80-90 W.  The high-end Visheras run so hot that an aftermarket cooler (water cooling recommended) is REQUIRED for anything more demanding than web browsing.  That's just crazy.

 

Intel has a very clear and decisive fabrication advantage.  It's sad, but it's true.  In the long run this may all be a moot point, since I think ARM is going to take x86's lunch money, even on desktops, eventually, and if Intel doesn't do what Nvidia did a while ago and what AMD is starting to do now, they will be left out in the cold, fabrication advantage or not.  ARM is simply a more efficient architecture.  Even Intel's massive fabrication advantage can't completely make up for ARM's massive performance-per-watt advantage, particularly in the highly lucrative mobile sector.  Eventually, ARM chips will get powerful enough to power desktops as well as mobile devices and servers (in large clusters).  This is going to happen sooner than people think.  I know x86 has the advantage of supporting legacy software that may be too much trouble to port to ARM, but that will eventually disappear, especially if Windows ceases to be the dominant platform (and in servers this is already happening).

Edited by Keyrock



I completely understand your perspective and consider it very vendor-neutral. In fact, I even doubt there is any reasonable case where an FX will beat an i7.

 

As for ARM taking over by way of mobile devices and servers: Intel is pushing its Atoms into these markets, backed by massive subsidies as usual, so I'm not sure x86 will be out of the picture even in the long run. However, AMD seems to agree with you, as that is what their HSA strategy (with ARM and mobile manufacturers on board) and, more recently, their "ambidextrous" strategy entail: a standard 64-bit ARM part for servers as a first try, followed by pin-compatible ARM and x86/AMD64 processors, both using GCN for their GPU part, as a second step, and then, if I interpret this correctly, processors allowing for both ISAs.

 

Ok, ahem, and sorry for derailing this thread^^




So, only the $1000 model will be octacore?  Intel, shafting their customers yet again, because they can.  *sigh*



Guest Slinky

The i7-5930K has a 500 MHz higher base frequency than the octa-core i7-5960X and could be around half its price, too.

 

I'm smelling one rotten bang-for-buck ratio here. I wonder when games will even start using 8 cores - 2020?
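For a rough sanity check of that gut feeling, here's a back-of-the-envelope perf-per-dollar comparison. The clocks and prices below are the rumoured figures floating around at this point (treat them as placeholders, not confirmed specs), and cores x clock is a crude throughput proxy, not a benchmark:

```python
# Rumoured Haswell-E specs/prices discussed in this thread (unconfirmed):
# i7-5960X ~8C @ 3.0 GHz, ~$999; i7-5930K ~6C @ 3.5 GHz at roughly half that.
chips = {
    "i7-5960X": {"cores": 8, "ghz": 3.0, "usd": 999.0},
    "i7-5930K": {"cores": 6, "ghz": 3.5, "usd": 550.0},
}

for name, c in chips.items():
    core_ghz = c["cores"] * c["ghz"]  # crude aggregate-throughput proxy
    print(f"{name}: {core_ghz:.1f} core-GHz, "
          f"{core_ghz / c['usd'] * 100:.2f} core-GHz per $100")
```

Even granting the 5960X its two extra cores at face value, the 5930K comes out well ahead per dollar under these assumptions.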


http://wccftech.com/intel-haswelle-core-i7-5960x-core-i7-5930k-core-i7-5820k-specifications-unveiled-flagship-8-core-boost-33-ghz/

 

Confirmed...

 

Pretty certain that I will not go for Haswell-E then, so it's Devil's Canyon, or Broadwell if it gets a firm release date by August.

 

Some more news.

http://www.overclock.net/t/1491935/vr-zone-devils-canyon-i5-4690k-i7-4790k-cpuz-screenshot-posted

Edited by Sarex

"because they filled mommy with enough mythic power to become a demi-god" - KP


Well, I wouldn't exactly call something appearing on wccftech "confirmed".  That said, if true, I wonder why the clocks on the 5960X are so much lower than on the 5930K?  Can Intel simply not keep clocks stable any higher when cramming 8 cores onto one die, or is it some kind of weird marketing ploy?




 

It has to do with density; the same thing happened with the 6-core processors.




Guest Slinky

They might also want to keep the TDP in check...

 

... so they don't have to solder the IHS and can use some cheap-ass toothpaste again :p



 

The enthusiast series is always soldered.



 


 

I kind of figured that; I just didn't expect the difference in clocks to be so drastic.  It really makes the 5960X look very unattractive.  I reckon that in more real-world situations than not, the vastly higher clock speed of the 5930K will outperform the 2 extra cores and added cache of the 5960X.  With the 5960X you'd be paying almost twice as much for a processor that will ultimately perform worse the majority of the time.
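That intuition can be sketched with Amdahl's law. The clocks below are the leaked figures discussed in this thread, and IPC is assumed identical between the two chips, so this is a toy model of the clocks-vs-cores trade-off rather than a benchmark:

```python
# Toy model: rumoured i7-5930K (6 cores @ 3.5 GHz) vs i7-5960X (8 cores @ 3.0 GHz).
def effective_perf(clock_ghz: float, cores: int, parallel_fraction: float) -> float:
    """Relative performance: clock speed scaled by the Amdahl's-law speedup."""
    speedup = 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)
    return clock_ghz * speedup

for p in (0.5, 0.8, 0.95):
    k5930 = effective_perf(3.5, 6, p)
    x5960 = effective_perf(3.0, 8, p)
    print(f"parallel fraction {p:.2f}: 5930K {k5930:.1f} vs 5960X {x5960:.1f}")
```

Under these assumptions the 8-core only pulls ahead once roughly 89% or more of the workload parallelizes, which lines up with the "majority of the time" argument above.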



Broadwell-E is for sure going to be better; the question is just how much better. It's a bit iffy whether they will be able to overcome the density problem.

 

The bigger question is whether they should even be trying to overcome it. As things are going now, they just keep making it smaller and denser, and the gains shrink with each new generation.



Well, I ordered my mobo today.  Now all that's left is to wait until next week for Computex, and then I'll find out whether it will be an i7-4790K or an E3-1231 v3 slotting into said motherboard.



NAS... just about done. 7 out of 8 drives installed, so even tiny little SATA cables start looking like a mess. If only you could have multiple data connectors coming off one cable. :( At least the SATA power cables are neat enough using four-way splitters. Those red SATA cables I bought for a couple of bucks each are exceedingly crappy - the clips won't even engage properly half the time - but at least they're short and relatively neat. The black ones that came with the motherboard are more of a pain to work with despite being of obviously better quality (I particularly dislike L-shaped connectors, and two of them are). Kudos to Gigabyte for including four of them on a relatively entry-level board, though; many boards nowadays only come with two.

 

4x4TB, 1x3TB, 2x2TB, plus one more drive waiting for me to transfer its data off before I throw it in there, for 25TB total. Unfortunately I have about 23TB of data to store (at least 80% of which has been transferred), so I'm almost immediately going to be capped, urgh. But at least now that this is done, this weekend I can (if I choose) disassemble my old HTPC, which used to be connected to most of this, and build my new one in its chassis.

 

i7b9IAr.jpg

 

 

P.S. The hardest part of the build, other than waiting on slow copies, was extracting the 3TB drive out of its external casing - external HDDs are getting harder and harder to crack open, almost as if they don't want you doing it.... 'Twas a Seagate GoFlex Desk.

 

EDIT: The Kill-A-Watt shows power consumption hovering a little under 90W under load. There are probably various things I could fiddle with in the settings, particularly idle states, to get that down, and I might try to set up uptime scheduling. That said, it works out to about 40-45 cents a day running 24/7.
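For anyone wanting to check that figure against their own tariff, the arithmetic is trivial; the ~USD 0.20/kWh rate below is an assumption, so plug in your own:

```python
# Daily running cost of a box drawing a constant load 24/7.
# The default rate is an assumed ~USD 0.20/kWh; substitute your local tariff.
def daily_cost(watts: float, rate_per_kwh: float = 0.20, hours: float = 24.0) -> float:
    """Cost in dollars of running a `watts` load for `hours`."""
    return watts / 1000.0 * hours * rate_per_kwh

print(f"~${daily_cost(90):.2f}/day")  # 90 W around the clock -> roughly 43 cents
```

90 W for 24 hours is 2.16 kWh/day, so the 40-45 cent range corresponds to a rate somewhere around 19-21 cents per kWh.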

Edited by Humanoid

L I E S T R O N G
L I V E W R O N G


Well, I now physically have everything I ordered.  I went ahead and installed everything in the case.  I have nearly an entire system waiting here ready to go.  It's just missing one little, insignificant part called a Central Processing Unit.

 

It's going to drive me nuts with the rig sitting here waiting for that CPU.  :aiee:

 

rEcpwrv.jpg

 

I won't bother tying the cables up nicely until I get the CPU and install it.

Edited by Keyrock



No nudes of the GTX, c'mon.

It's a tasteful sideboob shot of the 780Ti, that's the best you're getting.  For a full-on cleavage shot I'd have to unscrew it and take it out of the motherboard.  I'd rather not handle it any more than I have to.




Since I failed to deliver a cleavage shot of the 780Ti, I thought I'd try to make it up to you with full frontal of my keyboard:

 

rhwuUwo.jpg?1

 

It came with regular WASD and Win key keycaps (shown below); the ones in the above image came separately, though included with the keyboard, plus a keycap puller.  It's tough to tell from the image, but the red WASD keys have the letters printed on their front edges so that you can see them when sitting at a desk.  I may still purchase some custom keycaps in the future.

 

80BLHaa.jpg?1

 

The keyboard feels very sturdy and well put together.  Unfortunately I cannot convey, via picture, the sound the keys make when pressed, nor the subtle yet satisfying bump you feel.

Edited by Keyrock



I ordered a pair of orange super key keycaps printed with the Tux logo.  :biggrin:

 

Sort of like the black one in this picture except orange like the keycap next to it:

 

ind_2.jpg



I don't understand: what's the purpose of these special, rather obnoxious looking keys? :p


How I have existed fills me with horror. For I have failed in everything - spelling, arithmetic, riding, tennis, golf; dancing, singing, acting; wife, mistress, whore, friend. Even cooking. And I do not excuse myself with the usual escape of 'not trying'. I tried with all my heart.

In my dreams, I am not crippled. In my dreams, I dance.


This topic is now closed to further replies.