Cantousent

Time to start planning my new computer


So, should I be worried about getting a card that doesn't support DX10 when folks haven't even caught up with DX9?  Maybe I should switch to the X1900 XTX.  Thoughts, please.

 

Will NWN2 use DX9 features?

Thing is, very few games use dynamic branching (or any other DX9 features) heavily.

Ugh... that was incorrect; I'll be lambasted by graphics engine devs. What I meant was Shader Model 3.0. DX9 introduced two new shader models, 2.0 and 3.0. While current engines are based entirely on DX9, most of the features they use are SM2.0-specific. SM3.0 offers functionality that is a significant upgrade over SM2.0, including very long shaders, dynamic branching, and a number of floating-point formats (required for HDR).

 

nVidia has supported SM3.0 since the 6-series, but ATi only introduced support for it in the X1x00 series. Most game devs do not want to leave users of the previous generation of high-end ATi cards in the dark, and therefore center their engines around SM2.0, with a couple of SM3.0 features here and there, optionally enabled when a capable card is detected.
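That detection step boils down to comparing the pixel shader version a card reports against a required minimum. Here's a rough Python sketch of the decision; the version encoding mirrors the `D3DPS_VERSION` macro from d3d9.h, but the function names and path labels are made up for illustration, not taken from any real engine:

```python
def ps_version(major, minor):
    """Pack a pixel-shader version number the way d3d9.h's D3DPS_VERSION macro does."""
    return 0xFFFF0000 | (major << 8) | minor

def pick_shader_path(reported):
    """Choose the richest shader path the reported pixel-shader version allows."""
    if reported >= ps_version(3, 0):
        return "SM3.0 path (dynamic branching, long shaders, FP formats)"
    if reported >= ps_version(2, 0):
        return "SM2.0 fallback path"
    return "fixed-function / SM1.x fallback"

# A GeForce 6/7 or Radeon X1x00 reports PS 3.0; a Radeon 9700/X800 reports PS 2.0.
print(pick_shader_path(ps_version(3, 0)))
print(pick_shader_path(ps_version(2, 0)))
```

In a real D3D9 engine the `reported` value would come from `IDirect3D9::GetDeviceCaps` (the `PixelShaderVersion` field of `D3DCAPS9`); the point is just that the SM3.0 path is an opt-in on top of a universal SM2.0 baseline.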

 

Bottom line? An SM2.0 game will probably run a wee bit faster on NV GPUs, as will an SM3.0 game that only uses a couple of its features. An engine designed specifically around SM3.0, with heavy use of dynamic branching and long shaders, will whoop a 7900's butt. However, the chances that any such engine will ever see the light of day are quite slim, since DX10 and SM4.0 are right around the corner.


I went just a smidge over budget. However, the computer itself was within $100 of the limit. I counted the speakers, the monitor, and the add-on software as a different category. Jim, I'm still good for that cup of coffee while I'm up north. :D

 

Sunbeam Transformer IC-TR-B Blue Steel ATX Full Tower Computer Case

 

SeaSonic S12-600 ATX12V 600W Power Supply

 

ZALMAN CNPS9500 LED 92mm 2 Ball Blue LED Light Cooling Fan with Heatsink

 

Logitech Z-2300 200 watts RMS 2.1 Speaker

 

ASUS EN7900GTX/2DHTV/512M Geforce 7900GTX 512MB GDDR3 PCI Express x16 Video Card

 

ASUS A8N32-SLI Deluxe Socket 939 NVIDIA nForce SPP 100 ATX AMD Motherboard

 

AMD Athlon 64 X2 4400+ Toledo 2000MHz HT Socket 939 Dual Core Processor Model ADA4400CDBOX

 

ASUS 16X DVD


Fionavar's Holiday Wishes to all members of our online community:  Happy Holidays

 

Join the revelry at the Obsidian Plays channel:
Obsidian Plays


 
Remembering tarna, Phosphor, Metadigital, and Visceris.  Drink mead heartily in the halls of Valhalla, my friends!

So, is the X1900 faster than the GeForce 7900?

Even though this is a little late (blame Space Rangers 2's Dominators :"> ) I would say that the nVIDIA SLi solution is more stable and usable than ATi's CrossFire (there were plans to enable cards of different types to work together on both platforms, but only nVIDIA has delivered so far, IIRC); so your purchase is arguably the more future-proof one. In other words, when the nVIDIA 8700XXXTBXXX eXtremeXed comes out in a few months, you should be able to pair it with this card in SLi.

 

:p


OBSCVRVM PER OBSCVRIVS ET IGNOTVM PER IGNOTIVS


OPVS ARTIFICEM PROBAT

Interesting interview summary on the PC Format editor's blog (I have to find the link ... their site is a mess :angry: ). When talking to ATi, the reason they gave for being so delayed in implementing SM3 was that (allegedly) the nVidia implementation falls short of the spec; furthermore, because of nVidia's rushed implementation and their subsequent support for devs, devs now think that SM3 is a high-cost feature and are less inclined to incorporate it into the core graphics engine tech ... at present.

 

In other words, ATi's SM3 is allegedly cheaper on both compute and developer resources, and so ends up better implemented in games.

Will NWN2 use DX9 features?

I believe this is true: the devs have made it pretty clear that they spent a lot of resources on a new graphics engine with lots of fog and water and sky effects, etc.

 

Also, according to the same editor, ATi are enabling their GPUs to do PhysX-like calculations.

Far better, according to ATI's Richard Huddy, to spend that cash on a second graphics card capable of the huge amounts of parallel processing required to run physics routines, but which still can also do graphics too. According to Richard, a X1900XTX is capable of around 550 gigaflops, compared to an estimated 20-50 Gflops on an Ageia PhysX PPU. Both of which are far higher than even the fastest multicore CPUs at the moment. Even down at the lower end of the performance pecking order, the rule of thumb for developers should already be that moving physics processes off the CPU to the GPU is the sensible thing to do - the framerate performance hit should only be about 2-5% compared to running physics on a dedicated processor.
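To put Huddy's numbers side by side, here's the back-of-the-envelope arithmetic; the figures are taken straight from his claims, so treat them as marketing numbers rather than measured results:

```python
# Figures as claimed in the interview -- marketing numbers, not benchmarks.
gpu_gflops = 550                           # claimed for an X1900XTX
ppu_gflops_low, ppu_gflops_high = 20, 50   # estimated range for the Ageia PhysX PPU

# Raw-throughput advantage of the GPU over the PPU, per the claim.
low = gpu_gflops / ppu_gflops_high    # 11x at the PPU's most optimistic estimate
high = gpu_gflops / ppu_gflops_low    # 27.5x at its most pessimistic

# The claimed cost of moving physics onto the GPU: a 2-5% framerate hit.
fps = 60
fps_after = (fps * 0.95, fps * 0.98)
print(f"GPU/PPU ratio: {low:g}x-{high:g}x; {fps}fps becomes {fps_after[0]:g}-{fps_after[1]:g}fps")
```

So even accepting the most generous PPU estimate, the claim is an order-of-magnitude throughput gap, which is the whole argument for GPU physics.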



You know, I was thinking that I'd just go for the single video card option but have SLI available just in case. On the other hand, if I can put the new soundcard in with the old one without being forced to use both at the same time, I'll probably stick with nvidia. It was a big jump for me to switch away from ATi, but I've had good luck with both and I'm not about to wed myself to brand loyalty. Still, it was weird switching.

 

On the other hand, I didn't want to be behind the 8-ball by making assumptions about the future. If I can stick the new DX10 card in with the 7900GTX, then I'm a new nvidia man for the life of this rig.



Symantec Norton Internet Security 2006

 

 

 

 

...oh, crud...why, Jebus, why???...ya didna install that Norton crapola in yer rig yet, did ya???...chalk full o' holes an' a resource hog like noone's business...ya'd 'ave been faaaaaaaar better off wit' Kaspersky, Trend Micro, etc...send it back, laddie; send the bugger back!!!... :luck:

 

 

...WHO LUVS YA, BABY!!...


A long, long time ago, but I can still remember,
How the Trolling used to make me smile.
And I knew if I had my chance, I could egg on a few Trolls to "dance",
And maybe we'd be happy for a while.
But then Krackhead left and so did Klown;
Volo and Turnip were banned, Mystake got run out o' town.
Bad news on the Front Page,
BIOweenia said goodbye in a heated rage.
I can't remember if I cried
When I heard that TORN was recently fried,
But sadness touched me deep inside,
The day...Black Isle died.


For tarna, Visc, an' the rest o' the ol' Islanders that fell along the way


 

Fwiiw, I have recently migrated all of my rigs and clients to Avast 4.7 Home Free Edition. Smaller footprint and much more manageable UI, imo. As well, there is the Professional Edition for those in the business sector.


The universe is change;
your life is what our thoughts make it
- Marcus Aurelius (161)

:dragon:


Just been reading the latest review of nVidia quad SLI and their hijinx.

 

Summary

  • Quad SLI is hit-and-miss.
  • Hit: FEAR. Can run a Dell 30" screen at its native 2560x1600 res in all its glory.
    That's it.
  • Miss: Any other game, even if it has a specific "quad SLI profile", will run worse than the dual SLI and even the 7900GTX singly.
  • Currently the only commercially available quads are two double PCBs, sandwiched together with a couple of holes cut in one to let the air into the HSF on the other. So it's expensive, noisy AND broken.

The 7900GTX is the first card that will take advantage of the new nVidia 590 SLI motherboard chipset (also just announced), which speeds up the two PCI-E x16 bus cards from 8GB/s to 10GB/s. nVidia have also announced "SLI-Ready Memory" that is faster than normal RAM (not as gimmicky as it sounds).
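For what it's worth, the per-slot numbers are easy to sanity-check: a first-generation PCI Express lane signals at 2.5 GT/s with 8b/10b encoding, which works out to 250 MB/s of payload per lane per direction. A quick sketch (the function name is mine, not from any spec):

```python
def pcie1_bandwidth_mb_s(lanes):
    """PCIe 1.x bandwidth per direction: 2.5 GT/s per lane with 8b/10b
    encoding leaves 2 Gbit/s of payload per lane, i.e. 250 MB/s."""
    mb_per_lane = 250  # (2.5e9 transfers/s * 8/10 data bits) / 8 bits per byte / 1e6
    return lanes * mb_per_lane

print(pcie1_bandwidth_mb_s(16))  # an x16 slot: 4000 MB/s (4 GB/s) per direction
print(pcie1_bandwidth_mb_s(8))   # an x8 slot: 2000 MB/s
```

That's why the dual-full-x16 chipsets are the headline feature: older SLI boards split one x16 link into two x8 slots, halving the per-card bandwidth.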

 

I'll post the review when I get a break against the Dominators. :lol:




Well, everything arrived and the HDD is 72% formatted. I have to leave and do other things, but I was happy to see that, with everything running, the CPU idles at 36C and the motherboard went from 36 up to 40C and stopped there. The case seems to cool pretty damned well so far.

 

I'll post pictures later.




I was wondering if any of you had any ideas about which wireless router and which wireless PCI network adapter or NIC I should go with, or maybe just which brand is one of the better ones to choose from. I have 2 home PCs around 50 ft away from each other on cable internet. I did see a Linksys router that was a cable modem and router together; would that be a better choice? I have been reading that 802.11b or g is the way to go, but I am really new to all this wireless stuff, so I hope someone can help me out with some info.


 

 

 

...I uses a Linksys router an' dropped a Linksys wireless NIC in the fiancee's machine an' things run smooth...give 'er a hook, I says... :lol:

 

 

...WHO LUVS YA, BABY!!...



... 802.11b or g is the way to go but I am really new to all this wireless stuff... 

802.11b is antique. You'll want a switch/router capable of 802.11g, or better still, 802.11a. Most of them are backwards-compatible, so an 802.11a-capable router will actually be labeled as an 802.11a/b/g device.
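The short version of the b/g/a comparison, for reference (nominal link rates and bands from the 802.11 amendments; real-world throughput is roughly half the nominal figure, and the dict/function names here are just for illustration):

```python
# Nominal maximum link rates and radio bands of the 802.11 amendments.
WIFI = {
    "802.11b": {"max_mbps": 11, "band_ghz": 2.4},
    "802.11g": {"max_mbps": 54, "band_ghz": 2.4},
    "802.11a": {"max_mbps": 54, "band_ghz": 5.0},
}

def same_band(a, b):
    """Two standards can interoperate directly only if they share a radio band."""
    return WIFI[a]["band_ghz"] == WIFI[b]["band_ghz"]

print(same_band("802.11g", "802.11b"))  # True: g radios fall back to b rates
print(same_band("802.11a", "802.11b"))  # False: a lives on the 5 GHz band
```

Which is why the tri-mode "a/b/g" routers carry two radios: the a side is a separate 5 GHz transceiver, not a mode of the 2.4 GHz one.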


Finally, everything is running. I'm particularly happy with the sound, although I'm not sure whether it's the card or the speakers or maybe some combination of the two.

 

As promised, pictures.

 

The new rig:

 

 

 

The mountain of madness:

 

 

 

It runs FEAR at max settings at 1600x1200 with a lowest FPS of 21. The test showed 5% of the time below 25fps, 75% at 25-40fps, and 20% over 40fps. When I kept the resolution at 1600x1200 and made some minor quality adjustments, I managed a lowest fps of 37. Nice card so far. :Eldar's sharing his gaming joy with his homies who had his back: Thanks for the advice, homies.
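FEAR's built-in benchmark reports exactly that kind of breakdown: it buckets the run into below-25, 25-40, and above-40 fps bands. A toy version of the bucketing, with made-up sample data chosen to roughly match the 5% / 75% / 20% split above:

```python
def fps_breakdown(samples):
    """Return the percentage of samples below 25, between 25 and 40, and above 40 fps."""
    bands = {"<25": 0, "25-40": 0, ">40": 0}
    for fps in samples:
        if fps < 25:
            bands["<25"] += 1
        elif fps <= 40:
            bands["25-40"] += 1
        else:
            bands[">40"] += 1
    total = len(samples)
    return {band: 100 * count / total for band, count in bands.items()}

# Made-up readings: 1 dip to 21fps, 15 readings around 30fps, 4 above 40fps.
samples = [21] + [30] * 15 + [45] * 4
print(fps_breakdown(samples))
```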




(w00t) (w00t) (w00t) (w00t) (w00t) (w00t) (w00t) (w00t) (w00t) (w00t) (w00t) (w00t) (w00t) (w00t) (w00t) (w00t) (w00t) (w00t) (w00t) (w00t)

