
Posted (edited)
Going to the HD too often? As soon as I start a program, I go to the HD; every level loaded in a game as well. And every time I open or save a file, or unrar stuff, etc.

Saving a file doesn't count, since any non-brain-damaged OS will stream it via the file cache. The other cases you mentioned are rare enough that I'd rather spend the extra $100 on my graphics card or CPU. Of course, if you already have a top-end system, then you might want to spend the extra cash on the HDD, but if I wanted a really top-end storage setup (non-RAID), I'd probably pair up a small high-end SCSI drive with a huge low-end IDE one. Different strokes. :)
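
To illustrate the file-cache point: on any modern OS, a plain write() normally returns as soon as the data has been copied into the kernel's cache, and the physical disk write happens later. A minimal POSIX sketch of that behaviour (the filename is made up, and this is an illustration, not anyone's actual code):

/* Minimal POSIX sketch (hypothetical filename "demo.bin"): write() returns
 * once the data is in the OS file cache; it only has to reach the platter
 * when you force it with fsync(). This is why saving a file rarely makes
 * you wait on the disk. */
#include <fcntl.h>
#include <unistd.h>

int main(void)
{
    char buf[4096] = {0};
    int fd = open("demo.bin", O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (fd < 0)
        return 1;

    write(fd, buf, sizeof buf);   /* lands in the file cache, returns fast */
    fsync(fd);                    /* only now is it guaranteed on the disk */
    close(fd);
    return 0;
}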

 

ATI Radeon X1900 XTX (I couldn't say no to 3 X's.  It's the latent porn addict in me!!!  Oh, this was my stupid purchase :()

Yay! (w00t) Can't describe the feeling when you power up your machine, knowing that the best graphics card money can buy is powering it. (w00t) (w00t) (w00t) You won't regret it!

Edited by angshuman
Posted

Well, maybe I should just get the current card I've chosen, bypass the 7950, and then get the DX10 card after it's been out for a while. The next generation should still work in my PCI-E x16 slot, right?


Posted (edited)

I feel PCI-E will be around for a while yet. It's still new, and it's not in the interests of graphics chip makers or motherboard makers to define a new standard this early in PCI-E's lifecycle (unless, of course, the new standard sucked and was broken... which I don't think PCI-E is).

Yay! (w00t) Can't describe the feeling when you power up your machine, knowing that the best graphics card money can buy is powering it. (w00t)(w00t)(w00t) You won't regret it!

 

I'm sure I'll regret it when some unexpected expense comes up :(

Edited by alanschu
Posted

My finger is hovering over the "order" button right now.


Posted (edited)

Woo

EDIT: It seems that in my haste, though, I accidentally picked up the 5-CD version of FEAR rather than the DVD version (assuming there is one).

Edited by alanschu
Posted (edited)

Too late.

 

EDIT: Downloading patch because I'm dumb and didn't do it while installing.

Edited by alanschu
Posted

So, is the X1900 faster than the GeForce 7900?

 

Congrats, BTW.


Posted

With everything set to max, 8xAF and 4xAA at 1024x768, the time trial for FEAR gave me a minimum of 34 FPS, an average of 76, and a maximum of 192.

Posted
So, is the X1900 faster than the GeForce 7900?

That's a toughie. The X1900 has more shader power and way superior dynamic branching. It also lets you use AA and HDR at the same time. It's targeted at more "modern", DX9-ish games. Thing is, very few games today make heavy use of dynamic branching (or the other advanced DX9 features). The 7900 has more brute force and therefore scores slightly better in most games out today. It's still not a huge difference.
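
For the curious: "dynamic branching" just means the GPU can take a real per-pixel if at run time and skip the work on the untaken side, instead of effectively evaluating both sides and picking one. A rough C sketch of the idea (all names made up; this is an illustration of the concept, not actual shader code):

/* Why dynamic branching matters: the expensive lighting path runs only for
 * pixels that pass the test. Hardware with poor dynamic branching evaluates
 * both paths and selects a result, so the early-out saves it nothing. */
#include <math.h>

float shade_pixel(float n_dot_l, float shadow)
{
    if (shadow < 0.01f)             /* fully shadowed: cheap early-out */
        return 0.0f;                /* all the lighting math is skipped */

    /* expensive path: diffuse + specular terms */
    float diffuse  = n_dot_l > 0.0f ? n_dot_l : 0.0f;
    float specular = powf(diffuse, 32.0f);
    return shadow * (diffuse + 0.5f * specular);
}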

Posted

So, should I really be worried about getting a card that doesn't support DX10 when folks haven't even caught up with DX9? Maybe I should switch to the X1900 XTX. Thoughts, please.

 

Dammit, I hate dithering over a purchase, but if the two cards are essentially even on most games right now, and the X1900 has more potential with folks working to meet that potential, then I'm going ATi. I've been using ATi cards for years now anyhow.

 

Will NWN2 use DX9 features?


Posted
What about 1600 * 1200 benchmarks?

I'll give them a whirl. I don't typically play at 1600x1200, though, since my monitor's refresh rate drops to 60 Hz at that resolution, though that's less of an issue in a game.
