
Alright nerds.



Do girls like having sex with you, alanschu? :()

Edited by mkreku

Swedes, go to: Spel2, for the latest game reviews in swedish!


How long do you wish to wait? Intel's Conroe is coming up in a couple of months, and (I believe) it's worth waiting for.

 

Bokishi's right, video cards are now going through a "dark age". Nvidia is trying to milk their aging architectures for all they are worth by mindlessly bumping up pipe counts and clock speeds, while ATi are fumbling with cool new architectures that are not quite mature enough to be implemented efficiently using current technologies.

I wouldn't exactly call it a "dark age", there is some quality hardware out there, and some of it is good value.

 

The problem is that games that take advantage of Shader Model 3 (FEAR, Need for Speed: Most Wanted, and Oblivion, for example) demand exponentially more processing power from the GPU. Pre-SM3, any nVidia 6x or 7x card could render 1600x1200 with ease (nVidia have announced that their SLi architecture will be able to process PhysX calculations, for example: fat chance of there being any spare power, though). Now, even the TOP cards are incapable (singly) of rendering 1600x1200; this includes the ATi X1900XTX and the nVidia GeForce 7900GTX.

 

Even though ATi have the upper hand with their X1900XTX, AND CrossFire is now a viable option, nVidia are about to release their next generation of GPUs (in the next couple of months). This will (most likely) support Shader Model 4 and be compatible with DirectX 10; and although it will be ludicrously expensive, poor value for money, and something you won't buy anyhow, it will push the prices of other cards down ... so maybe you could buy an SLi or CrossFire combo rig instead ...

 

Also, currently the nVidia SLi and ATi CrossFire strategies are incompatible with each other, so if you want to change your dual-graphics solution from one to the other (in the medium term) you will need a mobo upgrade, too.

 

Thirdly, even though AMD have the upper hand with their 64-bit CPUs, they are still using a 110nm die, whereas the new Intel Conroes are about to hit economies of scale with their new 90nm manufacturing process (a smaller process means lower power at the same frequency, and higher clock speeds at the same power consumption/heat dissipation). The Conroe will be as fast as the Pentium M (but not as stupidly expensive) AND dual-core: which Windows Vista will take advantage of with its 64-bit application space.

 

If I haven't scared you off with the above, then I direct you to my previous thread:

Build a pc for peanuts

OBSCVRVM PER OBSCVRIVS ET IGNOTVM PER IGNOTIVS


OPVS ARTIFICEM PROBAT


It is the best card at the moment ... and it can run in CrossFire! (w00t)

 

Then again, I will probably buy it when the next gen cards are released, so that it drops down to about


The problem is that games that take advantage of Shader Model 3 (FEAR, Need for Speed: Most Wanted, and Oblivion, for example) demand exponentially more processing power from the GPU. Pre-SM3, any nVidia 6x or 7x card could render 1600x1200 with ease (nVidia have announced that their SLi architecture will be able to process PhysX calculations, for example: fat chance of there being any spare power, though). Now, even the TOP cards are incapable (singly) of rendering 1600x1200; this includes the ATi X1900XTX and the nVidia GeForce 7900GTX.

 

Even though ATi have the upper hand with their X1900XTX, AND CrossFire is now a viable option, nVidia are about to release their next generation of GPUs (in the next couple of months). This will (most likely) support Shader Model 4 and be compatible with DirectX 10; and although it will be ludicrously expensive, poor value for money, and something you won't buy anyhow, it will push the prices of other cards down ... so maybe you could buy an SLi or CrossFire combo rig instead ...

Fair enough. What bugs me is that a couple of years back, if you wanted to play the latest games at 1600x1200, all you had to do was buy the most expensive video card (e.g. $400 for the Ti4600). Today, if you want to play Tomb Raider, a dual-7900GTX setup worth $1200 won't give you a flawless experience. A year from now, game developers will be designing games with quad-SLI in mind. The thing is, if a quad-SLI setup cost $500, I'd have no qualms.

 

I think this SLI game is ridiculous. I don't care whether I'm using 1 card or 2 cards or 15 cards. I want a graphics solution that enables me to play the latest games at the highest resolutions. How much do I have to pay for that? Over the past several years, the answers to this question have fallen on a linear-ish curve. With the coming of SLI, it's suddenly starting to go exponential. $200 - $300 - $400 - $500 - $600 - $1200 - $2500... WTF?!??
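You can see exactly where the curve breaks by computing the step-to-step change in that price sequence (a throwaway sketch; the dollar figures are just the rough numbers quoted above, not real market data):

```python
# Flagship-price sequence quoted above (poster's rough figures).
prices = [200, 300, 400, 500, 600, 1200, 2500]

# Print each step's absolute increase and growth ratio.
for prev, cur in zip(prices, prices[1:]):
    print(f"${prev} -> ${cur}: +${cur - prev} ({cur / prev:.2f}x)")
# The first four steps add a flat $100 each (linear-ish);
# the last two roughly double the price each time (exponential-ish).
```

The flat-$100 steps are the old single-card era; the 2x jumps are where dual- and quad-card setups enter the sequence.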

 

Thirdly, even though AMD have the upper hand with their 64-bit CPUs, they are still using a 110nm die, whereas the new Intel Conroes are about to hit economies of scale with their new 90nm manufacturing process (a smaller process means lower power at the same frequency, and higher clock speeds at the same power consumption/heat dissipation). The Conroe will be as fast as the Pentium M (but not as stupidly expensive) AND dual-core: which Windows Vista will take advantage of with its 64-bit application space.

I believe all AMD and Intel processors are currently fabbed on 90nm; Conroe will be 65nm. But yes, your point is still valid: the technology shrink will enable it to be clocked higher. Also, Conroe has a bunch of non-trivial architectural improvements over the Pentium M, which make it faster clock-for-clock, especially for floating-point and SSE applications. And it's going to be plugged into a dual-channel memory system (hopefully with an improved controller). Overall, Conroe is a significant evolutionary step up from the Pentium M.
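For a rough sense of why the 90nm to 65nm shrink matters, here's a back-of-the-envelope sketch (idealized geometric scaling only; real processes never achieve the full theoretical gain):

```python
# Idealized process-shrink arithmetic: a 90nm -> 65nm shrink scales every
# linear dimension by 65/90, so the same design occupies (65/90)^2 of the
# original area. Real-world gains are smaller, but this is the economics
# behind the shrink: smaller dies mean more chips per wafer.
old_nm, new_nm = 90, 65

linear_scale = new_nm / old_nm      # ~0.72: each dimension shrinks
area_scale = linear_scale ** 2      # ~0.52: same chip, about half the area

print(f"linear scale: {linear_scale:.2f}")
print(f"area scale:   {area_scale:.2f}  (roughly 2x dies per wafer)")
```

Halving the die area is what lets Intel hit the economies of scale mentioned above, and the reduced switching capacitance is what buys the clock-speed headroom.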


Maybe it runs crossfire even better?

 

Anyhow, at wholesale prices I could probably afford two of those by the time they go down in price. Hmm...

 

A lot of this will depend on whether Calax can scramble the money to make it to Comic Con with me. If he doesn't, I'll have an extra $300 or so lying around.


RIP


...unless you plan on runnin' games like Oblivion an' such at 1600x1200, ye'd be wastin' yer money goin' fer the X1900XTX, or e'en the nVidia 7900GTX...my Sapphire X800GTO2 softmodded (all 16 pipelines enabled) an OCed ta a li'l o'er X850XT PE speeds (575 core, 615 memory) runs Oblivion just fine at 1280x960 on me ol' ViewSonic G90fB...if I was ye, I'd jus' grab an AMD dual core (if ya can afford the 4400 X2, go fer it; but the 3800 X2 be fine as well) a 7900GT on an ASUS A8N-E mobo wit' 2 gigs o' ram (OCZ be me preference there) an' let the good times roll...that is unless ya wants the "top o' the line new tech"; I canna really see Conroe or the next gen AMD offerin' makin' too much o' a difference right now, ta be perfectly honest...today's dual core processors should be good ta go fer a few years anyways... :p

 

 

...WHO LUVS YA, BABY!!...

A long, long time ago, but I can still remember,
How the Trolling used to make me smile.
And I knew if I had my chance, I could egg on a few Trolls to "dance",
And maybe we'd be happy for a while.
But then Krackhead left and so did Klown;
Volo and Turnip were banned, Mystake got run out o' town.
Bad news on the Front Page,
BIOweenia said goodbye in a heated rage.
I can't remember if I cried
When I heard that TORN was recently fried,
But sadness touched me deep inside,
The day...Black Isle died.


For tarna, Visc, an' the rest o' the ol' Islanders that fell along the way


Everything else is behind the CPU tech anyway; it'd be stupid to buy a fancy, shiny, top-of-the-line processor.

 

I dunno; if I do buy dual video cards, it'll be after newer stuff comes out. I'm strictly in the planning stage / catching up on the tech specs I've been neglecting right now.


...well, then, let's hope that NWN2 doesna 'ave the same problems wit' Radeon gpu's as the original NeverWorkin'Right does... :p

 

 


OMG I'm totally going to wastz0rs my money on it!

 

 

Though a bit of me would like to. I've been a fairly hardcore computer geek for quite some time, and have never owned a "top of the line" card :)

Edited by alanschu

The problem is that games that take advantage of Shader Model 3 (FEAR, Need for Speed: Most Wanted, and Oblivion, for example) demand exponentially more processing power from the GPU. Pre-SM3, any nVidia 6x or 7x card could render 1600x1200 with ease (nVidia have announced that their SLi architecture will be able to process PhysX calculations, for example: fat chance of there being any spare power, though). Now, even the TOP cards are incapable (singly) of rendering 1600x1200; this includes the ATi X1900XTX and the nVidia GeForce 7900GTX.

 

Even though ATi have the upper hand with their X1900XTX, AND CrossFire is now a viable option, nVidia are about to release their next generation of GPUs (in the next couple of months). This will (most likely) support Shader Model 4 and be compatible with DirectX 10; and although it will be ludicrously expensive, poor value for money, and something you won't buy anyhow, it will push the prices of other cards down ... so maybe you could buy an SLi or CrossFire combo rig instead ...

Fair enough. What bugs me is that a couple of years back, if you wanted to play the latest games at 1600x1200, all you had to do was buy the most expensive video card (e.g. $400 for the Ti4600). Today, if you want to play Tomb Raider, a dual-7900GTX setup worth $1200 won't give you a flawless experience. A year from now, game developers will be designing games with quad-SLI in mind. The thing is, if a quad-SLI setup cost $500, I'd have no qualms.

 

I think this SLI game is ridiculous. I don't care whether I'm using 1 card or 2 cards or 15 cards. I want a graphics solution that enables me to play the latest games at the highest resolutions. How much do I have to pay for that? Over the past several years, the answers to this question have fallen on a linear-ish curve. With the coming of SLI, it's suddenly starting to go exponential. $200 - $300 - $400 - $500 - $600 - $1200 - $2500... WTF?!??

Yeah, what the industry needs (if the nascent multi-card niche is to survive past its infancy) is an interchangeable standard for SLi / CrossFire. That would bring prices down and push standards up once both gorillas are on the same playing field.

Thirdly, even though AMD have the upper hand with their 64-bit CPUs, they are still using a 110nm die, whereas the new Intel Conroes are about to hit economies of scale with their new 90nm manufacturing process (a smaller process means lower power at the same frequency, and higher clock speeds at the same power consumption/heat dissipation). The Conroe will be as fast as the Pentium M (but not as stupidly expensive) AND dual-core: which Windows Vista will take advantage of with its 64-bit application space.

I believe all AMD and Intel processors are currently fabbed on 90nm; Conroe will be 65nm. But yes, your point is still valid: the technology shrink will enable it to be clocked higher. Also, Conroe has a bunch of non-trivial architectural improvements over the Pentium M, which make it faster clock-for-clock, especially for floating-point and SSE applications. And it's going to be plugged into a dual-channel memory system (hopefully with an improved controller). Overall, Conroe is a significant evolutionary step up from the Pentium M.

:- D'Oh!

 

You are correct: 65nm not 90nm. That'll teach me for not triple-checking.

 

Thanks. :)

 

It is the best card at the moment ... and it can run in CrossFire! (w00t)

 

Then again, I will probably buy it when the next gen cards are released, so that it drops down to about


You guys should get the XFX X1900XTX XXX eXtreme!!

..but alas, XFX only makes nvidia cards..


