mkreku Posted May 1, 2006

Do girls like having sex with you, alanschu? :)
alanschu Posted May 1, 2006

No they don't
metadigital Posted May 2, 2006

Quoting an earlier post: "How long do you wish to wait? Intel's Conroe is coming up in a couple of months, and (I believe) it's worth waiting for. Bokishi's right, video cards are now going through a 'dark age'. Nvidia is trying to milk their aging architectures for all they are worth by mindlessly bumping up pipe counts and clock speeds, while ATi are fumbling with cool new architectures that are not quite mature enough to be implemented efficiently using current technologies."

I wouldn't exactly call it a "dark age": there is some quality hardware out there, and some of it is good value.

The problem is that games that take advantage of Shader Model 3 (like FEAR, Need for Speed: Most Wanted, and Oblivion) demand an exponential amount of processing power from the GPU. Pre-SM3, any nVidia 6x or 7x card could render 1600x1200 with ease (nVidia have announced that their SLi architecture will be able to process PhysX calculations, for example: fat chance of there being any spare power, though). Now even the top cards are incapable, singly, of rendering 1600x1200; this includes the ATi X1900XTX and the nVidia GeForce 7900GTX.

Even though ATi have the upper hand with their X1900XTX, AND CrossFire is now a viable option, nVidia are about to release their next generation of GPU (in the next couple of months). This will most likely have Shader Model 4 and be compatible with DirectX 10; and, although it will be ludicrously expensive, not very good value for money, and you won't buy it anyhow, it will push the price of other cards down ... so maybe you could buy an SLi or CrossFire combo rig instead.

Also, currently the nVidia SLi and ATi CrossFire strategies are incompatible with each other, so if you want to change your dual-graphics solution from one to the other (in the medium term) you will need a mobo upgrade too.

Thirdly, even though AMD have the upper hand with their 64-bit CPUs, they are still using a 110nm die, whereas the new Intel Conroe are about to hit economies of scale with their new 90nm die manufacturing process (a smaller process means lower power at the same frequency, or higher speeds for the same power consumption/heat dissipation). The Conroe will be as fast as the Pentium M (but not as stupidly expensive) AND dual-core, which Windows Vista will take advantage of with its 64-bit application space.

If I haven't scared you off with the above, then I direct you to my previous thread: Build a pc for peanuts
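To put rough numbers on that "exponential amount of processing power" claim, here is a back-of-the-envelope sketch. The per-pixel shader counts, overdraw factor, and target frame rate below are illustrative assumptions, not measured figures for any particular card or game:

```python
# Back-of-the-envelope shader workload at 1600x1200.
# Every figure here is an illustrative assumption, not a measurement.

resolution = 1600 * 1200          # pixels per frame
target_fps = 60                   # desired frame rate
overdraw = 2.5                    # assumed average number of times each pixel is shaded
sm2_ops_per_pixel = 30            # assumed ops for a short SM2-era shader
sm3_ops_per_pixel = 150           # assumed ops for long SM3 shaders (HDR, soft shadows, parallax)

def shader_ops_per_second(ops_per_pixel):
    """Rough shader workload in operations per second."""
    return resolution * overdraw * ops_per_pixel * target_fps

sm2_load = shader_ops_per_second(sm2_ops_per_pixel)
sm3_load = shader_ops_per_second(sm3_ops_per_pixel)

print(f"SM2-era load: {sm2_load / 1e9:.1f}G shader ops/s")
print(f"SM3-era load: {sm3_load / 1e9:.1f}G shader ops/s")
print(f"Increase at the same resolution: {sm3_load / sm2_load:.0f}x")
```

Even with generous assumptions, moving from short SM2 shaders to long SM3 shaders multiplies the shader workload several times over at the same resolution, which is why 1600x1200 went from easy to out of reach within a single hardware generation.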
Fenghuang Posted May 2, 2006

Not scared, eating it up!
metadigital Posted May 2, 2006

How much do you want to spend? The Radeon X1800XT, since the X1900XTX came out, has dropped in price to about ...
alanschu Posted May 2, 2006

I'm probably going to be stupid and get the X1900 XTX when I upgrade my computer. I mean, 3 X's in the title... it must r0x0r!!!
metadigital Posted May 2, 2006

It is the best card at the moment ... and it can run in CrossFire! (w00t) Then again, I will probably buy it when the next gen cards are released, so that it drops down to about ...
angshuman Posted May 2, 2006

Quoting metadigital: "The problem is that games that take advantage of Shader Model 3 (like FEAR, Need for Speed: Most Wanted, and Oblivion) demand an exponential amount of processing power from the GPU. Pre-SM3, any nVidia 6x or 7x card could render 1600x1200 with ease (nVidia have announced that their SLi architecture will be able to process PhysX calculations, for example: fat chance of there being any spare power, though). Now even the top cards are incapable, singly, of rendering 1600x1200; this includes the ATi X1900XTX and the nVidia GeForce 7900GTX. Even though ATi have the upper hand with their X1900XTX, AND CrossFire is now a viable option, nVidia are about to release their next generation of GPU (in the next couple of months). This will most likely have Shader Model 4 and be compatible with DirectX 10; and, although it will be ludicrously expensive, not very good value for money, and you won't buy it anyhow, it will push the price of other cards down ... so maybe you could buy an SLi or CrossFire combo rig instead."

Fair enough. What bugs me is the fact that a couple of years back, if you wanted to play the latest games at 1600x1200, all you had to do was buy the most expensive (e.g. $400 for the Ti4600) video card. Today, if you want to play Tomb Raider, a dual-7900GTX setup worth $1200 won't give you a flawless experience. A year from now, game developers will be developing games with quad-SLI in mind. The thing is, if a quad-SLI setup cost $500, I would have no qualms.

I think this SLI game is ridiculous. I don't care whether I'm using 1 card or 2 cards or 15 cards. I want a graphics solution that enables me to play the latest games at the highest resolutions. How much do I have to pay for that? Over the past several years, the answers to this question have fallen on a linear-ish curve. With the coming of SLI, it's suddenly starting to go exponential. $200 - $300 - $400 - $500 - $600 - $1200 - $2500... WTF?!??

Quoting metadigital: "Thirdly, even though AMD have the upper hand with their 64-bit CPUs, they are still using a 110nm die, whereas the new Intel Conroe are about to hit economies of scale with their new 90nm die manufacturing process (a smaller process means lower power at the same frequency, or higher speeds for the same power consumption/heat dissipation). The Conroe will be as fast as the Pentium M (but not as stupidly expensive) AND dual-core, which Windows Vista will take advantage of with its 64-bit application space."

I believe all AMD and Intel processors are currently fabbed on 90nm. Conroe will be 65nm. But yes, your point is still valid: the technology shrink will enable it to be clocked higher. Also, Conroe has a bunch of non-trivial architectural improvements over the Pentium M, which make it faster clock-for-clock, especially for floating-point and SSE applications. And it's going to be plugged into a dual-channel memory system (hopefully with an improved controller). Overall, Conroe is a significant evolutionary step up from the Pentium M.
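A quick way to see the point about the cost curve: on a linear curve the successive differences stay roughly constant, while on an exponential one the successive ratios do. A minimal check against the price list above (the last couple of figures are a projection in the post, not market data):

```python
# Price points quoted above for a "plays everything at the highest resolution"
# graphics solution over successive upgrades; the last figures are a projection.
prices = [200, 300, 400, 500, 600, 1200, 2500]

# Linear growth: constant differences. Exponential growth: constant ratios.
diffs = [b - a for a, b in zip(prices, prices[1:])]
ratios = [round(b / a, 2) for a, b in zip(prices, prices[1:])]

print("differences:", diffs)   # [100, 100, 100, 100, 600, 1300]
print("ratios:     ", ratios)  # [1.5, 1.33, 1.25, 1.2, 2.0, 2.08]
```

The jump from steady $100 steps to back-to-back doublings is exactly the "linear-ish to exponential" break being complained about.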
alanschu Posted May 2, 2006

Quoting metadigital: "It is the best card at the moment ... and it can run in CrossFire! (w00t) Then again, I will probably buy it when the next gen cards are released, so that it drops down to about ..."
Fenghuang Posted May 2, 2006

Maybe it runs CrossFire even better? Anyhow, at wholesale prices I could probably afford two of those by the time they go down in price. Hmm... A lot of this will depend on whether Calax can scramble the money to make it to Comic Con with me. If he doesn't, I'll have an extra $300 or so lying around.
mkreku Posted May 2, 2006

And $300 is what the PhysX PPU is going to cost us all..
Sargallath Abraxium Posted May 2, 2006

...unless you plan on runnin' games like Oblivion an' such at 1600x1200, ye'd be wastin' yer money goin' fer the X1900XTX, or e'en the nVidia 7900GTX... my Sapphire X800GTO2 softmodded (all 16 pipelines enabled) an' OCed ta a li'l o'er X850XT PE speeds (575 core, 615 memory) runs Oblivion just fine at 1280x960 on me ol' ViewSonic G90fB... if I was ye, I'd jus' grab an AMD dual core (if ya can afford the 4400 X2, go fer it; but the 3800 X2 be fine as well), a 7900GT on an ASUS A8N-E mobo wit' 2 gigs o' ram (OCZ be me preference there) an' let the good times roll... that is unless ya wants the "top o' the line new tech"; I canna really see Conroe or the next gen AMD offerin' makin' too much o' a difference right now, ta be perfectly honest... today's dual core processors should be good ta go fer a few years anyways...
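Rough arithmetic behind that softmod suggestion, assuming the X800 GTO2 ships with 12 pixel pipelines at around a 400 MHz core clock (those stock figures are assumptions here; the 16-pipe unlock and 575 MHz core are the numbers quoted in the post):

```python
# Theoretical pixel fill rate = enabled pipelines * core clock.
# Stock figures are assumptions for illustration; the modded figures
# are the ones quoted in the post (16 pipes, 575 MHz core).

stock_pipes, stock_core_mhz = 12, 400     # assumed X800 GTO2 out of the box
modded_pipes, modded_core_mhz = 16, 575   # softmodded + overclocked

stock_fill = stock_pipes * stock_core_mhz     # Mpixels/s, theoretical peak
modded_fill = modded_pipes * modded_core_mhz

print(f"stock:  {stock_fill} Mpixels/s")
print(f"modded: {modded_fill} Mpixels/s ({modded_fill / stock_fill:.1f}x)")
```

On paper that is nearly double the theoretical fill rate for the price of a mid-range card, which is why a top-end X1900XTX looks like overkill at 1280x960.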
Fenghuang Posted May 2, 2006

Everything else is behind the CPU tech anyway; it'd be stupid to buy fancy shiny top-of-the-line processors. I dunno, if I do buy dual video cards it'll be after newer stuff comes out. I'm strictly in the planning stage / catching up on the tech specs I've been neglecting right now.
alanschu Posted May 2, 2006

Oh Sargy, I know it'd be a waste of money. Hence why I said I'd be stupid. But 3 X's in the name OMG r0x0r speed!
Sargallath Abraxium Posted May 2, 2006

...well, then, let's hope that NWN2 doesna 'ave the same problems wit' Radeon gpu's as the original NeverWorkin'Right does...
alanschu Posted May 2, 2006

Hmmm, I don't remember problems, though I didn't pick up NWN right away. I know it's OpenGL, which is nVidia's thing... but I believe NWN2 shifted to DX9.
Dark_Raven Posted May 2, 2006

ATI is still fine with me, but I am still going with GeForce in my next card.
alanschu Posted May 2, 2006

Still undecided. Looking for bestest bang for my buck.
angshuman Posted May 3, 2006

Quoting alanschu: "Still undecided. Looking for bestest bang for my buck."

Three X's man, three X's... can't go wrong. (w00t)
alanschu Posted May 3, 2006

(w00t)
Fenghuang Posted May 3, 2006

THREE MOTHERF**KIN' X'S!!!
alanschu Posted May 3, 2006

OMG I'm totally going to wastz0rs my money on it! Though a bit of me would like to. I've been a fairly hardcore computer geek for quite some time, and have never owned a "top of the line" card.
metadigital Posted May 3, 2006

Quoting angshuman: "What bugs me is the fact that a couple of years back, if you wanted to play the latest games at 1600x1200, all you had to do was buy the most expensive video card ... With the coming of SLI, it's suddenly starting to go exponential. $200 - $300 - $400 - $500 - $600 - $1200 - $2500... WTF?!??"

Yeah, what the industry needs (if the nascent multi-card niche is to survive past its crib life) is an interchangeable standard for SLi / CrossFire. That will bring down prices, and push up standards, when both gorillas are on the same playing field.

Quoting angshuman: "I believe all AMD and Intel processors are currently fabbed on 90nm. Conroe will be 65nm ... Overall, Conroe is a significant evolutionary step up from the Pentium M."

D'Oh! You are correct: 65nm, not 90nm. That'll teach me for not triple-checking. Thanks.
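To make the die-shrink argument concrete, here is a hedged sketch of the standard dynamic-power relation P ≈ C·V²·f that underlies it. The 90nm-to-65nm scaling factors below are generic, textbook-style assumptions, not Intel's actual process numbers:

```python
# Dynamic-power sketch behind the die-shrink argument:
#   P ≈ C * V^2 * f   (switched capacitance, supply voltage, clock frequency)
# Scaling factors are illustrative assumptions, not real 90nm -> 65nm data.

def dynamic_power(c, v, f):
    """Dynamic power of a CPU core, up to a constant activity factor."""
    return c * v**2 * f

# Arbitrary baseline at the older process node.
c_old, v_old, f_old = 1.0, 1.3, 2.0e9      # normalized capacitance, volts, Hz
p_old = dynamic_power(c_old, v_old, f_old)

# Assume the shrink cuts switched capacitance ~30% and supply voltage ~8%.
c_new, v_new = 0.7 * c_old, 0.92 * v_old

# Same clock -> lower power; or raise the clock until power matches the old part.
p_same_clock = dynamic_power(c_new, v_new, f_old)
f_same_power = f_old * p_old / p_same_clock

print(f"power at the same clock: {p_same_clock / p_old:.0%} of the old part")
print(f"clock at the same power: {f_same_power / 1e9:.2f} GHz vs {f_old / 1e9:.1f} GHz")
```

Under those assumptions the shrunk part either runs the same clock at roughly 60% of the power, or clocks roughly 70% higher in the same power envelope, which is the "higher speeds for the same power consumption/heat dissipation" point in a nutshell.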
mkreku Posted May 3, 2006

You guys should get the XFX X1900XTX XXX eXtreme!! ..but alas, XFX only makes nvidia cards..
angshuman Posted May 3, 2006

Quoting mkreku: "You guys should get the XFX X1900XTX XXX eXtreme!! ..but alas, XFX only makes nvidia cards.."

Here's nVidia's reply to that GPU. Credit: Some random guy.