Bokishi Posted December 20, 2007 8800 GTX is 384-bit, i think it's good No. 768MB. if ur trying to impress me with your grafix knowledge then u better try a little harder A nerd like you should be going for the 768MB GeForce 8800 Ultra. Thats the uber graphics card. At least you know what teh uber card is. And no, you can't call me a nerd without my permission Current 3DMark
angshuman Posted December 20, 2007 If you really don't even want to get into fillrates, but want a single number (based on specs, not benchmark runs) to compare and get an instant idea, then based on the way GPUs and games have evolved in the past few years, I'd say you want to look at one metric: memory bandwidth. Simply multiply the bit-width of the memory bus (e.g., 384 bits for the 8800GTX) by the memory clock (e.g., 1800MHz for the GTX) and you'll have a pretty good idea of the market bracket the card falls into. The disappointment that was the ATI Radeon 2900XT has a 512-bit interface and memory clocked at 1.65GHz. Its bandwidth beats that of the 8800GTX. On paper that GPU should have been a beast. I still don't understand what holds it back. Optimized TWIMTBP code. if (gpu.vendorID == "ATi") sleep(1000000);
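To put numbers on the back-of-the-envelope metric described above, here is a minimal sketch (the function name is made up for illustration; the clocks are the effective data rates quoted in the post):

```python
def mem_bandwidth_gbs(bus_width_bits, mem_clock_mhz):
    """Peak memory bandwidth in GB/s: bus width (bits) times effective
    memory clock (MHz), divided by 8 bits/byte and 1000 MB/GB."""
    return bus_width_bits * mem_clock_mhz / 8 / 1000

# 8800 GTX: 384-bit bus, 1800 MHz effective memory clock
print(mem_bandwidth_gbs(384, 1800))   # 86.4 GB/s
# Radeon 2900 XT: 512-bit bus, 1650 MHz effective memory clock
print(mem_bandwidth_gbs(512, 1650))   # 105.6 GB/s
```

Which matches the post: on this one metric the 2900XT comes out ahead of the 8800GTX.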
Dark_Raven Posted December 20, 2007 8800 GTX is 384-bit, i think it's good No. 768MB. if ur trying to impress me with your grafix knowledge then u better try a little harder A nerd like you should be going for the 768MB GeForce 8800 Ultra. Thats the uber graphics card. At least you know what teh uber card is. And no, you can't call me a nerd without my permission Maybe I will get that bad boy when the prices drop. That's the problem with technology. Rich nerds go out and spend an ungodly sum of money on something that loses its value within 6-12 months because something awesome and better came out to replace it. Hades was the life of the party. RIP You'll be missed.
Bokishi Posted December 20, 2007 But you get the greatest "now", which is what you're paying for.
kirottu Posted December 21, 2007 8800 GTX is 384-bit, i think it's good No. 768MB. Wha... Are you joking or did you just make the noob mistake? This post is not to be enjoyed, discussed, or referenced on company time.
Bokishi Posted December 22, 2007 8800 GTX is 384-bit, i think it's good No. 768MB. Wha... Are you joking or did you just make the noob mistake? graphics n00b
WILL THE ALMIGHTY Posted December 22, 2007 384-bit... isn't that technically impossible? I'm a total newb, but I thought the best was 64-bit, and anything bigger is indecipherable... "Alright, I've been thinking. When life gives you lemons, don't make lemonade - make life take the lemons back! Get mad! I don't want your damn lemons, what am I supposed to do with these? Demand to see life's manager. Make life rue the day it thought it could give Cave Johnson lemons. Do you know who I am? I'm the man who's gonna burn your house down! With the lemons. I'm going to get my engineers to invent a combustible lemon that burns your house down!"
WITHTEETH Posted December 23, 2007 They meant megabytes, not bits. And those numbers are strange; only the GTS versions of the 8800 series have them. I don't remember why they chose to go out of uniform, but apparently it worked just fine. Always outnumbered, never out gunned! Unreal Tournament 2004 Handle:Enlight_2.0 Myspace Website! My rig
samm Posted December 23, 2007 384 bit is the width of the memory interface. 768 megabyte is the amount of memory. Hope this clarifies that a bit Citizen of a country with a racist, hypocritical majority
Dark_Raven Posted December 23, 2007 Yes, memory. 768MB.
Atreides Posted December 23, 2007 Heya all, I'm planning a semi-budget but decent rig. My target's low for now, with the option of upgrading later (I'll be getting a good mobo?). I don't play the latest graphics-intensive games like Bioshock etc, mostly RPGs like NWN2, maybe Aliens if it's on PC. What kind of juice should I be aiming at? Spreading beauty with my katana.
WITHTEETH Posted December 23, 2007 The 8800GT is about $280. It's the best value around. The 2nd best value is the ATI 3870, it runs about $240. And the 3850 is about $200, I believe. If you want, you can always wait and buy a better card. The toughest question is: what mobo are you going to invest in?
WITHTEETH Posted December 23, 2007 Expreview: 9600GT to be the first product of Geforce 9 series Not sure how dependable this site is though.
Bokishi Posted December 23, 2007 That's hard to believe, I'm expecting a full power flagship 9800 GTX as their first 9 series launch card. That's how it's always been in the past
WITHTEETH Posted December 23, 2007 That's hard to believe, I'm expecting a full power flagship 9800 GTX as their first 9 series launch card. That's how it's always been in the past I doubt it will be D9 as their first "full power flagship", because I've been reading from multiple sources that it will still be D8 architecture. Why they want to do that I'm unsure. IMO it's because they have time to slow down due to ATI. So we the consumers lose out somewhat.
Atreides Posted December 24, 2007 Thanks Withteeth! I'll check them out and decide which mobo I'm going with.
WITHTEETH Posted December 24, 2007 GeForce 9800 GX2 coming on Feb 19th This contradicts what I said earlier about the D9 high end not coming out soon. I hope this article is true, but once again I'm not sure how dependable this site is.
WITHTEETH Posted December 24, 2007 The 3870 X2 Notice anything different?
Bokishi Posted December 24, 2007 How can the first 9 series card be a GX2? Is the 9-series GPU really that weak that they need two of them?
WITHTEETH Posted December 24, 2007 How can the first 9 series card be a GX2? Is the 9-series GPU really that weak that they need two of them? I doubt that website is credible. Although the picture of the ATI X2, with the 2 processors on the same PCB, is real I think.
Magister Lajciak Posted December 25, 2007 The 3870 X2 Notice anything different? Actually, there are some visible differences. For example, the green panel is longer on the top card than on the bottom one.
jaguars4ever Posted December 27, 2007 The 3870 X2 Notice anything different? You really shouldn't have posted that. Bokishi is trying to cut down on his pr0n.
mkreku Posted December 30, 2007 I don't think Bok gets turned on by ATI at the moment.. or ever. Swedes, go to: Spel2, for the latest game reviews in swedish!
WITHTEETH Posted December 31, 2007 The graphics cards above have dual processors on one PCB. Notice the 2 tension clips on the PCB, one for each processor. I would rather they follow AMD's or Intel's lead by just making multi-core processors, instead of plopping 2 down on one PCB, but it's a step above slapping 2 PCBs together and shoving it in 1 slot. The 7900GTX2 had mega heat and horrible drivers, from what I've read.
angshuman Posted January 3, 2008 The graphics cards above have dual processors on one PCB. Notice the 2 tension clips on the PCB, one for each processor. I would rather they follow AMD's or Intel's lead by just making multi-core processors, instead of plopping 2 down on one PCB, but it's a step above slapping 2 PCBs together and shoving it in 1 slot. The 7900GTX2 had mega heat and horrible drivers, from what I've read. It all has to do with manufacturing and off-chip bandwidth. One ugly, non-scalable physical constraint is the number of pins on a chip, which is largely a function of chip surface area. A single multi-core chip with as much pin bandwidth as two smaller chips will end up being as expensive in silicon real estate as the dual-chip solution, but will have much lower manufacturing yields. Similar story with PCBs: you can only route so many wires from the GPU to the memory on a single PCB.
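The yield argument above can be sketched with a simple Poisson defect model (a common textbook approximation; the defect density and die areas here are made-up numbers for illustration, not real 2007 process data):

```python
import math

def die_yield(area_cm2, defect_density=0.5):
    """Poisson yield model: fraction of dice with zero defects,
    exp(-area * defects_per_cm2). Bigger dice fail more often."""
    return math.exp(-area_cm2 * defect_density)

# One big 4 cm^2 die vs two independently tested 2 cm^2 dice.
big_yield = die_yield(4.0)      # ~0.14
small_yield = die_yield(2.0)    # ~0.37

# Silicon cost per good card is roughly area spent per good part:
cost_one_big = 4.0 / big_yield            # one big chip per card
cost_two_small = 2 * (2.0 / small_yield)  # two small chips per card

print(cost_two_small / cost_one_big)  # the dual-chip card wastes far less silicon
```

Since the two small dice are tested and binned individually, a defect only kills 2 cm² instead of 4 cm², which is the yield advantage the post is describing.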