
Posted
8800 GTX is 384-bit, I think it's good

No. 768MB. :huh:

 

If you're trying to impress me with your graphics knowledge, then you'd better try a little harder :)

A nerd like you should be going for the 768MB GeForce 8800 Ultra. That's the uber graphics card.

 

At least you know what the uber card is. And no, you can't call me a nerd without my permission.

Posted
If you really don't even want to get into fillrates, but want a single number (based on specs, not benchmark runs) to compare and get an instant idea, then based on the way GPUs and games have evolved in the past few years, I'd say you want to look at one metric: memory bandwidth. Simply multiply the bit-width of the memory bus (e.g., 384 bits for the 8800 GTX) by the memory clock (e.g., 1800MHz for the GTX) and you'll have a pretty good idea of the market bracket the card falls into.
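To make that single-number comparison concrete, here's a small back-of-the-envelope sketch in C (my own illustration, not something from the original posts; the bus widths and memory clocks are just the figures quoted in this thread, with the 2900 XT numbers taken from the reply below):

#include <stdio.h>

/* the metric described above:
   bandwidth (GB/s) = bus width (bits) / 8 * effective memory clock (MT/s) / 1000 */
static double bandwidth_gb_s(int bus_bits, int mem_clock_mts)
{
    return (bus_bits / 8.0) * mem_clock_mts / 1000.0;
}

int main(void)
{
    printf("8800 GTX: %.1f GB/s\n", bandwidth_gb_s(384, 1800)); /* ~86.4  */
    printf("2900 XT : %.1f GB/s\n", bandwidth_gb_s(512, 1650)); /* ~105.6 */
    return 0;
}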

The disappointment that was the ATI Radeon 2900 XT has a 512-bit interface, and the memory is clocked at 1.65GHz. Its bandwidth beats that of the 8800 GTX. On paper that GPU should have been a beast. I still don't understand what holds it back...

Optimized TWIMTBP code.

if (gpu.vendorID == "ATi")
    sleep(1000000);

:huh:

Posted
8800 GTX is 384-bit, I think it's good

No. 768MB. :thumbsup:

 

If you're trying to impress me with your graphics knowledge, then you'd better try a little harder :*

A nerd like you should be going for the 768MB GeForce 8800 Ultra. That's the uber graphics card.

 

At least you know what the uber card is. And no, you can't call me a nerd without my permission.

Maybe I will get that bad boy when the prices drop. That's the problem with technology: rich nerds go out and waste an ungodly sum of money on something that loses its value within 6-12 months because something better comes out to replace it.


Hades was the life of the party. RIP. You'll be missed.

Posted
8800 GTX is 384-bit, I think it's good

No. 768MB. :)

 

Wha... Are you joking, or did you just make the noob mistake?

This post is not to be enjoyed, discussed, or referenced on company time.

Posted

384-bit... isn't that technically impossible?

 

I'm a total newb, but I thought the best was 64-bit, and anything bigger is indecipherable...

"Alright, I've been thinking. When life gives you lemons, don't make lemonade - make life take the lemons back! Get mad! I don't want your damn lemons, what am I supposed to do with these? Demand to see life's manager. Make life rue the day it thought it could give Cave Johnson lemons. Do you know who I am? I'm the man who's gonna burn your house down! With the lemons. I'm going to to get my engineers to invent a combustible lemon that burns your house down!"

Posted

They meant megabytes, not bits. And those numbers are strange; only the GTS versions of the 8800 series have them. I don't remember why they chose to go out of uniform, but apparently it worked just fine.

Always outnumbered, never out gunned!

Unreal Tournament 2004 Handle:Enlight_2.0

Myspace Website!

My rig

Posted

384-bit is the width of the memory interface; 768MB is the amount of memory. Hope this clarifies that a bit :)
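For what it's worth, here's the arithmetic behind those "strange" numbers (my own explanation, not from the posts above): GDDR3 chips have a 32-bit interface, so a 384-bit bus means 384 / 32 = 12 memory chips, and at 64MB per chip that gives 12 × 64MB = 768MB. The same math gives the 320-bit 8800 GTS its 640MB (10 × 64MB).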

Citizen of a country with a racist, hypocritical majority

Posted

Heya all, I'm planning a semi-budget but decent rig. My target's low for now, with the option of upgrading later (I'll be getting a good mobo?).

 

I don't play the latest graphics-intensive games like BioShock and such, mostly RPGs like NWN2, maybe Aliens if it's on PC. What kinda juice should I be aiming at?

Spreading beauty with my katana.

Posted

The 8800GT is about $280; it's the best value around. The 2nd best value is the ATI 3870, which runs about $240. And the 3850 is about $200, I believe. If you want to wait, you can always hold off and buy a better card later.

 

The toughest question is: what mobo are you going to invest in?

Always outnumbered, never out gunned!

Unreal Tournament 2004 Handle:Enlight_2.0

Myspace Website!

My rig

Posted
That's hard to believe; I'm expecting a full-power flagship 9800 GTX as their first 9-series launch card. That's how it's always been in the past.

I doubt their first "full-power flagship" will be D9, because I've been reading from multiple sources that it will still be D8 architecture. Why they want to do that, I'm unsure. IMO it's because they have time to slow down due to ATI, so we the consumers lose out somewhat.

Always outnumbered, never out gunned!

Unreal Tournament 2004 Handle:Enlight_2.0

Myspace Website!

My rig

Posted
How can the first 9-series card be a GX2? Is the 9-series GPU really that weak that they need two of them?

I doubt that website is credible, although the picture of the ATI X2 with the two processors on the same PCB is real, I think.

Always outnumbered, never out gunned!

Unreal Tournament 2004 Handle:Enlight_2.0

Myspace Website!

My rig

Posted

I don't think Bok gets turned on by ATI at the moment... or ever.

Swedes, go to Spel2 for the latest game reviews in Swedish!

Posted

The graphics cards above have dual processors on one PCB. Notice the two tension clips on the PCB, one for each processor. I would rather they follow AMD's or Intel's lead by just making multi-core processors instead of plopping two down on one PCB, but it's a step above slapping two PCBs together and shoving them into one slot. The 7900GTX2 had mega heat and horrible drivers, from what I've read.

Always outnumbered, never out gunned!

Unreal Tournament 2004 Handle:Enlight_2.0

Myspace Website!

My rig

Posted
The graphics cards above have dual processors on one PCB. Notice the two tension clips on the PCB, one for each processor. I would rather they follow AMD's or Intel's lead by just making multi-core processors instead of plopping two down on one PCB, but it's a step above slapping two PCBs together and shoving them into one slot. The 7900GTX2 had mega heat and horrible drivers, from what I've read.

It all has to do with manufacturing and off-chip bandwidth.

 

One ugly non-scalable physical constraint is the number of pins on a chip, which is largely a function of chip surface area. A single multi-core chip with as much pin-bandwidth as two smaller chips will end up being as expensive in silicon real-estate as the dual-chip solution, but will have much lower manufacturing yields. Similar story with PCBs: you can only route so many wires from the GPU to the memory on a single PCB.
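To put a toy number on the yield side of that argument, here's a tiny sketch in C (purely illustrative assumptions: a simple Poisson defect model plus made-up defect density and die area; none of these figures come from the post):

#include <math.h>
#include <stdio.h>

/* assumed Poisson defect model: yield ~= exp(-D * A).
   Compare one big die against two half-size dies doing the same job. */
int main(void)
{
    double D = 0.5;                          /* assumed defects per cm^2          */
    double A = 4.0;                          /* assumed area of the big die, cm^2 */
    double yield_big   = exp(-D * A);        /* ~0.14                             */
    double yield_small = exp(-D * A / 2.0);  /* ~0.37                             */

    /* only known-good dies get shipped, so silicon cost per good GPU
       scales roughly as area / yield */
    printf("relative cost, one big die       : %.1f\n", A / yield_big);                  /* ~29.6 */
    printf("relative cost, two half-size dies: %.1f\n", 2.0 * (A / 2.0) / yield_small);  /* ~10.9 */
    return 0;
}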
