
Best Graphics Cards at Present and in the Future


8800 GTX is 384-bit, i think it's good

No. 768MB. :huh:

 

if ur trying to impress me with your grafix knowledge then u better try a little harder :)

A nerd like you should be going for the 768MB GeForce 8800 Ultra. Thats the uber graphics card.

 

At least you know what teh uber card is. And no, you can't call me a nerd without my permission

If you really don't even want to get into fillrates, but want a single number (based on specs, not benchmark runs) to compare and get an instant idea, then based on the way GPUs and games have evolved in the past few years, I'd say you want to look at one metric: memory bandwidth. Simply multiply the bit-width of the memory bus (e.g., 384 bits for the 8800GTX) by the effective memory clock (e.g., 1800MHz for the GTX) and you'll have a pretty good idea of the market bracket the card falls into.

The disappointment that was the ATI Radeon 2900XT has a 512-bit interface and memory clocked at 1.65GHz. Its bandwidth beats that of the 8800GTX. On paper that GPU should have been a beast. I still don't understand what holds it back.
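That comparison is easy to sanity-check. A minimal sketch of the bandwidth arithmetic, using the bus widths and effective memory clocks quoted above (the helper name is just for illustration):

```python
def mem_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

print(mem_bandwidth_gb_s(384, 1800))  # GeForce 8800 GTX  -> 86.4
print(mem_bandwidth_gb_s(512, 1650))  # Radeon HD 2900 XT -> 105.6
```

So on paper the 2900XT really does come out ahead (105.6 GB/s vs 86.4 GB/s), which is why its real-world results were so puzzling at the time.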

Optimized TWIMTBP code.

if (gpu.vendorID == "ATi")
    sleep(1000000);

:huh:

8800 GTX is 384-bit, i think it's good

No. 768MB. :thumbsup:

 

if ur trying to impress me with your grafix knowledge then u better try a little harder :*

A nerd like you should be going for the 768MB GeForce 8800 Ultra. Thats the uber graphics card.

 

At least you know what teh uber card is. And no, you can't call me a nerd without my permission

Maybe I will get that bad boy when the prices drop. That's the problem with technology. Rich nerds go out and spend an ungodly sum of money on something that loses its value within 6-12 months because something awesome and better came out to replace it.

2010spaceships.jpg

Hades was the life of the party. RIP You'll be missed.

But you get the greatest "now", which is what you're paying money for.

8800 GTX is 384-bit, i think it's good

No. 768MB. :)

 

Wha... Are you joking or did you just do the noob mistake?

This post is not to be enjoyed, discussed, or referenced on company time.

8800 GTX is 384-bit, i think it's good

No. 768MB. :)

 

Wha... Are you joking or did you just do the noob mistake?

graphics n00b :*

384-bit... isn't that technically impossible?

 

I'm a total newb, but I thought the best was 64-bit, and anything bigger is indecipherable...

"Alright, I've been thinking. When life gives you lemons, don't make lemonade - make life take the lemons back! Get mad! I don't want your damn lemons, what am I supposed to do with these? Demand to see life's manager. Make life rue the day it thought it could give Cave Johnson lemons. Do you know who I am? I'm the man who's gonna burn your house down! With the lemons. I'm going to to get my engineers to invent a combustible lemon that burns your house down!"

They meant megabytes, not bits. And those numbers are strange; only the GTS versions of the 8800 series have them. I don't remember why they chose to go out of uniform, but apparently it worked just fine.

Always outnumbered, never out gunned!

Unreal Tournament 2004 Handle:Enlight_2.0

Myspace Website!

My rig

384 bit is the width of the memory interface. 768 megabyte is the amount of memory. Hope this clarifies that a bit :)

Citizen of a country with a racist, hypocritical majority

Yes, memory. 768MB.

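The two numbers line up because both come straight from the number of memory chips on the card. A sketch, assuming the layout the 8800 series shipped with (32-bit-wide, 64MB GDDR3 chips); the function name is made up for illustration:

```python
CHIP_BUS_BITS = 32      # interface width per GDDR3 chip
CHIP_CAPACITY_MB = 64   # 512 Mbit per chip

def memory_config(num_chips):
    """Total bus width (bits) and memory size (MB) for a given chip count."""
    return num_chips * CHIP_BUS_BITS, num_chips * CHIP_CAPACITY_MB

print(memory_config(12))  # 8800 GTX/Ultra -> (384, 768)
print(memory_config(10))  # 8800 GTS      -> (320, 640)
```

That also accounts for the "strange" non-power-of-two GTS numbers mentioned earlier: drop two chips and you get 320-bit/640MB.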

Heya all, I'm planning a semi-budget but decent rig. My target's low for now, with the option of upgrading later (I'll be getting a good mobo).

 

I don't play the latest graphics intensive games like Bioshock etc, mostly RPGs like NWN2 etc, maybe Aliens if it's on PC. What kinda juice should I be aiming at?

Spreading beauty with my katana.

The 8800GT is about $280. It's the best value around. The 2nd best value is the ATI 3870; it runs about $240. And the 3850, which is about $200 I believe. If you want, you can always wait and buy a better card.

 

The toughest question is: what mobo are you going to invest in?


That's hard to believe; I'm expecting a full-power flagship 9800 GTX as their first 9-series launch card. That's how it's always been in the past.

That's hard to believe; I'm expecting a full-power flagship 9800 GTX as their first 9-series launch card. That's how it's always been in the past.

I doubt it will be D9 as their first "full-power flagship", because I've been reading from multiple sources that it will still be the D8 architecture. Why they want to do that I'm unsure. IMO it's because they can afford to slow down due to ATI. So we the consumers lose out somewhat.


Thanks Withteeth! I'll check them out and decide which mobo I'm going with.


How can the first 9 series card be a GX2? Is the 9-series GPU really that weak that they need two of them?

How can the first 9 series card be a GX2? Is the 9-series GPU really that weak that they need two of them?

I doubt that website is credible. Although the picture of the ATI X2, with the 2 processors on the same PCB, is real, I think.


  • Author
The 3870 X2

r680-installed.jpg Notice anything different?

 

Actually, there are some visible differences. For example, the green panel is longer on the top card than on the bottom one.

The 3870 X2

r680-installed.jpg Notice anything different?

You really shouldn't have posted that.

 

Bokishi is trying to cut down on his pr0n. :)


I don't think Bok gets turned on by ATI at the moment.. or ever.

Swedes, go to: Spel2, for the latest game reviews in swedish!

The graphics cards above have dual processors on one PCB. Notice the 2 tension clips on the PCB, 1 for each processor. I would rather they follow AMD's or Intel's lead by just making multi-core processors instead of plopping 2 down on one PCB, but it's a step above slapping 2 PCBs together and shoving it in 1 slot. The 7900GTX2 had mega heat and horrible drivers, from what I've read.


The graphics cards above have dual processors on one PCB. Notice the 2 tension clips on the PCB, 1 for each processor. I would rather they follow AMD's or Intel's lead by just making multi-core processors instead of plopping 2 down on one PCB, but it's a step above slapping 2 PCBs together and shoving it in 1 slot. The 7900GTX2 had mega heat and horrible drivers, from what I've read.

It all has to do with manufacturing and off-chip bandwidth.

 

One ugly non-scalable physical constraint is the number of pins on a chip, which is largely a function of chip surface area. A single multi-core chip with as much pin-bandwidth as two smaller chips will end up being as expensive in silicon real-estate as the dual-chip solution, but will have much lower manufacturing yields. Similar story with PCBs: you can only route so many wires from the GPU to the memory on a single PCB.
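The yield half of that argument can be made concrete with a toy Poisson defect model (the defect density here is an assumed figure for illustration, not tied to any real process):

```python
import math

def die_yield(area_cm2, defects_per_cm2):
    """Poisson model: probability that a die has zero fatal defects."""
    return math.exp(-defects_per_cm2 * area_cm2)

D = 0.5  # assumed defects per cm^2

one_big = die_yield(4.0, D)    # a single 4 cm^2 multi-core die
one_small = die_yield(2.0, D)  # each of two 2 cm^2 dies

print(f"4 cm^2 die yield: {one_big:.1%}")    # ~13.5%
print(f"2 cm^2 die yield: {one_small:.1%}")  # ~36.8%

# Silicon area spent per *good* die: the one big die costs
# ~e times more than the pair of small dies under this model.
print(f"cost ratio: {(4.0 / one_big) / (2 * 2.0 / one_small):.2f}")
```

Two half-size chips waste far less silicon on defects; the pin-count and PCB-routing constraints described above are why you can't simply shrink the single chip instead.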
