POST YOUR SPECS



#41
mr insomniac

    (7) Enchanter

  • Members
  • 982 posts
  • Backer
Celeron D CPU 2.8 GHz
Asus P4S8X-MX motherboard
1 GB RAM
Nvidia GeForce 6200 256 MB
Integrated sound
40 GB IDE HDD
2 x 120 GB IDE HDDs

#42
alanschu

    Arch-Mage

  • Members
  • 15734 posts
  • Location:Edmonton, Alberta, Canada
  • Backer
  • Kickstarter Backer

Good news for AMD lovers... in 2007 AMD will release the AMD 64 FX-66 CPU, a dual core 3.2 GHz processor!
Can't wait for it!

Ok... don't they already have that? Because I have a dual core 3.2 GHz processor from Intel, and it wasn't even the best one. And I heard the AMD dual core ones were better.

Raw MHz doesn't mean very much. AMD's processors could, for the most part, perform as well as Intel's despite running at lower clock speeds.

It's more a difference in design philosophy, though: Intel made a chip whose clock rates it could keep ramping up.
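
To put rough numbers on the "raw MHz" point: per-core throughput is roughly instructions-per-clock (IPC) times clock speed, so a chip with a lower clock but higher IPC can come out ahead. A minimal sketch in Python; the IPC figures are made up purely for illustration and don't describe any specific chip:

    # Illustrative only: why clock speed alone doesn't decide performance.
    def throughput(ipc, clock_ghz):
        # Approximate billions of instructions per second.
        return ipc * clock_ghz

    deep_pipeline_high_clock = throughput(ipc=1.0, clock_ghz=3.2)   # 3.2
    short_pipeline_low_clock = throughput(ipc=1.5, clock_ghz=2.4)   # 3.6 -> faster despite the lower clock
    print(deep_pipeline_high_clock, short_pipeline_low_clock)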

#43
Kaftan Barlast

    Obsidian VIP

  • Members
  • 4737 posts
  • Location:In the middle of the night when everything was still, she heard clawing and gnawing, nibbling and squabbling. She could hear the wolves in the walls, plotting their wolfish plots, hatching their wolfish schemes

1.
AMD 64 "Venice" 3200+ overclocked to 3500+
BFG Gf7900GTX 512 MB overclocked
(some known brand) 2 x 512 MB 400 MHz overclocked to 440 MHz in dual-channel
Asus NForce4 A8N-E mobo
160 GB SATA
250 GB SATA
100 GB IDE

2. Acer AMD64 3000+ laptop

3. AMD XP 2600+ running Kubuntu Linux v.6

Or you can buy a Gf7300 for like 50€.

#44
jaguars4ever

    Arch-Mage

  • Members
  • 7093 posts
  • Location:Salarian Special Tasks Group
Poor Kaft shells out full price for a 7900GTX, and then the next card Nvidia releases is literally double the power. :(

#45
Kaftan Barlast

    Obsidian VIP

  • Members
  • 4737 posts
  • Location:In the middle of the night when everything was still, she heard clawing and gnawing, nibbling and squabbling. She could hear the wolves in the walls, plotting their wolfish plots, hatching their wolfish schemes
The 7950GX2 doesn't double the power. It's only faster than the 7900GTX at resolutions above 1280x1024, and my monitor can't do higher, so it was pointless to get one.

#46
jaguars4ever

    Arch-Mage

  • Members
  • 7093 posts
  • Location:Salarian Special Tasks Group

The 7950GX2 doesn't double the power. It's only faster than the 7900GTX at resolutions above 1280x1024, and my monitor can't do higher, so it was pointless to get one.

Fine, fine...more like one and a half times the power!

76.8 GB/s memory bandwidth vs 51.2 GB/s, 24000 MTexels/s fill rate vs 15600, 8 x 2 vertex pipelines vs 8, 24 x 2 pixel pipelines vs 24, 48 x 2 pixel shaders vs 48...

Seems worth it to me, Kaft. :cool:
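
Those bandwidth figures fall straight out of memory clock and bus width. A quick back-of-the-envelope check in Python, assuming 256-bit memory buses on both cards (my assumption, not stated in the thread) and the effective memory clocks mentioned later in the thread (1600 MHz for the GTX, 1200 MHz per GPU for the GX2):

    # Rough check of the quoted memory bandwidth numbers.
    # Assumes 256-bit buses; the GX2 counts twice because it carries two GPUs,
    # each with its own memory bus.
    def bandwidth_gb_s(effective_clock_mhz, bus_width_bits, gpus=1):
        # MHz * bytes per transfer * number of GPUs, converted from MB/s to GB/s
        return effective_clock_mhz * (bus_width_bits / 8) * gpus / 1000

    print(bandwidth_gb_s(1600, 256))          # 51.2 -> 7900 GTX
    print(bandwidth_gb_s(1200, 256, gpus=2))  # 76.8 -> 7950 GX2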

#47
Kaftan Barlast

    Obsidian VIP

  • Members
  • 4737 posts
  • Location:In the middle of the night when everything was still, she heard clawing and gnawing, nibbling and squabbling. She could hear the wolves in the walls, plotting their wolfish plots, hatching their wolfish schemes
It only makes sense if you play at resolutions of 1600x1200 and above with AF + AA. At 1280x1024, which is the native resolution of most monitors today, the two perform exactly the same. It would just mean a useless excess of horsepower at the cost of increased noise and having to buy one hell of a PSU.

#48
alanschu

    Arch-Mage

  • Members
  • 15734 posts
  • Location:Edmonton, Alberta, Canada
  • Backer
  • Kickstarter Backer
Sounds like a CPU issue.

#49
Alec

    (8) Warlock

  • Members
  • 1107 posts
Just get a 4.1 GHz overclocked CPU and you'll be fine. HA HA HA

#50
Kaftan Barlast

    Obsidian VIP

  • Members
  • 4737 posts
  • Location:In the middle of the night when everything was still, she heard clawing and gnawing, nibbling and squabbling. She could hear the wolves in the walls, plotting their wolfish plots, hatching their wolfish schemes

Sounds like a CPU issue.

No, it's the same result across the board in all the benchmarks I've seen. Tom's Hardware tested it with an FX-67 or whatever they're called, so there's no bottleneck there.


Anyway, I'll be selling my 7900GTX once there's a decent Gf8000-series card out and get that instead. It wasn't really meant as anything but a temporary solution when my old card broke, until I could get a GeForce 8, but I didn't want to be stuck with a budget card for NWN2, so I got a real one.

Edited by Kaftan Barlast, 31 July 2006 - 01:39 PM.


#51
alanschu

    Arch-Mage

  • Members
  • 15734 posts
  • Location:Edmonton, Alberta, Canada
  • Backer
  • Kickstarter Backer
The reason it outperforms at higher resolutions is that it has enough fill rate and power to make graphics less of a "bottleneck." If it's not winning at lower resolutions, it's because something else is the bottleneck.

Unless the game has capped the framerate.
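
A toy model of that bottleneck argument, with invented numbers purely for illustration: frame time is set by whichever of the CPU or GPU takes longer per frame, so a stronger GPU only shows up once the GPU cost (which grows with resolution) overtakes the fixed CPU cost.

    # Toy bottleneck model -- all numbers are invented for illustration.
    def fps(cpu_ms, gpu_ms_per_mpixel, width, height):
        gpu_ms = gpu_ms_per_mpixel * (width * height / 1e6)
        return 1000 / max(cpu_ms, gpu_ms)  # the slower side sets the frame rate

    for w, h in [(1024, 768), (1280, 1024), (1600, 1200)]:
        slower_gpu = fps(cpu_ms=12, gpu_ms_per_mpixel=8, width=w, height=h)
        faster_gpu = fps(cpu_ms=12, gpu_ms_per_mpixel=5, width=w, height=h)
        print((w, h), round(slower_gpu, 1), round(faster_gpu, 1))
    # Both "cards" tie at the two lower resolutions (CPU-bound at ~83 fps);
    # the faster one only pulls ahead at 1600x1200.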

#52
Kaftan Barlast

    Obsidian VIP

  • Members
  • 4737 posts
  • Location:In the middle of the night when everything was still, she heard clawing and gnawing, nibbling and squabbling. She could hear the wolves in the walls, plotting their wolfish plots, hatching their wolfish schemes
I just don't think that two slower GPUs can outperform a single, faster one at low resolutions. The 7900GTX is actually faster than the GX2 at 1024x768 and lower.

#53
alanschu

    Arch-Mage

  • Members
  • 15734 posts
  • Location:Edmonton, Alberta, Canada
  • Backer
  • Kickstarter Backer
What exactly does the nVidia "SLI" do?

Back in the old days, the Voodoo 2 SLI kicked the crap out of any single Voodoo2 at any resolution.

I imagine nVidia's isn't the old-style Scan Line Interleave, as video cards have so much more to do than just render textures nowadays. It makes no sense for two cards to perform slower than a single one, unless their implementation philosophy is different or the product just isn't running optimally.

Bandwidth limitations? Though it's not like the old PCI bus that the Voodoo2s ran on was super fast either...

#54
angshuman

    (6) Magician

  • Members
  • 679 posts
Dunno... dual cards have got to have *some* overheads somewhere that show up when both GPUs are not being fully utilized... that's probably when the sheer speed of one GTX trumps the massive width of the two cards. One scenario I can imagine involves memory latency: the GTX's memory is much higher clocked than the GX2's. So, if mem bandwidth is not getting hammered, then perhaps the GTX's lower latency yields benefits somewhere.

#55
jaguars4ever

    Arch-Mage

  • Members
  • 7093 posts
  • Location:Salarian Special Tasks Group

Dunno... dual cards have got to have *some* overheads somewhere that show up when both GPUs are not being fully utilized... that's probably when the sheer speed of one GTX trumps the massive width of the two cards. One scenario I can imagine involves memory latency: the GTX's memory is much higher clocked than the GX2's. So, if mem bandwidth is not getting hammered, then perhaps the GTX's lower latency yields benefits somewhere.

Fortunately memory can be overclocked. I have my 6800XT OC'd to 800 MHz, so I could hazard a guess as to what the GX2's can be clocked to. I'd be curious to know what results that would yield.

#56
angshuman

    (6) Magician

  • Members
  • 679 posts

Fortunately memory can be overclocked. I have my 6800XT OC'd to 800 MHz, so I could hazard a guess as to what the GX2's can be clocked to. I'd be curious to know what results that would yield.

Well, that's just a 100 MHz overclock. A GTX's memory runs at 1600 MHz while a GX2's runs at 1200 MHz. The cooling on a GX2 is a joke. I doubt either its core or its memory would survive any amount of overclocking. The card was designed from the start to be a slightly slow but gargantuan monster, while the GTX is more of a lean and mean machine. :)

But then, as benchmarks show, the GX2 trounces the GTX in most real-world situations. What would be interesting is a comparison between quad-SLI GX2s (once the drivers are mature enough) and a pair of SLI'd GTXs. I have a strong suspicion the GTXs will win.

#57
jaguars4ever

    Arch-Mage

  • Members
  • 7093 posts
  • Location:Salarian Special Tasks Group
How much more do you think I can OC my card before it gets "too hot to handle," Angs?

#58
alanschu

    Arch-Mage

  • Members
  • 15734 posts
  • Location:Edmonton, Alberta, Canada
  • Backer
  • Kickstarter Backer

Dunno... dual cards have got to have *some* overheads somewhere that show up when both GPUs are not being fully utilized... that's probably when the sheer speed of one GTX trumps the massive width of the two cards. One scenario I can imagine involves memory latency: the GTX's memory is much higher clocked than the GX2's. So, if mem bandwidth is not getting hammered, then perhaps the GTX's lower latency yields benefits somewhere.

Well, this would make sense, as you've started to change other variables.

#59
angshuman

    (6) Magician

  • Members
  • 679 posts

How much more do you think I can OC my card before it gets "too hot to handle," Angs?

If you have a decent heatsink on your memory modules, you could probably try and take it up to 900, but I don't think I'm the right person to answer that question... I have little to no experience overclocking video RAM. What I meant by the "just 100 MHz" in my previous post was that it was a small overclock compared to what you would have to achieve to get the GX2 up to the GTX's clock speed (+400 MHz). In fact, 100 MHz actually sounds like a nice overclock for 700 MHz modules.
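
Restated as percentages (just the arithmetic from the post above): 100 MHz on 700 MHz modules is roughly a 14% bump, while closing the GX2-to-GTX memory-clock gap would take 400 MHz on 1200 MHz, about 33%.

    # Relative size of the two overclocks discussed above.
    print(round(100 / 700 * 100, 1))   # ~14.3% -- the 6800XT's memory bump
    print(round(400 / 1200 * 100, 1))  # ~33.3% -- what the GX2 would need to match the GTX's memory clock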

#60
Tuerkek

    (1) Prestidigitator

  • Members
  • 11 posts
Current specs:

AMD Sempron 3000+ 2.0 GHz
512 MB DDR333
ASUS A7V8X-LA, socket A, 2 GB max (333), AGP 8x/4x, generic cooling fan + HS, 250 W PSU

256 MB ATI X700 Pro AGP 8X
120 GB Samsung + 40 GB Maxtor
DVD-RW 16x, DL

Pending specs (hardware is en route):

AMD Athlon 64 3400+ 2.4 GHz (estimated OCable @ 2880 MHz)
1 GB DDR 400 Corsair ValueSelect
Gigabyte GA-K8VM800M, Socket 754, 2 GB max (400), AGP 8x/4x, Arctic Cooling Freezer fan + HS 40 CFM, 300 W PSU w/ +3.3V@18A, +5V@18A, +12V1@8A, +12V2@14.5A, -12V@0.5A, +5VSB@2.0A

256 MB ATI X700 Pro AGP 8X
120 GB Samsung + 40 GB Maxtor
DVD-RW 16x, DL

Upgrade price $297.90.


Laptop (rofl, 6 fps on KotOR I/II):
--Microsoft® Windows® XP Home Edition with SP2
--AMD Athlon 64 3200+ (2.0GHz/512KB L2 Cache)
--15.4" WXGA BrightView Widescreen (1280x800)
--128MB ATI RADEON XPRESS 200M w/Hypermemory
--1.0GB DDR333
--80 GB 4200 RPM Hard Drive
--DVD+/-RW/R & CD-RW Combo w/Double Layer Support
--54g™ 802.11b/g WLAN w/ 125HSM/SpeedBooster™
--8 Cell Lithium Ion Battery

Some stats on this (for the laptop). At minimum settings: 40 fps max in Far Cry, 20-80 fps in JKA, 22.7 fps in the CS:S stress test, a max of 20-25 fps in SWBFII, and 6 fps in KotOR, 1 or 2, about the same.

Edited by Tuerkek, 01 August 2006 - 09:10 PM.




