

Posted

Apparently modern GPUs aren't able to keep up with all the new objects the PhysX card processes and throws out on screen. Either that, or the PhysX chip itself isn't able to cope with the physics computations devs are currently throwing at it. Bottom line: plug a PhysX card into your machine, and your framerate goes down. Deceleration FTL.
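A rough way to picture that complaint (purely hypothetical numbers, not measurements from GRAW or any real PhysX setup): treat the frame time as whichever pipeline stage is slowest. Offloading physics to a PPU frees up the CPU, but if the game responds by spawning far more debris, the GPU (or the PPU itself) becomes the new bottleneck and the framerate still drops. A minimal Python sketch:

# Back-of-envelope frame-time model; every number below is made up
# purely for illustration.
def frame_ms(cpu_ms, gpu_ms, ppu_ms=0.0):
    # The frame is gated by the slowest stage in the pipeline.
    return max(cpu_ms, gpu_ms, ppu_ms)

# Without a PPU: physics runs on the CPU, modest debris count.
no_ppu = frame_ms(cpu_ms=14.0, gpu_ms=12.0)

# With a PPU: the CPU is relieved, but the game spawns far more debris,
# so the GPU has to draw (and the PPU has to simulate) all of it.
with_ppu = frame_ms(cpu_ms=6.0, gpu_ms=22.0, ppu_ms=18.0)

print(f"no PPU:   {no_ppu:.1f} ms/frame (~{1000 / no_ppu:.0f} fps)")
print(f"with PPU: {with_ppu:.1f} ms/frame (~{1000 / with_ppu:.0f} fps)")

Swap in different numbers and the max() immediately shows which stage is gating the frame; that is all the sketch is meant to convey about why extra physics objects can slow things down even with dedicated hardware.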

Posted

Reminds me of when the GeForce supported Hardware T&L. It was slower with it on.

 

It's still a step in the right direction.

Posted

I didn't get one, because I'm waiting and seeing.

 

Hopefully my new computer is in by the end of the week though.

Posted

This might go into my next system, the one I'm currently planning. As for my wife's system, I'm not opening the damned thing unless something goes wrong. When I do, it won't be to put in a physics accelerator.

 

By the time I build the upcoming machine, there will be more information on whether this is a good buy or not.

Fionavar's Holiday Wishes to all members of our online community: Happy Holidays

 

Join the revelry at the Obsidian Plays channel:
Obsidian Plays


 
Remembering tarna, Phosphor, Metadigital, and Visceris.  Drink mead heartily in the halls of Valhalla, my friends!

Posted
I didn't get one, because I'm waiting and seeing.

 

Hopefully my new computer is in by the end of the week though.

 

I figured that Unreal Engine 3 is going to support it, and so many games are going to be using that engine, like UT2K7, Huxley, Gears of War, etc.

Posted

I wouldn't call the PhysX PPU a good buy just yet. Not only is it extremely expensive, but they also haven't worked out the bugs and quirks in it yet. So far only ONE released game supports it, although that is bound to change.

 

I will take a wait-and-see approach to this one. Rumour has it that Gothic 3 might take advantage of it though, and that's when I would consider buying one.

Swedes, go to: Spel2, for the latest game reviews in Swedish!

Posted

 

 

I wish BioShock were on that list. It seems to me like it's the type of game that could really benefit from enhanced physics.

"My hovercraft is full of eels!" - Hungarian tourist
I am Dan Quayle of the Romans.
I want to tattoo a map of the Netherlands on my nether lands.
Heja Sverige!!
Everyone should cuffawkle more.
The wrench is your friend. :bat:

Posted

When all the games using UT3 are released, that will be when the PhysX card will become useful.

OBSCVRVM PER OBSCVRIVS ET IGNOTVM PER IGNOTIVS

OPVS ARTIFICEM PROBAT

Posted

First see whether UE3 (yeah, UE, as in Unreal Engine :rolleyes:) won't be released around the same time Nvidia and ATi have made their cards capable enough that a PhysX card isn't even needed anymore... (for a cheaper price... well, actually a bit more expensive, but seeing as you also get a GPU improvement...)

^

 

 

I agree that that is the most stupid, idiotic, pathetic, garbage, hateful, retarded, scumbag, evil, satanic, nazi-like term ever created. At least top 5.

 

TSLRCM Official Forum || TSLRCM Moddb || My other KOTOR2 mods || TSLRCM (English version) on Steam || M4-78EP on Steam

Formerly known as BattleWookiee/BattleCookiee

Posted

True. It's not so clear to me that this option will gain mainstream support, as opposed to nVidia's solution, something else, or nothing at all. The Voodoo was preceded by the Rendition Verite... and that didn't last. I would suggest the Rendition was more compelling (for its handful of titles) than a card that offers prettier explosions at the potential cost of poorer performance.

 

We'll see.

Posted

When frame rates are already really high, people will be more willing to add the pretty explosions.

 

3dfx gambled that no one would be interested in a GeForce card that ran games slower with hardware T&L turned on, but people were.

Posted

I don't think 3dfx ever released a quad-SLI card though. I have the 5500 version with two GPUs on it (HUUUUGE card), but I think the next model (6500?) would have used four cores. 3dfx died before they could release it though.

 

Of course, I might be remembering it wrong, but I'm at work, so someone else google the crap out of it if you're interested in real facts :))

Swedes, go to: Spel2, for the latest game reviews in Swedish!

Posted

From what I have read, it's definitely wait-and-see as far as the Ageia processor goes.

 

1. There's only one released title that supports it (GRAW), and by all accounts the PPU doesn't do a lot for it.

 

2. There are alternative technologies that may achieve market dominance (e.g. the Nvidia/Havok solution).

 

3. Early investors in new tech always pay a premium.

 

4. If it's anything like GPUs, a new generation will be out shortly.
