
Posted
Yes, there is a part that is done in hardware, but some of it is done in code (in fact, you can do occlusion queries in Direct3D 9).
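For reference, a minimal sketch of what a Direct3D 9 occlusion query looks like. This uses the stock IDirect3DQuery9 interface; the device and the DrawBoundingBox helper are assumptions for illustration, not anything PhysX-specific:

```cpp
// Minimal sketch of a Direct3D 9 occlusion query. Assumes an existing
// IDirect3DDevice9* and a DrawBoundingBox() helper of your own.
#include <d3d9.h>

void DrawBoundingBox(IDirect3DDevice9* device); // hypothetical proxy draw call

DWORD CountVisiblePixels(IDirect3DDevice9* device)
{
    IDirect3DQuery9* query = NULL;
    if (FAILED(device->CreateQuery(D3DQUERYTYPE_OCCLUSION, &query)))
        return 0; // occlusion queries not supported on this device

    query->Issue(D3DISSUE_BEGIN);
    DrawBoundingBox(device);          // render the proxy geometry to test
    query->Issue(D3DISSUE_END);

    // Spin until the GPU has the result (a real engine would poll next frame).
    DWORD visiblePixels = 0;
    while (query->GetData(&visiblePixels, sizeof(DWORD), D3DGETDATA_FLUSH) == S_FALSE)
        ;

    query->Release();
    return visiblePixels;             // 0 means the object was fully occluded
}
```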

 

Anyway, it was just a guess; I think PhysX itself is at fault here.

 

True, and I think so as well.

You guys are probably right... but I just *can't* convince myself unless I see results showing exactly where the bottleneck lies. The Ageia folks have some really clever people with them (I know of at least one). It's hard for me to imagine that they would bring out a product that isn't very good at doing exactly what it was designed to do. But then again, products like that Killer NIC come out time and again and shatter all my delusions about Mankind's competence.

Posted

Well, don't forget that the original GeForce wasn't very good at Hardware T&L either. In many cases you could still get faster performance by letting the processor deal with it.

 

I still think physics cards will eventually become popular.

 

 

I guess the big problem here is that the GeForce cards were still better at doing the rendering. I'm sure that if the Voodoo had initially been poorer than a CPU, the 3D accelerator craze would never have started off as strongly as it did. That's a definite disadvantage for the physics cards right now.

Posted

City of Heroes supports Ageia PhysX.


Posted
The problem with physics cards is that in a lot of respects they're simply toned-down GPUs working through a PCI port, so naturally, in terms of power, memory bandwidth puts them more on par with old PCI graphics cards, surely!

 

(Don't quote me on that)

 

It's more likely we'll see GPU cards with PPUs on them as well.

I've certainly read my fair share about hijacking GPUs to do collision detection; a separate PCI PPU seems fairly pointless to me.

Nope.

 

PPUs are different hardware. GPUs do a gazillion burst instructions on a texture (read-only, loads of bandwidth), whereas the PPU runs a self-updating set of nested calculations (read/write, loads of bandwidth). Even though the GPU has loads more bandwidth (possibly several orders of magnitude more: the new ATi X1950 has GDDR4 RAM, for example), it still isn't capable of completing the basic calculations properly.
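To illustrate the read/write point, here is a toy C++ gravity step (not Ageia's actual algorithm, just an assumed n-body sketch for illustration): every body both reads and writes the shared state inside a single update, which is exactly the feedback pattern a read-only texture pass doesn't express.

```cpp
// Toy illustration (not Ageia's algorithm): a gravity step where every body
// reads AND writes the shared state array within one update, so results
// "ripple" -- the kind of read/write feedback a texture-sampling GPU pass
// doesn't naturally express.
#include <cmath>
#include <cstddef>
#include <vector>

struct Body { float x, y, vx, vy, mass; };

void Step(std::vector<Body>& bodies, float dt)
{
    const float G = 6.674e-11f;
    for (std::size_t i = 0; i < bodies.size(); ++i) {
        for (std::size_t j = i + 1; j < bodies.size(); ++j) {
            float dx = bodies[j].x - bodies[i].x;
            float dy = bodies[j].y - bodies[i].y;
            float r2 = dx * dx + dy * dy + 1e-6f;   // avoid divide-by-zero
            float f  = G * bodies[i].mass * bodies[j].mass / r2;
            float r  = std::sqrt(r2);
            // Both bodies are modified: a read/write dependency on shared data.
            bodies[i].vx += f * dx / r / bodies[i].mass * dt;
            bodies[i].vy += f * dy / r / bodies[i].mass * dt;
            bodies[j].vx -= f * dx / r / bodies[j].mass * dt;
            bodies[j].vy -= f * dy / r / bodies[j].mass * dt;
        }
    }
    for (Body& b : bodies) { b.x += b.vx * dt; b.y += b.vy * dt; }
}
```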


Posted (edited)
The problem with physics cards is that in a lot of respects they're simply toned-down GPUs working through a PCI port, so naturally, in terms of power, memory bandwidth puts them more on par with old PCI graphics cards, surely!

 

(Don't quote me on that)

 

It's more likely we'll see GPU cards with PPUs on them as well.

I've certainly read my fair share about hijacking GPUs to do collision detection; a separate PCI PPU seems fairly pointless to me.

Nope.

 

PPUs are different hardware. GPUs do a gazillion burst instructions on a texture (read-only, loads of bandwidth), whereas the PPU runs a self-updating set of nested calculations (read/write, loads of bandwidth). Even though the GPU has loads more bandwidth (possibly several orders of magnitude more: the new ATi X1950 has GDDR4 RAM, for example), it still isn't capable of completing the basic calculations properly.

 

The way a GPU and a PPU operate is naturally different, but they deal with similar data, surely?

 

I will agree with you that a GPU alone cannot properly complete the required calculations (in some cases it can, fluid dynamics for example), but it can help the CPU do so quickly, which in my eyes makes a PPU pretty worthless (except for ease of use), and makes adding hardware to a GPU, a kind of GPU+PPU, much more viable than an extra physics card.

 

Particle physics and collision detection are very viable on a GPU, though perhaps not better than on a physics card. I was just indicating that I'm more inclined to think that added functionality on graphics cards is a better solution, that is all.
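For contrast with the coupled rigid-body case above, a toy sketch of the kind of particle update that does map well onto a GPU: each particle depends only on its own state, so the loop is embarrassingly parallel (names are hypothetical, purely for illustration).

```cpp
// Toy sketch: a ballistic particle update where each particle depends only
// on its own state -- the embarrassingly parallel pattern that maps well to
// a GPU, in contrast with the coupled rigid-body step sketched earlier.
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

void UpdateParticles(std::vector<Particle>& particles, float dt)
{
    const float gravity = -9.81f;
    // Every iteration is independent; on a GPU this loop becomes one
    // shader/kernel invocation per particle.
    for (Particle& p : particles) {
        p.vy += gravity * dt;
        p.x  += p.vx * dt;
        p.y  += p.vy * dt;
        p.z  += p.vz * dt;
    }
}
```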

 

As for the bottleneck, I genuinely felt, and still feel, that the data transfer rate of the PCI slot the PPU sits in is causing the hold-up, and that it is not the graphics card that is to blame. That is my first thought and carries no assumption of truth.
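A back-of-the-envelope check on that PCI idea, using nominal peak figures (roughly 133 MB/s for 32-bit/33 MHz PCI, about 4 GB/s for first-generation PCIe x16) and an assumed 64 bytes of state per object; real sustained rates are lower than either figure:

```cpp
// Back-of-the-envelope check of the PCI-bottleneck idea, using nominal peak
// rates (32-bit/33 MHz PCI ~133 MB/s, first-generation PCIe x16 ~4 GB/s);
// real sustained throughput is lower than either figure.
#include <cstdio>

int main()
{
    const double pciBytesPerSec  = 133.0e6;   // classic PCI, nominal peak
    const double pcieBytesPerSec = 4.0e9;     // PCIe x16 (gen 1), nominal peak
    const double bytesPerObject  = 64.0;      // assumed: position + orientation + velocities
    const double framesPerSecond = 60.0;

    double pciObjects  = pciBytesPerSec  / (bytesPerObject * framesPerSecond);
    double pcieObjects = pcieBytesPerSec / (bytesPerObject * framesPerSecond);

    std::printf("Objects whose state fits over the bus per 60 Hz frame:\n");
    std::printf("  PCI:      ~%.0f\n", pciObjects);   // on the order of 30-40 thousand
    std::printf("  PCIe x16: ~%.0f\n", pcieObjects);  // roughly a million
    return 0;
}
```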

Edited by @\NightandtheShape/@


Posted

Yeah, maybe the quad-core CPUs will make the PPU obsolete ... then again, the PPU has to fight for PCI-e bandwidth: it might be a better performer if it were in a PCI-e x16 slot.

 

Anyways, I think it is still too early to write the epitaph; who knows, AMD-ATi might buy Ageia, too, and make a killer daughterboard ... :ermm:


Posted
Yeah, maybe the quad-core CPUs will make the PPU obsolete ... then again, the PPU has to fight for PCI-e bandwidth: it might be a better performer if it were in a PCI-e x16 slot.

 

Anyways, I think it is still too early to write the epitaph; who knows, AMD-ATi might buy Ageia, too, and make a killer daughterboard ... :ermm:

 

It's certainly too early to tell... I will say this: I do think physics is going to become as important to the gaming industry as graphics cards currently are. I'm currently wondering how viable it would be to use the Ageia technology for simulating a space game.

 

I'm right in thinking that the SDK supports software as well as hardware simulation?
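If I remember the Ageia PhysX 2.x SDK correctly, the scene description carries a simulation type, so the same code can request a hardware scene and fall back to software. A rough sketch from memory follows; the exact names should be checked against the SDK headers.

```cpp
// Rough sketch from memory of the Ageia PhysX 2.x SDK: the same scene
// description can request hardware (PPU) or software simulation, so the
// game code stays identical whether or not a card is present. Names should
// be checked against the SDK headers.
#include <NxPhysics.h>

NxScene* CreateScene(NxPhysicsSDK* sdk, bool preferHardware)
{
    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.simType = preferHardware ? NX_SIMULATION_HW : NX_SIMULATION_SW;

    NxScene* scene = sdk->createScene(sceneDesc);
    if (!scene && preferHardware) {
        // No PPU (or HW scene creation failed): fall back to software.
        sceneDesc.simType = NX_SIMULATION_SW;
        scene = sdk->createScene(sceneDesc);
    }
    return scene;
}
```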


Posted
I have heard it referred to as UT3; not that I have played the vanilla games (not my cup of tea).

 

I am REALLY slow (I know) but I think the quarter just dropped.

 

They meant to say UE3 right? (As in Unreal Engine 3, which UT2K7 will use first)

^

 

 

I agree that that is the most stupid, idiotic, pathetic, garbage, hateful, retarded, scumbag, evil, satanic, Nazi-like term ever created. At least top 5.

 


Posted
Yeah, maybe the quad-core CPUs will make the PPU obsolete ... then again, the PPU has to fight for PCI-e bandwidth: it might be a better performer if it were in a PCI-e x16 slot.

 

Anyways, I think it is still too early to write the epitaph; who knows, AMD-ATi might buy Ageia, too, and make a killer daughterboard ... :thumbsup:

 

Although DX10 is supposed to use slack bandwidth on graphics cards to do physics calculations, I know very little about DX10 at the moment.


Posted

Unless they change the way the GPU works, adding feedback into the data so that transformations can ripple through it (i.e. the gravity of objects affecting other objects, not just a bajillion shadow calculations), GPU calculations aren't going to help.

I have heard it referred to as UT3; not that I have played the vanilla games (not my cup of tea).

 

I am REALLY slow (I know) but I think the quarter just dropped.

 

They meant to say UE3 right? (As in Unreal Engine 3, which UT2K7 will use first)

That'd be it, yep. :)


Posted
Unless they change the way the GPU works, adding feedback into the data so that transformations can ripple through it (i.e. the gravity of objects affecting other objects, not just a bajillion shadow calculations), GPU calculations aren't going to help.

 

Just saying what I have read.

 

It won't be the same as a physics card, but it's a start.


  • 2 months later...
Posted

Interesting...

 

If Dragon Age has neat physics stuff thingies it might be worth buying.


Posted
Interesting...

 

If Dragon Age has neat physics stuff thingies it might be worth buying.

Nonsense. DA doesn't use the Eclipse Engine; the EE is still in development by Bio's TAG and will be used for future games. That excludes DA (its own engine) and ME (UE3), though.

Maybe they're gonna use it in JE2, making it a more kinetic experience thanks to PhysX. And that's just one thing you could do with a PPU. Dynamics can also be calculated on a PPU: hair (no, not the ugly polygon clusters, real hair), volumetric clouds, fluids, real-time deformations and so on can all enhance the look. :devil:

 

>_< GO BIO GO! :alien:

  • 4 months later...
Posted

The Company That Just Wouldn't Quit:

 

http://uk.theinquirer.net/?article=38716

 

Die-shrink (130nm -> 80 nm), cheaper models, more different slot options (I have a few spare PCI-e..). So far I think the entire Ageia company has been living off of Bokishi's purchase as he was probably the only one who bought the thing. In the entire world. Seriously.

 

I'll probably end up buying one myself when they come down in price. Under $100 and I'll get one.


Posted
PhysX is one of those great ideas that has come and gone, and very quickly at that.

 

Hasn't gone really.

 

PhysX may catch on when more Unreal Engine 3 games get released.

