
DX 10 for your XP


Hell Kitty


Excellent rebuttal. :lol:

http://www.theinquirer.net/default.aspx?article=35110

 

As long as I can play the sweet, sweet candy that is Alan Wake on my XP I'll be a happy little hell kitty.

It is possible (and likely) that DirectX 9.0L will be inferior to DirectX 10; after all, I was under the (mis-?)apprehension that DirectX 10 included Shader Model 4. Should DirectX 9.0L emulate DirectX 10, it will have to (at best) make a "best guess" at performing the additional calculations ... and there isn't much processing power left on the current x9xx ATi and nVidia generations of cards.

OBSCVRVM PER OBSCVRIVS ET IGNOTVM PER IGNOTIVS


OPVS ARTIFICEM PROBAT


Even if it's an "inferior" version or emulation, perhaps it'll at least enable XP users to play those games, albeit with lower performance, so those unwilling or unable to switch to Vista still have an option.

“Things are as they are. Looking out into the universe at night, we make no comparisons between right and wrong stars, nor between well and badly arranged constellations.” – Alan Watts

Maybe it's built for the newer cards as well and contains code that would allow them to utilize Shader Model 4?

Sure, the newer (not yet released!) cards should have no backward-compatibility problems ... I am just a little underwhelmed with the current top-of-the-range cards if one has a large widescreen panel.

 

Cards without a Shader Model just don't draw the effect (like ATi cards without Shader Model 3, or nVidia cards without AA support for SM3).
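For anyone wondering how an engine actually decides that, this is roughly what the check looks like in plain Direct3D 9: ask the driver for its caps bits and compare the reported pixel shader version before enabling an effect. A minimal sketch of my own (not from the article or this thread); link against d3d9.lib:

#include <d3d9.h>
#include <cstdio>

int main()
{
    // Create the Direct3D 9 interface and query the default adapter's capabilities.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // PixelShaderVersion encodes the highest profile the driver exposes.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        printf("SM3 effects enabled\n");
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        printf("SM2 path only; SM3-only effects are simply not drawn\n");
    else
        printf("No usable pixel shader model; skip shader effects entirely\n");

    d3d->Release();
    return 0;
}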

OBSCVRVM PER OBSCVRIVS ET IGNOTVM PER IGNOTIVS


OPVS ARTIFICEM PROBAT


Cards without a Shader Model just don't draw the effect (like ATi cards without Shader Model 3, or nVidia cards without AA support for SM3).

Developers need to rewrite all the shaders in the engine for the multiple code paths (SM2, SM3, SM4, etc.). Most code paths are backwards-compatible (SM2 code will run on an SM3 card), but you can't run SM3 code on an X800 ATi, so you'll have to supply SM2 shaders with the engine as well.
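To put the "multiple code paths" point in concrete terms, here is a hedged sketch of shipping both an SM3 and an SM2 build of the same shader using the D3DX helpers from the DirectX 9 SDK. The file name lighting.hlsl, the entry point PixelMain and the function LoadBestPixelShader are placeholders of mine, not anything from this thread, and real engines often keep genuinely separate SM2 source rather than recompiling one file:

#include <d3d9.h>
#include <d3dx9.h>   // D3DX utility library; link d3dx9.lib

// Compile the same HLSL entry point for the best profile the card reports,
// falling back from ps_3_0 to ps_2_0 on older hardware such as the X800.
IDirect3DPixelShader9* LoadBestPixelShader(IDirect3DDevice9* device, const D3DCAPS9& caps)
{
    const char* profile =
        (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0)) ? "ps_3_0" : "ps_2_0";

    ID3DXBuffer* bytecode = NULL;
    ID3DXBuffer* errors = NULL;
    HRESULT hr = D3DXCompileShaderFromFile("lighting.hlsl", NULL, NULL,
                                           "PixelMain", profile, 0,
                                           &bytecode, &errors, NULL);
    if (FAILED(hr))
    {
        // ps_2_0 has a much tighter instruction limit and no dynamic branching,
        // which is why the SM2 variant usually has to be written separately.
        if (errors) errors->Release();
        return NULL;
    }

    IDirect3DPixelShader9* ps = NULL;
    device->CreatePixelShader((const DWORD*)bytecode->GetBufferPointer(), &ps);
    bytecode->Release();
    return ps;
}

On an X800 the ps_3_0 compile is never attempted, so the card only ever receives SM2 bytecode.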


Cards without a Shader Model just don't draw the effect (like ATi cards without Shader Model 3, or nVidia cards without AA support for SM3).

Developers need to rewrite all the shaders in the engine for the multiple code paths (SM2, SM3, SM4, etc.). Most code paths are backwards-compatible (SM2 code will run on an SM3 card), but you can't run SM3 code on an X800 ATi, so you'll have to supply SM2 shaders with the engine as well.

 

I figured this was commonplace?


 

"I'm a programmer at a games company... REET GOOD!" - Me


Cards without a Shader Model just don't draw the effect (like ATi cards without Shader Model 3, or nVidia cards without AA support for SM3).

Developers need to rewrite all the shaders in the engine for the multiple code paths (SM2, SM3, SM4, etc.). Most code paths are backwards-compatible (SM2 code will run on an SM3 card), but you can't run SM3 code on an X800 ATi, so you'll have to supply SM2 shaders with the engine as well.

 

I figured this was commonplace?

AFAIK very few games of the GeForce6/X800 era actually used SM3.0 shaders. Most devs didn't bother with multiple code paths and stuck to the safe SM2.0. In fact, I think this is true of most games even today, even though both ATi and NV now have SM3.0 products.

