

Posted

Seeing how PhysX did, and how this would have even less noticeable benefits, I sense failure in this product's future.

 

But... maybe if it was cheap it could do better.

Posted (edited)

Not sure why this is "LOL", because essentially the entire Games Group at my university feels that this is the only way AI will get a fair shake in commercial games.

Edited by alanschu
Posted

There will be specialist cards for everything soon. Sheesh. I only have so many slots in my motherboard and they are all used up. It just seems a bit ridiculous.

Posted

If it works as advertised, there are a number of games that could use it right now. Oblivion is one of them. RAI might look good on paper, but wow, some of the things the NPCs do make very little sense.

Posted
If it works as advertised, there are a number of games that could use it right now. Oblivion is one of them. RAI might look good on paper, but wow, some of the things the NPCs do make very little sense.

 

Accelerated AI wouldn't make the AI in Oblivion better -- it might take a bit of the stress off the CPU though.

 

A poor implementation is just that... programming an AI like RAI was hyped to be would probably be a lot harder than finding the resources to make it run.

Posted

AI programming: that's a job I wouldn't touch with a three-meter pole. Crikey.

DISCLAIMER: Do not take what I write seriously unless it is clearly, and in no uncertain terms, declared by me to be meant in a serious and non-humorous manner. If there is no clear indication, assume the post is written in jest. This notification is meant very seriously and its purpose is to avoid misunderstandings and the consequences thereof. Furthermore, I cannot be held accountable for anything I write on these forums, since the idea of taking serious responsibility for my unserious actions is an oxymoron in itself.

 

Important: as the following sentence contains many naughty words I warn you not to read it under any circumstances; botty, knickers, wee, erogenous zone, psychiatrist, clitoris, stockings, bosom, poetry reading, dentist, fellatio and the department of agriculture.

 

"I suppose outright stupidity and complete lack of taste could also be considered points of view. "

Posted
There will be specialist cards for everything soon.  Sheesh.  I only have so many slots in my motherboard and they are all used up.  It just seems a bit ridiculous.

Don't worry, Hades, within 2-3 years it's all going to be on-chip: graphics, physics, AI, sound, network, wireless, memory controllers, hard disk controllers, everything. The only thing left on your motherboard will be power supply capacitors and memory slots.

Posted
Funny you should mention that, because I read an article talking about ATi/AMD creating a high end integrated solution.

 

Such a solution might be useful in small devices (palmtops and the like), possibly even in networked workstations for uniformity, but hardly attractive for high-end gaming and graphics. One of the articles I read about the merger did acknowledge that.

 

But as far as physics, etc. cards go, these additional 'helper' cards do seem to be a candidate for all-in-one integration on the graphics card itself. Until that happens, the technology won't become mainstream, because it's not enough of an advance to justify the cost or space requirement. (I think it's Nvidia's turn to roll.)

Posted
Funny you should mention that, because I read an article talking about ATi/AMD creating a high end integrated solution.

 

Such a solution might be useful in small devices (palmtops and the like), possibly even in networked workstations for uniformity, but hardly attractive for high-end gaming and graphics. One of the articles I read about the merger did acknowledge that.

 

But as far as physics, etc. cards go, these additional 'helper' cards do seem to be a candidate for all-in-one integration on the graphics card itself. Until that happens, the technology won't become mainstream, because it's not enough of an advance to justify the cost or space requirement. (I think it's Nvidia's turn to roll.)

 

 

When I say "high end integrated solution," I'm talking about ATi putting their high end graphics accelerators right on chip. Whether or not that's good enough for us is one thing, but it is significantly better than the integrated solution that Intel is using now.

 

Furthermore, if higher quality integrated graphics chips become more popular, the "lowest common denominator" in games gets raised.

Posted

The problem with integrated solutions when it comes to gaming is that gamers usually want to upgrade a few components well before the motherboard needs to be replaced. So I doubt hardcore gamers will see it as viable.

 

That is unless the rate at which new hardware gets released is significantly lowered, although I don't see that happening anytime soon. Mostly because that would cut into the profit margins of Nvidia and ATI.

 

At least that's how I see things. Personally, I prefer the modular approach: if something breaks down, I'd rather not have to replace everything. When I recently helped upgrade my mother's computer I did go with an integrated solution, but she has no performance demands whatsoever, so it hardly relates to gamers.

:huh:

Posted

It would also mean that if I wanted to upgrade my GPU, I'd need to upgrade the CPU as well. And CPUs don't need to be upgraded as frequently as GPUs do.

Posted

The first thing I wanted to check up on with this solution was their so-called A* algorithm. Problem is, I have never heard of it. Any links?

"Some men see things as they are and say why?"
"I dream things that never were and say why not?"
- George Bernard Shaw

"Hope in reality is the worst of all evils because it prolongs the torments of man."
- Friedrich Nietzsche

 

"The amount of energy necessary to refute bull**** is an order of magnitude bigger than to produce it."

- Some guy 

Posted (edited)
The problem with integrated solutions when it comes to gaming is that gamers usually want to upgrade a few components well before the motherboard needs to be replaced. So I doubt hardcore gamers will see it as viable.

 

That is unless the rate at which new hardware gets released is significantly lowered, although I don't see that happening anytime soon. Mostly because that would cut into the profit margins of Nvidia and ATI.

 

At least that's how I see things. Personally, I prefer the modular approach: if something breaks down, I'd rather not have to replace everything. When I recently helped upgrade my mother's computer I did go with an integrated solution, but she has no performance demands whatsoever, so it hardly relates to gamers.

:thumbsup:

 

I don't think it'd be integrated into the motherboard, but rather the CPU chip itself. If motherboards still come with expansion slots, it'd be less restrictive.

 

As for A*, it's pretty much the standard pathfinding algorithm. The funny thing is that they say they don't use heuristics, which means they can't be using A*, since A* is by definition heuristic search: it relies on a heuristic estimate of the remaining distance to decide which node to expand next.
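For anyone who hasn't run into A* before, here's a minimal sketch of it on a 4-connected grid, using Manhattan distance as the heuristic. All names here are illustrative, not taken from any shipping engine:

```python
import heapq

def astar(grid, start, goal):
    """A* pathfinding on a 4-connected grid of 0 (free) / 1 (blocked) cells.
    Manhattan distance serves as the admissible heuristic h(n)."""
    def h(p):
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_heap = [(h(start), 0, start)]   # entries are (f = g + h, g, node)
    g_cost = {start: 0}
    parent = {start: None}
    while open_heap:
        f, g, node = heapq.heappop(open_heap)
        if node == goal:
            # Reconstruct the path by walking parents back to the start.
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1  # uniform step cost of 1 per move
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    parent[(nr, nc)] = node
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None  # goal unreachable

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))
# [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0)]
```

The heuristic is the whole point: it steers the search toward the goal so far fewer nodes get expanded than with blind search, which is why "no heuristics" and "A*" don't go together.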

Edited by alanschu
