September 2, 2006 Seeing how PhysX did, and how this would have even less noticeable benefits, I sense failure in this product's future. But... maybe if it was cheap it could do better.
September 2, 2006 Not sure why this is "LOL" because essentially the entire Games Group at my University feels that this is the only way AI will get a fair shake in commercial games. Edited September 2, 2006 by alanschu
September 2, 2006 There will be specialist cards for everything soon. Sheesh. I only have so many slots in my motherboard and they are all used up. It just seems a bit ridiculous.
September 2, 2006 Thanks man! I downloaded Go from there. :cool:
September 2, 2006 I am saying LOL at the sudden rise of feature-specific cards.
September 2, 2006 Ah. As soon as I heard about PhysX, I figured an AI one wouldn't be too far away.
September 2, 2006 If it works as advertised, there are a number of games that can use it right now. Oblivion is one of them. RAI might look good on paper but wow, some of the things the NPCs do make very little sense.
September 2, 2006 "If it works as advertised, there are a number of games that can use it right now. Oblivion is one of them. RAI might look good on paper but wow, some of the things the NPCs do make very little sense." Accelerated AI wouldn't make the AI in Oblivion better, though it might take a bit of the stress off the CPU. A poor implementation is just that: programming an AI like RAI was hyped to be would probably be a lot harder than finding the resources to make it run.
September 2, 2006 AI programming, that's a job I wouldn't touch with a three-meter pole. Crikey.
September 2, 2006 "There will be specialist cards for everything soon. Sheesh. I only have so many slots in my motherboard and they are all used up. It just seems a bit ridiculous." Don't worry, Hades, within 2-3 years it's all going to be on-chip: graphics, physics, AI, sound, network, wireless, memory controllers, hard disk controllers, everything. The only thing left on your motherboard will be power supply capacitors and memory slots.
September 2, 2006 Funny you should mention that, because I read an article talking about ATi/AMD creating a high end integrated solution.
September 3, 2006 "Funny you should mention that, because I read an article talking about ATi/AMD creating a high end integrated solution." Such a solution might be useful in small devices (palmtops and the like), possibly even in networked workstations for uniformity, but hardly attractive for high-end gaming and graphics. One of the articles I read about the merger did acknowledge that. But as far as physics, etc. cards go, these additional 'helper' cards do seem to be a candidate for all-in-one integration on the graphics card itself. Until that happens, the technology won't become mainstream, because it's not enough of an advance to justify the cost or space requirement. (I think it's Nvidia's turn to roll.)
September 3, 2006 That would kick major ass, Alan. I wonder if Intel and Nvidia should join forces like ATI and AMD. Hmmmm...
September 3, 2006 "Such a solution might be useful in small devices (palmtops and the like), possibly even in networked workstations for uniformity, but hardly attractive for high-end gaming and graphics. One of the articles I read about the merger did acknowledge that. But as far as physics, etc. cards go, these additional 'helper' cards do seem to be a candidate for all-in-one integration on the graphics card itself. Until that happens, the technology won't become mainstream, because it's not enough of an advance to justify the cost or space requirement. (I think it's Nvidia's turn to roll.)" When I say "high end integrated solution," I'm talking about ATi putting their high-end graphics accelerators right on the chip. Whether or not that's good enough for us is one thing, but it is significantly better than the integrated solution that Intel is using now. Furthermore, if higher-quality integrated graphics chips become more popular, the "lowest common denominator" in games gets raised.
September 3, 2006 The problem with integrated solutions with regard to gaming is that gamers usually want to upgrade a few components well before the motherboard needs to be replaced. So I doubt the hardcore gamers will see it as viable. That is unless the rate at which new hardware gets released is significantly lowered, although I don't see that happening anytime soon, mostly because that would cut into the profit margins of Nvidia and ATI. At least that's how I see things. Personally I prefer the modular approach: if something breaks down, I'd rather not have to replace everything. Although when I recently helped upgrade my mother's computer, I went with an integrated solution, but she has no demands whatsoever on performance, so it hardly relates to gamers.
September 3, 2006 Well, I can see them making an integrated CPU/GPU that will fit in the new AM2/AM3 socket. When one upgrades, all he has to worry about is one chip.
September 3, 2006 That would mean the CPU would have to compete with the GPU for the memory bus, as the GPU is very memory-intensive.
September 3, 2006 It would also mean that if I wanted to upgrade my GPU I'd need to upgrade the CPU as well. And CPUs do not need to be upgraded as frequently as the GPU does.
September 3, 2006 The first thing I wanted to check about this solution was their so-called A* algorithm. Problem is, I have never heard of it. Any links?
September 3, 2006 "The first thing I wanted to check about this solution was their so-called A* algorithm. Problem is, I have never heard of it. Any links?" http://en.wikipedia.org/wiki/A-star_search_algorithm
September 3, 2006 "The problem with integrated solutions with regard to gaming is that gamers usually want to upgrade a few components well before the motherboard needs to be replaced. So I doubt the hardcore gamers will see it as viable." I don't think it'd be integrated into the motherboard, but rather the CPU chip itself. If motherboards still come with expansion slots, it'd be less restrictive. As for A*, it's pretty much the pathfinding algorithm. The funny thing is that they say they don't use heuristics, which means they aren't using A*: A* is heuristic search, and if you drop the heuristic you're left with Dijkstra's algorithm. Edited September 3, 2006 by alanschu
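For anyone following the A* link above, here is a minimal sketch of the algorithm in Python. The 4-connected grid, the Manhattan-distance heuristic, and all the names here are my own choices for illustration, not anything the card vendor has described. Note that setting the heuristic to zero turns the same loop into Dijkstra's algorithm, which is the point made in the post above: the heuristic is what makes A* A*.

    import heapq

    def manhattan(a, b):
        # Heuristic: Manhattan distance, admissible on a 4-connected grid.
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    def a_star(grid, start, goal):
        """Shortest path on a grid of 0 (open) / 1 (wall) cells.
        Returns a list of (row, col) tuples, or None if no path exists."""
        rows, cols = len(grid), len(grid[0])
        open_heap = [(manhattan(start, goal), 0, start)]  # (f = g + h, g, node)
        came_from = {}
        g_score = {start: 0}
        while open_heap:
            f, g, node = heapq.heappop(open_heap)
            if node == goal:
                # Walk the came_from chain back to the start.
                path = [node]
                while node in came_from:
                    node = came_from[node]
                    path.append(node)
                return path[::-1]
            if g > g_score.get(node, float("inf")):
                continue  # stale heap entry, a cheaper route was found later
            r, c = node
            for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                    tentative = g + 1  # every step costs 1 on this grid
                    if tentative < g_score.get((nr, nc), float("inf")):
                        g_score[(nr, nc)] = tentative
                        came_from[(nr, nc)] = node
                        heapq.heappush(
                            open_heap,
                            (tentative + manhattan((nr, nc), goal), tentative, (nr, nc)),
                        )
        return None  # goal unreachable

    grid = [[0, 0, 0, 0],
            [1, 1, 0, 1],
            [0, 0, 0, 0]]
    print(a_star(grid, (0, 0), (2, 0)))  # routes around the walls via (1, 2)

The heuristic is also why an add-in card could plausibly accelerate this: each node expansion is cheap and independent, so the expensive part is doing millions of them per frame for many agents.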