Bokishi Posted September 2, 2006
LOL
Current 3DMark
LostStraw Posted September 2, 2006
Seeing how PhysX did, and how this would have even less noticeable benefits, I sense failure in this product's future. But... maybe if it was cheap it could do better.
Diamond Posted September 2, 2006
At least it is better than Killer NIC.
alanschu Posted September 2, 2006 (edited)
Not sure why this is "LOL" because essentially the entire Games Group at my University feels that this is the only way AI will get a fair shake in commercial games.
Edited September 2, 2006 by alanschu
Judge Hades Posted September 2, 2006
There will be specialist cards for everything soon. Sheesh. I only have so many slots in my motherboard and they are all used up. It just seems a bit ridiculous.
kirottu Posted September 2, 2006
Thanks man! I downloaded Go from there. :cool:
This post is not to be enjoyed, discussed, or referenced on company time.
alanschu Posted September 2, 2006
Hahahaha. Enjoy.
Bokishi Posted September 2, 2006 (Author)
I am saying LOL at the sudden rise of feature-specific cards.
Current 3DMark
alanschu Posted September 2, 2006
Ah. As soon as I heard about PhysX, I figured an AI one wouldn't be too far away.
Judge Hades Posted September 2, 2006
If it works as advertised, there are a number of games that can use it right now. Oblivion is one of them. RAI might look good on paper but wow, some of the things the NPCs do make very little sense.
LostStraw Posted September 2, 2006
Quote (Judge Hades): If it works as advertised, there are a number of games that can use it right now. Oblivion is one of them. RAI might look good on paper but wow, some of the things the NPCs do make very little sense.
Accelerated AI wouldn't make the AI in Oblivion better; it might take a bit of the stress off the CPU, though. A poor implementation is just that. Programming an AI like RAI was hyped to be would probably be a lot harder than finding the resources to make it run.
alanschu Posted September 2, 2006
I'm not sure I agree with their philosophy on heuristics.
Kaftan Barlast Posted September 2, 2006
AI programming, that's a job I wouldn't touch with a three-meter pole. Crikey.
DISCLAIMER: Do not take what I write seriously unless it is clearly, and in no uncertain terms, declared by me to be meant in a serious and non-humoristic manner. If there is no clear indication, assume the post is written in jest. This notification is meant very seriously and its purpose is to avoid misunderstandings and the consequences thereof. Furthermore, I can not be held accountable for anything I write on these forums, since the idea of taking serious responsibility for my unserious actions is an oxymoron in itself. Important: as the following sentence contains many naughty words I warn you not to read it under any circumstances; botty, knickers, wee, erogenous zone, psychiatrist, clitoris, stockings, bosom, poetry reading, dentist, fellatio and the department of agriculture.
"I suppose outright stupidity and complete lack of taste could also be considered points of view."
angshuman Posted September 2, 2006
Quote (Judge Hades): There will be specialist cards for everything soon. Sheesh. I only have so many slots in my motherboard and they are all used up. It just seems a bit ridiculous.
Don't worry, Hades, within 2-3 years it's all going to be on-chip: graphics, physics, AI, sound, network, wireless, memory controllers, hard disk controllers, everything. The only thing left on your motherboard will be power supply capacitors and memory slots.
alanschu Posted September 2, 2006
Funny you should mention that, because I read an article talking about ATi/AMD creating a high end integrated solution.
kalimeeri Posted September 3, 2006
Quote (alanschu): Funny you should mention that, because I read an article talking about ATi/AMD creating a high end integrated solution.
Such a solution might be useful in small devices (palmtops and the like), possibly even in networked workstations for uniformity, but hardly attractive for high-end gaming and graphics. One of the articles I read about the merger did acknowledge that. But as far as physics, etc. cards go, these additional 'helper' cards do seem to be a candidate for all-in-one integration on the graphics card itself. Until that happens, the technology won't become mainstream, because it's not enough of an advance to justify the cost or space requirement. (I think it's Nvidia's turn to roll.)
Judge Hades Posted September 3, 2006
That would kick major ass, Alan. I wonder if Intel and Nvidia should join forces like ATI and AMD. Hmmmm...
alanschu Posted September 3, 2006
Quote (alanschu): Funny you should mention that, because I read an article talking about ATi/AMD creating a high end integrated solution.
Quote (kalimeeri): Such a solution might be useful in small devices (palmtops and the like), possibly even in networked workstations for uniformity, but hardly attractive for high-end gaming and graphics. One of the articles I read about the merger did acknowledge that. But as far as physics, etc. cards go, these additional 'helper' cards do seem to be a candidate for all-in-one integration on the graphics card itself. Until that happens, the technology won't become mainstream, because it's not enough of an advance to justify the cost or space requirement. (I think it's Nvidia's turn to roll.)
When I say "high end integrated solution," I'm talking about ATi putting their high end graphics accelerators right on chip. Whether or not that's good enough for us is one thing, but it is significantly better than the integrated solution that Intel is using now. Furthermore, if higher quality integrated graphics chips become more popular, the "lowest common denominator" in games gets raised.
Spider Posted September 3, 2006
The problem with integrated solutions in regards to gaming is that gamers usually want to upgrade a few components well before the motherboard needs to be replaced. So I doubt the hardcore gamers will see it as viable. That is, unless the rate at which new hardware gets released is significantly lowered, although I don't see that happening anytime soon, mostly because that would cut into the profit margins of Nvidia and ATI. At least that's how I see things.
Personally I prefer the modular approach; if something breaks down, I'd rather not have to replace everything. When I recently helped upgrade my mother's computer I did go with an integrated solution, but she has no demands whatsoever on performance, so it hardly relates to gamers.
Judge Hades Posted September 3, 2006
Well, I can see them making an integrated CPU/GPU that will fit in the new AM2/AM3 socket. When one upgrades, all he has to worry about is one chip.
Diamond Posted September 3, 2006
That would mean the CPU would have to compete with the GPU for the memory bus, as the GPU is very memory intensive.
Spider Posted September 3, 2006
It would also mean that if I wanted to upgrade my GPU I'd need to upgrade the CPU as well. And CPUs do not need to be upgraded as frequently as the GPU does.
Meshugger Posted September 3, 2006
The first thing I wanted to check up on with this solution was their so-called A* algorithm. Problem is, I have never heard of it. Any links?
"Some men see things as they are and say why? I dream things that never were and say why not?" - George Bernard Shaw
"Hope in reality is the worst of all evils because it prolongs the torments of man." - Friedrich Nietzsche
"The amount of energy necessary to refute bull**** is an order of magnitude bigger than to produce it." - Some guy
LostStraw Posted September 3, 2006
Quote (Meshugger): The first thing I wanted to check up on with this solution was their so-called A* algorithm. Problem is, I have never heard of it. Any links?
http://en.wikipedia.org/wiki/A-star_search_algorithm
alanschu Posted September 3, 2006 (edited)
Quote (Spider): The problem with integrated solutions in regards to gaming is that gamers usually want to upgrade a few components well before the motherboard needs to be replaced. So I doubt the hardcore gamers will see it as viable. That is, unless the rate at which new hardware gets released is significantly lowered, although I don't see that happening anytime soon, mostly because that would cut into the profit margins of Nvidia and ATI. At least that's how I see things. Personally I prefer the modular approach; if something breaks down, I'd rather not have to replace everything. When I recently helped upgrade my mother's computer I did go with an integrated solution, but she has no demands whatsoever on performance, so it hardly relates to gamers.
I don't think it'd be integrated into the motherboard, but rather the CPU chip itself. If motherboards still come with expansion slots, it'd be less restrictive.
As for A*, it's pretty much the pathfinding algorithm. The funny thing is that they say they don't use heuristics, which means they aren't using A* (since A* is a heuristic search: it relies on a heuristic estimate of the remaining distance).
Edited September 3, 2006 by alanschu
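For anyone following the A* discussion who hasn't seen the algorithm before, here is a minimal sketch of what it looks like in practice: a toy grid pathfinder in Python, assuming 4-way movement and Manhattan distance as the heuristic. The grid, start, and goal values are invented purely for illustration and have nothing to do with the card being discussed.

```python
import heapq

def a_star(grid, start, goal):
    """Minimal A* pathfinding on a 2D grid of 0 (open) / 1 (blocked) cells.

    Uses Manhattan distance as the heuristic, which is admissible for
    4-way movement, so the first time the goal is popped we have a shortest path.
    """
    def h(cell):
        # Heuristic: estimated remaining cost from this cell to the goal.
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_heap = [(h(start), 0, start)]  # entries are (f = g + h, g, cell)
    came_from = {}
    best_g = {start: 0}

    while open_heap:
        f, g, cell = heapq.heappop(open_heap)
        if cell == goal:
            # Reconstruct the path by walking back through came_from.
            path = [cell]
            while cell in came_from:
                cell = came_from[cell]
                path.append(cell)
            return list(reversed(path))
        if g > best_g.get(cell, float("inf")):
            continue  # stale heap entry; a cheaper route to this cell was found
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    came_from[(nr, nc)] = cell
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None  # no path exists

# Toy example: route around a wall from the top-left corner to the bottom-left.
grid = [
    [0, 0, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 0, 0],
]
print(a_star(grid, (0, 0), (2, 0)))
```

The heuristic function h is exactly the part alanschu is referring to: it guides the search toward the goal, so a product that claims to avoid heuristics entirely would by definition not be running A*.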