I disagree. Why would AI be a RAM hog? The two big concerns are always time and space, and plenty of algorithms have been designed to run AI searches in linear space. They still carry the exponential factor with respect to time, though. There are many ways to trim branches of the search tree so that they are never stored in memory, but you'll still eventually have to traverse them, so the time cost comes into play either way.
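To make that concrete, here's a rough sketch of a depth-limited DFS in Python. The successors(state) function and the goal test are hypothetical stand-ins for whatever move generator your problem has, not any real library. The only thing the search ever stores is the current path, so memory is linear in the depth, but the number of nodes it visits still grows exponentially with that depth:

    def depth_limited_dfs(state, is_goal, successors, limit, path=None):
        # Only the current path is kept in memory: space is O(depth).
        path = path or [state]
        if is_goal(state):
            return path
        if limit == 0:
            return None                  # cut off: don't descend any further
        for nxt in successors(state):    # successors() yields neighbor states
            if nxt in path:              # cheap cycle check along the current path
                continue
            path.append(nxt)
            found = depth_limited_dfs(nxt, is_goal, successors, limit - 1, path)
            if found:
                return found
            path.pop()                   # backtrack, freeing that node again
        return None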
I'll admit I'm no expert, but algorithms such as IDA* were created because running out of memory is much, much worse than running out of time. A standard DFS keeps memory usage linear, but it runs the risk of heading down an unbounded branch and never finding a solution. IDA* ensures that other branches get checked, but at the cost of re-expanding nodes multiple times. For most useful algorithms, it seems that time explodes much faster than the space requirements do. Space requirements can also be reduced by an appropriate heuristic, although, depending on the heuristic, that calculation itself can be expensive time-wise.
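For anyone curious what IDA* actually looks like, here's a minimal sketch under the same assumptions: a hypothetical successors(state) that yields (neighbor, step_cost) pairs and an admissible heuristic h(state). It just runs a cost-bounded DFS over and over with a growing bound, which is exactly why memory stays linear in the path length while nodes get re-expanded on every deepening pass:

    import math

    def ida_star(start, is_goal, successors, h):
        threshold = h(start)

        def search(path, g, threshold):
            state = path[-1]
            f = g + h(state)
            if f > threshold:
                return f             # report the smallest f that exceeded the bound
            if is_goal(state):
                return path          # found a solution; the path is the answer
            minimum = math.inf
            for nxt, cost in successors(state):
                if nxt in path:      # cycle check along the current path only
                    continue
                path.append(nxt)
                result = search(path, g + cost, threshold)
                if isinstance(result, list):
                    return result
                minimum = min(minimum, result)
                path.pop()
            return minimum

        while True:
            result = search([start], 0, threshold)
            if isinstance(result, list):
                return result        # solution path
            if result == math.inf:
                return None          # no solution exists
            threshold = result       # raise the bound and search again from scratch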
I actually emailed a professor at my university, Jonathan Schaeffer (A quick bio). He feels that AI will continue to be pushed to the back burner unless someone makes an add-on card for AI purposes.