Oh believe me, I am well aware of this. Press releases from both Microsoft and Sony seem to be pressing the issue of bigger and better graphics.
I don't have too much experience with workstation apps. I've heard that much of the insane memory requirement comes from high-end photo/video editing. There's also 3D rendering with CAD files, though I'm not sure how working in CAD differs from playing a game. I do find it kind of interesting that much of the content creation for games is done on hardware that tends to outstrip the hardware the game is ultimately played on.
I'm not sure what the differences are between content creation and simply viewing the content, though. I know I use a computer that is significantly faster than my software needs to run at an acceptable speed, because I spend a lot of time doing repeated tasks such as compiling. Memory usage goes up because I often load the program up in debug mode and cycle through source code in my IDE... all while running the program. I wonder if stuff like this is part of the reason RAM requirements go up for workstation applications. Content creation does seem more involved than merely viewing that content.
I guess I can't really dispute that they could use computationally intense algorithms that require a lot of memory. It's just that in my experience, things become computationally expensive *because* of memory requirements. Memory is much more finite than time, so we find ways to limit memory usage at the cost of computation time. Since you mentioned anti-aliasing: rip-maps aren't popular precisely because of their memory requirements, so renderers generate the tables dynamically instead, which is more computationally expensive.
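To put a rough number on that trade-off, here's a back-of-envelope sketch (my own toy calculation, not anything from an actual renderer) comparing the storage of a rip-map, which keeps every combination of horizontal and vertical downsampling levels, against an ordinary mipmap chain:

```python
def _chain(n):
    # All power-of-two reductions of a dimension, down to 1: 1024, 512, ..., 1
    out = []
    while n >= 1:
        out.append(n)
        if n == 1:
            break
        n //= 2
    return out

def mipmap_bytes(w, h, bpp=4):
    # Mipmap: each level halves BOTH dimensions at once.
    total = 0
    while True:
        total += w * h * bpp
        if w == 1 and h == 1:
            break
        w, h = max(w // 2, 1), max(h // 2, 1)
    return total

def ripmap_bytes(w, h, bpp=4):
    # Rip-map: every combination of horizontal and vertical reductions.
    return sum(a * b * bpp for a in _chain(w) for b in _chain(h))

base = 1024 * 1024 * 4  # bytes for the full-resolution texture
print(mipmap_bytes(1024, 1024) / base)  # about 1.33x the base texture
print(ripmap_bytes(1024, 1024) / base)  # about 4x the base texture
```

So a rip-map costs roughly 4x the base texture versus about 1.33x for a mipmap chain, which is exactly the kind of memory-for-computation trade I mean.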
Well, if the assumption is that half of total memory goes to graphics anyway, then I guess you could still have the SPEs work on geometry while the RSX's memory handles other things. The Cell can still write to graphics memory very fast, and the RSX can read from main memory very fast.
As for the other uses of the SPEs, I think it might be our previous experience that is clouding the issue. Historically, floating-point operations have been significantly slower than integer operations, so many floating-point calculations are abstracted into integer ops to speed them up. If the Cell is really as much of an FP beast as Sony and IBM say it is, then I'm curious whether the abstractions we make will still be as necessary (the abstractions may also limit memory requirements as a latent benefit).
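By "abstracting into integer ops" I mean things like fixed-point arithmetic. A minimal sketch of the idea (Q16.16 format here is just my example choice, nothing specific to the Cell):

```python
FRAC_BITS = 16
ONE = 1 << FRAC_BITS  # 1.0 in Q16.16 fixed point

def to_fixed(x: float) -> int:
    # Represent a real number as a scaled integer.
    return int(round(x * ONE))

def to_float(x: int) -> float:
    return x / ONE

def fixed_mul(a: int, b: int) -> int:
    # Multiply two fixed-point numbers using only integer ops,
    # then shift back down to undo the doubled scale factor.
    return (a * b) >> FRAC_BITS

a, b = to_fixed(1.5), to_fixed(2.25)
print(to_float(fixed_mul(a, b)))  # 3.375
```

On hardware where integer math is much faster than FP, this kind of trick pays off; on a genuine FP monster it might just be wasted effort.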
In the old thread (before you even registered here), I suggested that AI could possibly benefit, since calculating continuous random variables could make the decisions at nodes more accurate and perhaps more dynamic. My experience with ORTS (you can see me, Allan Schumacher; I worked on it last summer) had me breaking the map down into triangles (and not just the 3D rendering triangles, so I couldn't reuse that information) in order to reduce the search space for A* pathfinding. Currently, most games use a tile-based abstraction, and I believe BioWare mentioned they plan on going back to a tile-based pathfinding system for Dragon Age, from what I think was a hexagon-based system in Jade Empire and maybe KOTOR. A hurdle we found with breaking the world into triangles is that they weren't uniform (we'd pick the ends of line segments and connect them to the corners of other line segments). Determining location became more problematic, especially if a triangle was very large (for instance, a square room would be cut into an X as the corners connected to each other, giving you four large triangles). The situation ended up becoming more floating-point intensive, and we were starting to see diminishing returns in performance (it was also complicated... so maybe the PS3 is teh suck!).
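To give a flavour of why "determining location" gets floating-point heavy: the standard way to check which triangle a unit is standing in is a sign-of-cross-products test, run against candidate triangles. This is a textbook version, not our actual ORTS code:

```python
def cross(ox, oy, ax, ay, bx, by):
    # z-component of the cross product (A - O) x (B - O)
    return (ax - ox) * (by - oy) - (ay - oy) * (bx - ox)

def point_in_triangle(px, py, a, b, c):
    # The point is inside (or on the edge) iff it lies on the same side
    # of all three edges, i.e. the three cross products share a sign.
    d1 = cross(a[0], a[1], b[0], b[1], px, py)
    d2 = cross(b[0], b[1], c[0], c[1], px, py)
    d3 = cross(c[0], c[1], a[0], a[1], px, py)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)

print(point_in_triangle(1.0, 1.0, (0, 0), (4, 0), (0, 4)))  # True
print(point_in_triangle(5.0, 5.0, (0, 0), (4, 0), (0, 4)))  # False
```

Six multiplies per triangle, every time any unit needs locating, against non-uniform triangles you can't index with simple integer division the way you can with tiles. With a tile grid, "which cell am I in?" is just two divides.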
We also analyzed combat AI, specifically concentration of fire. We solved n vs. 1, but deciding which units our guys should concentrate their firepower on in n vs. many was much more difficult. We ended up using a simple Monte Carlo simulation, iterating over the possibilities to see if we could find a pattern in the results and perhaps some sort of equation (and maybe even a proof!). We played around with ratios of firepower, cooldown, and hitpoints (we didn't want to consider that some units might do different damage depending on the target). As units got added to or removed from battle, we did a lot of ratio calculations trying to determine the most efficient way to kill the enemy units, which ended up involving a fair number of repeated floating-point calculations.
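Here's a toy Monte Carlo in the same spirit (my own illustration, not the ORTS code): it estimates how much "threat exposure" our side accumulates, i.e. how many enemy-ticks of return fire we're exposed to, when we focus fire on the weakest target versus spreading shots randomly:

```python
import random

def threat_exposure(enemy_hp, attackers, damage, hit_chance, focus, rng):
    # One tick = one volley from our side. Every enemy still alive at the
    # end of a tick contributes one "threat point" (a chance to shoot back).
    hp = list(enemy_hp)
    exposure = 0
    while any(h > 0 for h in hp):
        for _ in range(attackers):
            alive = [i for i, h in enumerate(hp) if h > 0]
            if not alive:
                break
            # Focus fire: everyone shoots the lowest-HP survivor.
            # Spread fire: everyone picks a random living target.
            target = min(alive, key=lambda i: hp[i]) if focus else rng.choice(alive)
            if rng.random() < hit_chance:
                hp[target] -= damage
        exposure += sum(1 for h in hp if h > 0)
    return exposure

def estimate(focus, trials=400):
    rng = random.Random(1)  # fixed seed for repeatability
    return sum(threat_exposure([30, 30, 30], 4, 5, 0.8, focus, rng)
               for _ in range(trials)) / trials

print(estimate(focus=True), estimate(focus=False))
```

Averaged over enough trials, focus fire comes out ahead because dead enemies stop shooting sooner, which matches the intuition we were trying to formalize. Once you add real ratios of firepower, cooldown, and hitpoints, the floating-point work piles up fast.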
I do know that robotics AI has been hindered by limited floating-point power. Interesting Read
On a final note, anything that could benefit from statistics (particularly of the random-variable kind) could find itself benefiting from the SPEs. We're just familiar with physics and graphics right now because those are the areas where we can't really "cheat" and abstract things into integers.