Such a solution might be useful in small devices (palmtops and the like), possibly even in networked workstations for uniformity's sake, but it's hardly attractive for high-end gaming and graphics. One of the articles I read about the merger did acknowledge that.
But as far as physics cards and other add-in "helper" cards go, they do seem like candidates for all-in-one integration onto the graphics card itself. Until that happens, the technology won't become mainstream, because it isn't enough of an advance to justify the cost or the space requirement. (I think it's Nvidia's turn to roll.)
When I say "high end integrated solution," I'm talking about ATi putting their high end graphics accelerators right on chip. Whether or not that's good enough for us is one thing, but it is significantly better than the integrated solution that Intel is using now.
Furthermore, if higher-quality integrated graphics chips become more popular, the "lowest common denominator" that games target gets raised.