It seems that GPU computing is becoming ubiquitous: the tools keep improving, the range of applications keeps widening, and the hardware keeps getting faster.
CPUs are steadily gaining in performance too; it's a steadily rising curve. Is that good news, though? It depends on how you look at it. It's good that speed and efficiency keep improving, but CPUs appear to be improving much more slowly than GPUs. GPU computational power for suitable problems (like number crunching) was a priori much higher to start with, but what's really important is that GPUs are moving much faster, i.e. their speedup curve is much steeper than that of CPUs.
For about $500 you can get an Nvidia GTX 480 right now, with 400+ cores and very fast memory. GPUs in general also require much less space and power, which makes them far more efficient for heavy number-crunching workloads than CPUs. Of course, GPUs are quite limited in their capabilities and are really only suitable for specific kinds of tasks, but people are employing the power of highly parallel GPU computing with lightweight threads in more and more areas.
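To give a flavor of what those lightweight threads look like, here is a minimal CUDA sketch (assuming the CUDA toolkit is installed; error checking omitted for brevity, and the array size is an arbitrary choice for illustration). It launches over a million threads, each of which squares exactly one array element:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Each thread handles exactly one element -- threads are cheap on a GPU.
__global__ void square(float *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] = data[i] * data[i];
}

int main() {
    const int n = 1 << 20;               // one million floats
    size_t bytes = n * sizeof(float);

    float *host = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) host[i] = (float)i;

    float *dev;
    cudaMalloc(&dev, bytes);
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);

    // 256 threads per block; enough blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    square<<<blocks, threads>>>(dev, n);
    cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);

    printf("host[3] = %f\n", host[3]);   // expect 9.0
    cudaFree(dev);
    free(host);
    return 0;
}
```

The point is that, unlike CPU threads, spawning a million of these costs almost nothing; the hardware schedules them in bulk.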
There is another caveat with GPUs: the slow bus connection. The bus is too slow for the speed of the GPU, and there is nothing you can do about it at the moment. One thing you can do, since the GPU has so much parallel computing power to spare, is compress the data passed between the GPU and the rest of the system. Text can often be compressed at roughly a 10:1 ratio, so in the best case compression can raise the effective throughput of the bus by a factor of ten. Of course, compression adds overhead of its own, so you should run an experiment and see whether this is actually beneficial in your setup.
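As a baseline for such an experiment, you can measure the raw bus throughput yourself. A minimal sketch using CUDA events (buffer size and names are arbitrary; a real test would also try pinned memory, which is typically faster):

```cuda
#include <cstdio>
#include <cstring>
#include <cuda_runtime.h>

int main() {
    const size_t bytes = 256u << 20;     // 256 MB test buffer
    char *host = (char *)malloc(bytes);
    memset(host, 0, bytes);
    char *dev;
    cudaMalloc(&dev, bytes);

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    // Time a plain host-to-device copy over the bus.
    cudaEventRecord(start);
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    printf("Host->device: %.2f GB/s\n",
           (bytes / (ms / 1000.0)) / 1e9);

    // If sending compressed data and decompressing on the GPU beats this
    // figure end to end, compression is a win in your setup.
    cudaFree(dev);
    free(host);
    return 0;
}
```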
Overall, though, it seems like a pretty exciting area at the moment, and the true technorati should definitely be paying attention to it.