For years, financial firms, especially Wall Street giants, have been experimenting with graphics processing units (GPUs), the chips behind video games and other demanding graphics workloads, to run high-speed Monte Carlo simulations for analyzing risk and pricing bonds. GPUs can perform many calculations simultaneously, making them well suited to the simulations used to predict risk and performance over time for a variety of financial products and portfolios.
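To make the appeal concrete, here is a minimal, CPU-only sketch of the kind of Monte Carlo calculation involved: pricing a European call option under Black-Scholes dynamics. The function name and parameters are illustrative, not NAG's API; the point is that every simulated path is independent, which is exactly the structure a GPU can exploit by running thousands of paths at once.

```python
import math
import random

def mc_european_call(spot, strike, rate, vol, maturity, n_paths, seed=42):
    """Price a European call by Monte Carlo under Black-Scholes dynamics.

    Each path needs only its own random draw, so paths are fully
    independent -- the property that makes this workload parallelize
    well on GPUs.
    """
    rng = random.Random(seed)
    drift = (rate - 0.5 * vol ** 2) * maturity
    diffusion = vol * math.sqrt(maturity)
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)  # one standard normal draw per path
        terminal = spot * math.exp(drift + diffusion * z)
        total += max(terminal - strike, 0.0)
    # Discount the average payoff back to today.
    return math.exp(-rate * maturity) * total / n_paths

# Illustrative parameters: at-the-money call, 5% rate, 20% vol, 1 year.
price = mc_european_call(100.0, 100.0, 0.05, 0.2, 1.0, 200_000)
```

On a GPU, the loop body would run as one thread per path; the serial version above only shows the math.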
The challenge with GPUs has always been that they're hard to program: existing code must be restructured to run efficiently across a GPU's many parallel cores.
Today, the Numerical Algorithms Group (NAG), a not-for-profit numerical software research organization, released a set of numeric routines designed to run on GPUs. They're geared toward financial quantitative analysts working on options pricing, risk analysis, and algorithmic trading.
The latest release of NAG’s code contains routines for quasi- and pseudo-random number generators, Brownian bridge construction, and associated statistical distributions.
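As a rough illustration of what a Brownian bridge routine does (this sketch is generic, not NAG's implementation), the construction below builds a Brownian path by fixing the terminal value first and then recursively sampling midpoints conditional on their neighbors. Paired with quasi-random number generators, this ordering lets the best-distributed low-discrepancy coordinates drive the most important dimensions of the path.

```python
import math
import random

def bridge_midpoint(w_left, w_right, t_left, t_mid, t_right, rng):
    """Sample W(t_mid) given W(t_left) and W(t_right) for standard Brownian motion.

    Conditional on the endpoints, the midpoint is normal with mean equal
    to the linear interpolation of the endpoints and variance
    (t_mid - t_left) * (t_right - t_mid) / (t_right - t_left).
    """
    span = t_right - t_left
    mean = w_left + (t_mid - t_left) / span * (w_right - w_left)
    var = (t_mid - t_left) * (t_right - t_mid) / span
    return mean + math.sqrt(var) * rng.gauss(0.0, 1.0)

def bridge_path(t_end, levels, rng):
    """Build a Brownian path on a dyadic grid by recursive bisection."""
    # Terminal value first: W(T) ~ N(0, T), starting from W(0) = 0.
    path = {0.0: 0.0, t_end: math.sqrt(t_end) * rng.gauss(0.0, 1.0)}
    segments = [(0.0, t_end)]
    for _ in range(levels):
        refined = []
        for a, b in segments:
            m = 0.5 * (a + b)
            path[m] = bridge_midpoint(path[a], path[b], a, m, b, rng)
            refined += [(a, m), (m, b)]
        segments = refined
    return [path[t] for t in sorted(path)]

points = bridge_path(1.0, 3, random.Random(0))
```

With `levels` bisections the path has 2**levels + 1 points; each level doubles the resolution without disturbing the values already sampled.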