30 November 2012
Time: 2:30pm
Venue: GO Jones LG1
As high-performance computing reaches for the "exascale", the reliability of components and overall power consumption are becoming major obstacles to progress. A number of research groups have identified strict adherence to "bit-reproducibility" at the hardware level as both costly in terms of power and requiring a design with little tolerance for faults. A new approach is being investigated which allows a trade-off between power consumption and/or performance, on the one hand, and accuracy on the other. Work is under way at the hardware, compiler and algorithm levels to explore the implications of such an approach. I will present some background on the work of computer scientists on "stochastic" or approximate hardware, and discuss some of the implications for the simulation of systems of ordinary and partial differential equations should such hardware become more common. Simple test systems include the Lorenz '96 ODE system and Burgers' equation.
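The kind of experiment the abstract describes can be sketched numerically. Below is a minimal illustration (my own sketch, not the speaker's code) that integrates the Lorenz '96 system twice from identical initial conditions: once in double precision, and once with every arithmetic stage rounded to half precision (float16) as a crude stand-in for an approximate arithmetic unit. The chaotic dynamics amplify the rounding differences, which is the effect one would need to quantify on real approximate hardware.

```python
import numpy as np

def lorenz96_rhs(x, F=8.0):
    # dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, with cyclic indices
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def step_rk4(x, dt, reduced=False):
    # One RK4 step; if reduced, round every stage through float16
    # to mimic low-precision ("approximate") arithmetic.
    r = (lambda v: v.astype(np.float16).astype(np.float64)) if reduced \
        else (lambda v: v)
    k1 = r(lorenz96_rhs(x))
    k2 = r(lorenz96_rhs(x + 0.5 * dt * k1))
    k3 = r(lorenz96_rhs(x + 0.5 * dt * k2))
    k4 = r(lorenz96_rhs(x + dt * k3))
    return r(x + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4))

# Identical initial conditions: N = 40 variables, small perturbation
x0 = 8.0 * np.ones(40)
x0[0] += 0.01
hi, lo = x0.copy(), x0.copy()
for _ in range(500):                # integrate to t = 2.5 with dt = 0.005
    hi = step_rk4(hi, 0.005)
    lo = step_rk4(lo, 0.005, reduced=True)
print(np.max(np.abs(hi - lo)))      # divergence induced by reduced precision
```

The forcing F = 8 and the float16 truncation scheme here are illustrative choices; a study of the kind described would instead model the specific error characteristics of the stochastic hardware under consideration.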