Volume 7, Issue 1
Large, Sparse Jacobian Matrices Computed Accurately and Efficiently with Automatic Differentiation
Brett Averick, Christian Bischof, Andreas Griewank, and Jorge Moré, Argonne National Laboratory; Alan Carle, Rice University
A team of researchers from Argonne National Laboratory and Rice University has successfully used automatic differentiation techniques to compute large, sparse Jacobian matrices. On a variety of test problems, automatic differentiation outperformed traditional methods in both speed and accuracy. The research dispelled the common misconception that automatic differentiation is unable to handle large problems effectively.
Two conventional numerical methods exist for computing Jacobian matrices: function differences and hand-coding of derivatives. The former suffers from possible inaccuracy, particularly if the problem is highly nonlinear. The latter is time-consuming and error-prone; a new coding effort is required whenever a function is modified.
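To illustrate the first approach, here is a minimal sketch of a forward-difference Jacobian approximation (the function `fd_jacobian`, the test function, and the step size are illustrative choices, not from the article). Each column costs one extra function evaluation, and the result carries a truncation error that grows with the nonlinearity of the function and depends delicately on the step size h:

```python
import numpy as np

def fd_jacobian(f, x, h=1e-6):
    """Forward-difference approximation of the Jacobian of f at x.

    Each column requires one additional evaluation of f; accuracy is
    limited by the O(h) truncation error and by cancellation for small h.
    """
    fx = np.asarray(f(x), dtype=float)
    J = np.empty((fx.size, x.size))
    for j in range(x.size):
        xh = np.array(x, dtype=float)
        xh[j] += h
        J[:, j] = (np.asarray(f(xh), dtype=float) - fx) / h
    return J

# Hypothetical example: f(x) = [x0*x1, sin(x0)] has the exact Jacobian
# [[x1, x0], [cos(x0), 0]], so the approximation error is easy to check.
f = lambda x: np.array([x[0] * x[1], np.sin(x[0])])
J = fd_jacobian(f, np.array([1.0, 2.0]))
```

Note that the approximation is only accurate to a few digits here; that gap is exactly what motivates the exact-derivative approach described next.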
Automatic differentiation overcomes these limitations. It relies on the fact that every function, no matter how complicated, is evaluated on a computer as a (potentially very long) sequence of elementary operations, such as additions and multiplications, and elementary functions, such as the trigonometric and exponential functions. Applying the chain rule to each elementary step in turn yields derivatives that are exact up to round-off.
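The idea can be sketched with dual numbers, a standard way to implement forward-mode automatic differentiation (this toy `Dual` class is an illustration of the principle, not the mechanism ADIFOR itself uses — ADIFOR transforms Fortran source code). Each elementary operation propagates both a value and a derivative, so the chain rule is applied automatically as the function is evaluated:

```python
import math

class Dual:
    """A (value, derivative) pair; each overloaded elementary operation
    applies the chain rule, yielding exact forward-mode derivatives."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

def sin(x):  # elementary function with its derivative rule
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

def exp(x):
    return Dual(math.exp(x.val), math.exp(x.val) * x.dot)

# Differentiate f(x) = x*sin(x) + exp(x) at x = 1 by seeding dot = 1;
# the derivative sin(1) + cos(1) + e comes out exact to machine precision.
x = Dual(1.0, 1.0)
y = x * sin(x) + exp(x)
```

Seeding `dot` with different unit vectors recovers one Jacobian column per evaluation, and exploiting sparsity lets many structurally independent columns share a single seed.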
By using ADIFOR (Automatic Differentiation of Fortran), the Argonne/Rice team was able to compute derivative information quickly and exactly. Automatic differentiation not only outperformed forward and central difference approximations in terms of accuracy and speed, but also proved to be comparable in speed and just as accurate as hand-coded derivatives. The results are of special interest in applications involving sensitivity analysis and parameter identification.
Support for this research was provided in part by the Office of Scientific Computing, U.S. Department of Energy, and the Center for Research on Parallel Computation.