Volume 7, Issue 1 - Spring/Summer 1999


Large, Sparse Jacobian Matrices Computed Accurately and Efficiently with Automatic Differentiation

Brett Averick, Christian Bischof, Andreas Griewank, and Jorge Moré, Argonne National Laboratory; Alan Carle, Rice University


A team of researchers from Argonne National Laboratory and Rice University has successfully used automatic differentiation techniques to compute large, sparse Jacobian matrices. On a variety of test problems, automatic differentiation outperformed traditional methods in both speed and accuracy. The research dispelled the common misconception that automatic differentiation is unable to handle large problems effectively.

Two conventional methods exist for computing Jacobian matrices: finite differences and hand-coding of derivatives. The former suffers from possible inaccuracy, particularly if the problem is highly nonlinear. The latter is time-consuming and error-prone; a new coding effort is required whenever a function is modified.
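To see where the inaccuracy comes from, consider the forward-difference scheme, which approximates each Jacobian column by a divided difference of function values. The sketch below is an illustration in Python (not the team's Fortran) on a toy nonlinear function invented for this example; its error grows with the curvature of the function and depends delicately on the step size h.

```python
import math

def f(x):
    # Toy nonlinear system F: R^2 -> R^2 (illustrative only)
    return [math.exp(x[0]) * x[1], math.sin(x[0] * x[1])]

def forward_difference_jacobian(f, x, h=1e-6):
    """Approximate the Jacobian of f at x, one column per perturbation.

    Each column j costs one extra function evaluation, and the result
    carries a truncation error of order h times the second derivative.
    """
    fx = f(x)
    n, m = len(x), len(fx)
    J = [[0.0] * n for _ in range(m)]
    for j in range(n):
        xp = list(x)
        xp[j] += h
        fxp = f(xp)
        for i in range(m):
            J[i][j] = (fxp[i] - fx[i]) / h
    return J
```

For a well-scaled problem the approximation is serviceable, but for a highly nonlinear function no single choice of h avoids both truncation error (h too large) and cancellation in the subtraction (h too small).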

Automatic differentiation overcomes these limitations. It relies on the fact that every function, no matter how complicated, is evaluated on a computer as a (potentially very long) sequence of elementary operations, such as addition and multiplication, and elementary functions, such as the trigonometric and exponential functions. By applying the chain rule to each step in this sequence, derivatives are propagated through the entire computation exactly, up to roundoff error.
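The idea can be sketched with dual numbers, the simplest carrier of forward-mode automatic differentiation. This is a minimal Python illustration of the principle only; ADIFOR itself works by transforming Fortran source code, not by operator overloading.

```python
import math

class Dual:
    """A value paired with its derivative; each elementary operation
    propagates both through the chain rule, so no truncation error
    is ever introduced."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def sin(x):
    # chain rule for an elementary function: (sin u)' = cos(u) * u'
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

def exp(x):
    return Dual(math.exp(x.val), math.exp(x.val) * x.dot)

# d/dx [x * sin(x)] at x = 2: seed dx/dx = 1, then just evaluate.
x = Dual(2.0, 1.0)
y = x * sin(x)          # y.dot holds sin(2) + 2*cos(2) exactly
```

However long the sequence of elementary steps, the derivative that emerges is exact to machine precision, which is the property the conventional difference methods lack.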

By using ADIFOR (Automatic Differentiation of Fortran), the Argonne/Rice team was able to compute derivative information quickly and exactly. Automatic differentiation not only outperformed forward and central difference approximations in both accuracy and speed, but also proved comparable in speed to, and just as accurate as, hand-coded derivatives. The results are of special interest in applications involving sensitivity analysis and parameter identification.
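One reason automatic differentiation scales to large sparse Jacobians is column compression: columns that share no nonzero row can be assigned the same seed direction, so a Jacobian with thousands of columns may require only a handful of derivative passes. The following is a hypothetical Python illustration for a toy tridiagonal sparsity pattern; the class, function names, and interface are invented for this sketch and are not ADIFOR's.

```python
class DualVec:
    """A value plus a vector of directional derivatives, one per seed."""
    def __init__(self, val, dots):
        self.val, self.dots = val, dots
    def __sub__(self, other):
        return DualVec(self.val - other.val,
                       [a - b for a, b in zip(self.dots, other.dots)])
    def __mul__(self, other):
        return DualVec(self.val * other.val,
                       [a * other.val + self.val * b
                        for a, b in zip(self.dots, other.dots)])

def f(x):
    """Toy tridiagonal system: f_i = x_i^2 - x_{i-1} - x_{i+1}."""
    n = len(x)
    out = []
    for i in range(n):
        t = x[i] * x[i]
        if i > 0:
            t = t - x[i - 1]
        if i < n - 1:
            t = t - x[i + 1]
        out.append(t)
    return out

def compressed_jacobian(x, ncolors=3):
    # Color column j with j mod 3: in a tridiagonal pattern, columns of
    # the same color never share a row, so 3 seed directions recover
    # the whole Jacobian no matter how large n is.
    n = len(x)
    xd = [DualVec(x[j], [1.0 if j % ncolors == k else 0.0
                         for k in range(ncolors)]) for j in range(n)]
    y = f(xd)
    J = {}
    for i in range(n):
        for j in (i - 1, i, i + 1):  # known sparsity pattern of row i
            if 0 <= j < n:
                J[(i, j)] = y[i].dots[j % ncolors]
    return J
```

For this pattern the cost is three directional derivatives instead of n, which is the kind of saving that makes automatic differentiation competitive on large sparse problems.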

Support for this research was provided in part by the Office of Scientific Computing, U.S. Department of Energy, and the Center for Research on Parallel Computation.

