RESEARCH FOCUS: THE BINARY BLACK HOLE GRAND CHALLENGE PROJECT
Richard Matzner, James C. Browne, University of Texas; Larry Smarr, H. Ed Seidel, Paul Saylor, Faisal Saied, University of Illinois; Geoffrey Fox, Syracuse University; Stuart Shapiro, Saul Teukolsky, Cornell University; James York, Charles Evans, University of North Carolina; L. Samuel Finn, Northwestern University; Pablo Laguna, Penn State University; Jeffrey Winicour, University of Pittsburgh
The object of the Binary Black Holes Grand Challenge Alliance is to
produce a catalog of the gravitational wave signatures from the strong
gravitational field of orbiting astrophysical binary black holes and
from the merger and coalescence of these bodies. This work will be
carried out by computational solution of Einstein's equations
describing gravitational fields. It is a large computational problem
because of the complexity of the equations, the need for a fully
three-dimensional simulation of data with no special symmetries, and
the fact that analysis predicts a wide range of spatial and temporal
scales in the physical processes involved.
The project group is interested in black holes as sources for
gravitational radiation because they are the most compact objects and
have the strongest gravitational fields theoretically possible. The
basic concept of a black hole can be explained by examining the concept
of escape velocity, which is determined by the minimum energy needed to
overcome the gravity of an astrophysical body and fly off into space.
For instance, the escape velocity from the earth is 11.2 km/sec. If
mass were added to the earth (without changing its radius), the escape
velocity would be greater. To get a black hole the size of the earth,
one would have to fit about 2000 times the mass of the sun into a sphere
of the earth's radius. The escape velocity would then exceed the
velocity of light, and light, or anything else for that matter, would
not escape from the surface.
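The arithmetic behind these figures can be checked directly: setting the escape velocity v = sqrt(2GM/R) equal to the speed of light and solving for M gives the mass needed to turn an Earth-sized sphere into a black hole. This back-of-the-envelope sketch uses standard physical constants, which are not given in the article:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_EARTH = 5.972e24   # mass of the Earth, kg
R_EARTH = 6.371e6    # mean radius of the Earth, m
M_SUN = 1.989e30     # mass of the Sun, kg

def escape_velocity(mass, radius):
    """Minimum speed needed to escape from the surface: v = sqrt(2GM/R)."""
    return math.sqrt(2.0 * G * mass / radius)

# Escape velocity from the Earth's surface (about 11.2 km/s).
v_earth = escape_velocity(M_EARTH, R_EARTH)

# Mass for which the escape velocity from an Earth-sized sphere equals c:
# solving c = sqrt(2GM/R) for M gives M = R c^2 / (2G).
m_critical = R_EARTH * C**2 / (2.0 * G)

print(f"escape velocity from Earth: {v_earth / 1000:.1f} km/s")
print(f"critical mass in solar masses: {m_critical / M_SUN:.0f}")
```

Running this recovers both numbers quoted above: an escape velocity of 11.2 km/sec for the Earth, and roughly 2000 solar masses for the critical compression.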
Although the concepts are simple, the implementation is intricate.
Einstein described gravity as a warping of space and time (spacetime);
finding consistent solutions for the gravitational configuration
requires solving a complicated set of coupled elliptic and hyperbolic
equations. Furthermore, black hole spacetime phenomena occur on a
succession of scales. The smallest scale is set by the hole radius,
which can be taken as unity. Another scale is the wavelength associated
with the ringdown oscillations of the black hole, about 20 times as large.
Adequate extraction of the radiation waveforms requires an extraction
zone of hundreds of times the hole radius, setting a large outer scale
for the problem.
The nested structure thus implies the need for large grids (~(10³)³ points) and
for adaptive refinements to handle changing scales as the black holes
orbit through the grid. Large computers are needed to run these
problems. Therefore, the group is developing parallel codes on machines
including the Cray T3D at the Pittsburgh Supercomputing Center and the
Thinking Machines CM5 at the National Center for Supercomputing
Applications (Illinois), as well as on architectures like the Pittsburgh
Cray C90 and the Cray YMP at the University of Texas at Austin.
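A rough sizing exercise shows why such machines are needed: a uniform grid with ~10³ zones per dimension holds ~10⁹ points, and each point carries many field variables. The sketch below assumes, purely for illustration, 50 double-precision variables per point; the project's actual variable count is not given in the article.

```python
# Back-of-the-envelope sizing for a uniform 3-D grid covering the
# problem's outer scale. The variable count is an assumed illustration,
# not the project's actual memory budget.

points_per_dim = 10**3           # ~10^3 zones per dimension
n_points = points_per_dim**3     # ~(10^3)^3 = 10^9 grid points
n_fields = 50                    # assumed number of evolved field variables
bytes_per_value = 8              # double precision

total_bytes = n_points * n_fields * bytes_per_value
print(f"grid points: {n_points:.0e}")
print(f"memory for one time level: {total_bytes / 2**30:.0f} GiB")
```

Even this single-time-level estimate runs to hundreds of gigabytes, far beyond any one workstation of the day, before adaptive refinement or multiple time levels are counted.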
The surface of a black hole is a causal boundary, analogous to a
supersonic point in fluid mechanics. Signals can propagate in only one
direction across it. Thus it is sensible and beneficial to consider
differencing that computes only the outside domain of the problem, which
can radiate to infinity. Because the location of the causal boundary
depends on the computed solution, some sort of iterative procedure
will be needed: roughly estimate the gravitational field, roughly
locate the causal boundary, and then refine.
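The one-way character of the boundary is what makes such excision workable numerically. A minimal one-dimensional analogue (an illustration of the principle only, not the project's scheme): an advection equation whose signals all travel toward the excised region, so upwind differencing at the excision point needs data only from the computed exterior.

```python
import numpy as np

# Toy 1-D excision: signals travel leftward (into the "hole" at x < 0),
# so the boundary point at x = 0 needs no boundary condition -- the
# upwind (one-sided) difference uses data from the computed region only.

nx, dx, dt, c = 200, 0.01, 0.005, 1.0    # CFL number c*dt/dx = 0.5
x = np.linspace(0.0, (nx - 1) * dx, nx)  # x = 0 is the excision boundary
u = np.exp(-((x - 1.0) / 0.1) ** 2)      # Gaussian pulse in the exterior

for _ in range(400):
    # du/dt - c du/dx = 0: the pulse moves toward x = 0 and falls in.
    # The upwind difference takes data from the right only, so the
    # update at i = 0 never reaches inside the excised region.
    u[:-1] += c * dt / dx * (u[1:] - u[:-1])
    u[-1] = 0.0   # simple outer boundary: nothing enters from far away

print(f"amplitude remaining on the grid: {u.max():.4f}")
```

After the pulse crosses the excision boundary it simply leaves the computational domain, with no reflection and no need for data from inside the hole.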
The study of the gravitational field can be cast in one of two
equivalent ways that differ greatly in the way they describe the
gravitational field. One way treats the gravitational field as a
configuration with given derivatives that is evolved by hyperbolic
equations into the future of the problem. The second method treats the
three-dimensional characteristics of the problem, which are the loci of
the signal flow from events in the central region out to infinity, and
evolves the system from one to the next of these characteristic
hypersurfaces. The characteristic approach has great a priori utility in
getting radiation to propagate to infinity, but there is much more
experience in handling the space-evolving-in-time approach (called the
Cauchy approach) in the central strong-field region. Techniques are
needed for joining two substantially different descriptions at some
boundary surrounding the central, dynamical part of the simulation.
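In one spatial dimension the matching idea can be caricatured: evolve the interior as a Cauchy problem for the wave equation, and at the outer edge impose an outgoing-characteristic (radiation) condition, u_t + u_x = 0, so that waves leave the grid rather than reflect. This is a schematic sketch under those assumptions, not the Alliance's three-dimensional matching algorithm.

```python
import numpy as np

# Interior: Cauchy (leapfrog) evolution of the 1-D wave equation.
# Outer edge: outgoing-characteristic condition du/dt + du/dx = 0,
# discretized with an upwind difference, in place of a reflecting wall.

nx, dx = 400, 0.01
dt = 0.5 * dx                          # CFL number 0.5
x = np.linspace(0.0, (nx - 1) * dx, nx)
u = np.exp(-((x - 2.0) / 0.2) ** 2)    # initial pulse in the interior
u_prev = u.copy()                      # zero initial velocity

for _ in range(1600):
    u_next = np.empty_like(u)
    # Leapfrog update of the interior points (Cauchy evolution).
    u_next[1:-1] = (2.0 * u[1:-1] - u_prev[1:-1]
                    + (dt / dx) ** 2 * (u[2:] - 2.0 * u[1:-1] + u[:-2]))
    u_next[0] = 0.0   # reflecting inner wall, standing in for the source region
    # Characteristic outer boundary: upwind form of u_t + u_x = 0.
    u_next[-1] = u[-1] - dt / dx * (u[-1] - u[-2])
    u_prev, u = u, u_next

print(f"max amplitude remaining: {np.abs(u).max():.4f}")
```

By the end of the run, both halves of the pulse (one directly outgoing, one reflected off the inner wall) have passed through the outer boundary, leaving only a small residue from the first-order boundary discretization; a fixed outer wall would instead trap the radiation on the grid.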
Managing such a complicated problem among a number of research centers
dictates a structured computational approach. With Browne and Matt
Choptuik (a research scientist at the University of Texas), CRPC
researcher Geoffrey Fox has begun developing the computational backbone
for this project. At a meeting on May 6 (in Pittsburgh), the co-PIs of
the project adopted a parallel data-structure standard. Fox has set up
a framework with a software librarian for code modules meeting the
standard. A closed Mosaic repository is being established that will
allow PIs to check out modules for development. An initial toolkit and
backbone will be deposited to the librarian at Syracuse and be made
available via Mosaic.
Current work is proceeding in Fortran 77 and Fortran 90, and development
work for High Performance Fortran (HPF) is being carried out to assure
that HPF provides the compiler and runtime support appropriate to this
problem.