Fortran Parallel Programming Systems

Primary Contact: Alan Carle (carle@cs.rice.edu)

Ken Kennedy (Project Director), Vikram Adve, Ruth Aydt, Rajesh Bordawekar, Alan Carle, Bryan Carpenter, Alok Choudhary, Keith Cooper, Ian Foster, Geoffrey Fox, Robert Fowler, Bill Gropp, Paul Havlak, Chuck Koelbel, David Kohr, Rakesh Krishnaiyer, Don Leskiw, Xiaming Li, Rusty Lusk, Nat McIntosh, John Mellor-Crummey, Paul Messina, Michael Paleczny, Dan Reed, Joel Saltz, Alan Sussman, Rajeev Thakur, Linda Torczon, Yuhong Wen, and Zhang Ying.

Abstract. The objective of the Fortran Parallel Programming Systems project is to make parallel computer systems usable for programmers who work in Fortran, a language used throughout the scientific and engineering community. The effort places special emphasis on data-parallel programming and on achieving scalable parallelism through advanced compilation strategies.
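
As a small, hypothetical illustration of the data-parallel style the project targets, the program below uses standard HPF directives to describe how arrays are laid out across processors and a FORALL statement to express the computation; the program name, array names, and sizes are invented for this sketch. The compiler, not the programmer, derives the node programs and the communication for the boundary elements.

      PROGRAM SMOOTH
      REAL A(1000), B(1000)
!HPF$ PROCESSORS P(4)
!HPF$ DISTRIBUTE A(BLOCK) ONTO P
!HPF$ ALIGN B(I) WITH A(I)
      ! One global, single-threaded statement: each processor updates
      ! its own block of A, and the compiler generates the
      ! communication for the B(I-1) and B(I+1) boundary references.
      B = 1.0
      FORALL (I = 2:999) A(I) = 0.5 * (B(I-1) + B(I+1))
      END

Because the directives are structured comments, a compiler that does not exploit them can ignore them, and the program retains a well-defined sequential meaning.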

Corporate Sponsors include HP-Convex, Schlumberger, and Texas Instruments.

 

Major Accomplishments

 

The Fortran Parallel Programming Systems effort

  • Defined the original Fortran D language and carried out some of the fundamental research in data-parallel compilation and programming environments that validated the approach. The Rice/Syracuse Fortran D group was a key contributor to the development of High Performance Fortran (HPF).

  • Organized and led the High Performance Fortran Forum, which produced HPF, an industry-standard dialect of Fortran for data-parallel computation.

  • Developed and prototyped technologies essential to producing efficient HPF programs. These include automatic data layout selection (based on efficient performance prediction and integer programming) and source-level performance analysis (based on compiler-supported techniques for correlating measured performance data with HPF source code). A new GUI for performance analysis of HPF codes compiled by the PGI HPF compiler will be made available as a major support tool for the new NCSA PACI alliance.

  • Pioneered new programming language features and implementation techniques to support both irregular and out-of-core data-parallel problems. Collaborated with Maryland and Syracuse to develop and demonstrate the importance of high-performance runtime library support for irregular applications and out-of-core computations, and to develop the compiler analysis techniques needed to exploit these runtime libraries. (A representative irregular kernel is sketched after this list.)

  • Continues to develop advanced compilation techniques for HPF, including static program analysis and code generation, sophisticated optimizations, support for emerging distributed shared-memory systems, and advanced compiler support for programming tools.
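
The irregular-problem support mentioned above is easiest to see in a kernel of the following shape; the routine name, array names, and edge-list representation are invented for this sketch, and the directives are standard HPF. Because the subscripts come from an index array, the communication pattern is unknown until run time, which is exactly the situation the runtime libraries and compiler analyses described above are designed to handle.

      SUBROUTINE SWEEP(X, Y, EDGE, NNODE, NEDGE)
      INTEGER NNODE, NEDGE
      INTEGER EDGE(2, NEDGE)
      REAL    X(NNODE), Y(NNODE)
!HPF$ DISTRIBUTE X(BLOCK)
!HPF$ ALIGN Y(I) WITH X(I)
      ! The subscripts below are not known until run time, so the
      ! compiler cannot place communication statically.  An inspector
      ! pass over EDGE builds a communication schedule once; the
      ! executor reuses that schedule on every sweep.
      DO I = 1, NEDGE
         Y(EDGE(1,I)) = Y(EDGE(1,I)) + X(EDGE(2,I))
      END DO
      END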

 

Projects & Software

See also Alphabetical Listing of All Projects & Software

 

Support for Machine-Independent Parallel Programming

Performance Tools for HPF

  • Pablo and HPF - High-level performance analysis tools coupling dynamic performance data with compile-time information for data parallel programs. (Publications)
Tools for Scientific and Engineering Computation

Scalable I/O

  • The Scalable I/O Initiative - Researchers at more than 30 institutions are systematically investigating the primary obstacle to effective use of current and future massively scalable computing systems -- getting data into, around, and out of the system. (Publications)
Combining HPF and MPI

  • HPF/MPI - A standard set of functions for coupling multiple HPF tasks to form task-parallel computations using MPI. (Publications)
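
The HPF/MPI interface itself is not reproduced here; as a rough sketch of the message-passing side of such a coupling, the fragment below is an ordinary Fortran MPI exchange between two processes (the program name, buffer size, and message tag are arbitrary). HPF/MPI extends this style so that each communicating task can itself be a data-parallel HPF program.

      PROGRAM COUPLE
      INCLUDE 'mpif.h'
      INTEGER RANK, IERR, STATUS(MPI_STATUS_SIZE)
      REAL    BUF(100)
      CALL MPI_INIT(IERR)
      CALL MPI_COMM_RANK(MPI_COMM_WORLD, RANK, IERR)
      ! Process 0 produces a block of data; process 1 consumes it.
      IF (RANK .EQ. 0) THEN
         BUF = 1.0
         CALL MPI_SEND(BUF, 100, MPI_REAL, 1, 0, MPI_COMM_WORLD, IERR)
      ELSE IF (RANK .EQ. 1) THEN
         CALL MPI_RECV(BUF, 100, MPI_REAL, 0, 0, MPI_COMM_WORLD,
     &                 STATUS, IERR)
      END IF
      CALL MPI_FINALIZE(IERR)
      END
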
Parallel Common Runtime Consortium

  • Parallel Common Runtime Consortium (PCRC) - The consortium's goal is to develop a public-domain software runtime environment that can be used by essentially all high-level data-parallel language compilers, especially those for C++ and HPF. (Publications)

  • Java - We are investigating the use of an interpreted frontend for Java in conjunction with the parallel common runtime library.
Compilation Techniques for Irregular Problem Support

  • Chaos - New techniques for compiling irregular problems written in HPF. (Publications)
HPF Testbed Application Development

Updated by Debbie Campbell (dcamp@cs.rice.edu)