Frequently Asked Questions
Note: The CRPC ceased operations in 2000. These pages are retained for archival purposes.
What is the CRPC's Mailing Address and Phone Number?
To contact CRPC headquarters by mail, write to
Where can I find directions to CRPC Headquarters at Rice University or the Houston Plaza Hilton?
When was the CRPC formed, and what is its main goal?
The CRPC was established in 1989 to make massively parallel computing systems as usable as conventional supercomputing systems are today.
What institutions make up the CRPC?
The CRPC is a consortium that includes seven core sites
What are some of the CRPC's major accomplishments?
The Center for Research on Parallel Computation (CRPC) has
Where can I find out about related efforts?
The CRPC also has affiliations with other research centers and coalitions:
What is parallel computing?
Parallel computing takes hundreds or thousands of microprocessors (the "brains" behind computers) and makes them work in parallel on a single computing task. These microprocessors can be linked together in a single computer or can be housed separately in computers that are linked together on a network.
What is parallel computing's basic advantage?
The advantage of parallel computing over traditional, single-processor computing is that it can tackle problems faster and with greater power. An analogy would be the advantage of using multiple washing machines at a laundromat over using a single washer at home. With multiple washers, you can handle a larger number of loads in a shorter amount of time.
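The divide-the-work idea in the laundromat analogy can be sketched in a few lines of modern Python. This is an illustrative sketch only, not CRPC software; the function names and chunk sizes are our own choices, and the worker threads merely stand in for separate processors — a real parallel machine runs each piece on its own CPU.

```python
# Sketch: split one task into pieces, give each piece to a worker,
# then combine the results -- the basic structure of parallel computing.
from concurrent.futures import ThreadPoolExecutor

def split(data, parts):
    """Divide one large 'load of laundry' into roughly equal chunks."""
    size = (len(data) + parts - 1) // parts
    return [data[i:i + size] for i in range(0, len(data), size)]

def parallel_sum(numbers, workers=4):
    chunks = split(numbers, workers)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(sum, chunks)   # each worker sums its own chunk
    return sum(partials)                   # combine the partial results

print(parallel_sum(list(range(1001))))     # 500500, same answer as sum()
```

The answer is identical to the single-worker computation; the point of parallelism is that the independent chunks can be processed at the same time.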
How will the CRPC help make parallel computing usable?
For the past 30 years, programmers have developed software and operating systems that exploit a single processor per task. For parallel computing to work, however, software and operating systems must be rethought and redeveloped around multiple processors working together. Standards are also needed to ensure that parallel computing users can achieve software performance independent of the machine they are using. Science and engineering students and current supercomputing users need to be trained in the use of new parallel tools and methods. The CRPC is addressing these areas and others to make parallel computing truly usable at the software level.
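The machine-independent standards mentioned above typically expose a message-passing style of programming. The following is a hedged sketch of that style, not CRPC code: workers share no memory and cooperate only by sending and receiving messages. Threads and queues stand in here for processors linked by a network, and the ranks and tasks are invented for illustration.

```python
# Sketch of message passing: each worker receives a task as a message,
# computes on it privately, and sends the result back as a message.
import threading
import queue

def worker(rank, inbox, outbox):
    task = inbox.get()               # receive work from the coordinator
    outbox.put((rank, task * task))  # send the result back as a message

NUM_WORKERS = 4
inboxes = [queue.Queue() for _ in range(NUM_WORKERS)]
results = queue.Queue()
threads = [threading.Thread(target=worker, args=(r, inboxes[r], results))
           for r in range(NUM_WORKERS)]
for t in threads:
    t.start()
for rank, box in enumerate(inboxes):
    box.put(rank + 1)                # distribute one task per worker
for t in threads:
    t.join()
collected = sorted(results.get() for _ in range(NUM_WORKERS))
print(collected)                     # [(0, 1), (1, 4), (2, 9), (3, 16)]
```

Because the program depends only on the send/receive abstraction, the same source can run unchanged whether the workers live in one computer or in machines scattered across a network — which is exactly the portability such standards aim to guarantee.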
What challenges lie in making parallel computing usable?
The next generation of parallel computers will achieve more than three trillion floating-point operations per second (three teraflops). Unfortunately, several factors have hindered users from exploiting this potential:
How can I learn more about parallel computing?
The CRPC has a number of publications that discuss various aspects of parallel computation and the people who work in this field, including
How can I search the CRPC Web site?
You can use our Harvest broker interface to search the CRPC Web site.
How the CRPC Is Making Parallel Computing Usable
Software and programming support are widely recognized as the final challenges to making high-performance computing and communications (HPCC) an economically successful industry. The CRPC is the only research center in the nation devoted solely to meeting these challenges.
CRPC researchers have already developed several influential technologies to make parallel computing truly usable: parallel versions of common programming languages, technologies for making different computers work together, parallel versions of common science and industry applications, and "templates" that enable scientists and engineers with limited programming experience to develop their own customized parallel programs.

Many small- to large-size companies have successfully capitalized on CRPC research. Convex Computers, a $200 million/year public corporation, reported that "CRPC technology probably represents about one million dollars in research effort to our company." The CRPC has also fostered standards among hardware companies that allow software companies to develop programs regardless of the machine on which they run. For example, the CRPC led an industry forum to develop High Performance Fortran, a standard parallel language that has spun off product development at more than 20 HPCC companies.

The CRPC's work allows American businesses to test and design new products more quickly and accurately, and to analyze information for petroleum exploration, environmental cleanup, and health care management. In Texas alone, for instance, widespread use of CRPC technologies could improve annual oil production by 300 to 500 million barrels and provide 60,000 to 80,000 jobs. CRPC technologies also permit scientists to tackle problems considered unmanageable by conventional computing.

The CRPC is demonstrating how high-performance software can be effectively exchanged, reused, and shared among universities, research laboratories, and industry. The CRPC's National HPCC Software Exchange, an online software distribution system, provides a central Internet access point for HPCC technologies located around the nation.
Also, CRPC training courses for supercomputer center staff are leveraging the center's effort to reach the maximum number of supercomputer users.

The CRPC is training a new generation of scientists and engineers -- particularly women and underrepresented minorities -- to be well-versed in the use of parallel computing. CRPC researchers are providing post-secondary students with valuable research experience, and developing textbooks, curricular materials, and courses, including groundbreaking graduate degree programs in computational science and engineering. CRPC workshops are exposing high school students and teachers around the nation to opportunities in computational science and engineering, and providing curricular support for a model K-8 school.
Additional questions?
Please direct additional questions about the CRPC to firstname.lastname@example.org.