Panel Hopes to Splice Pieces Of U.S. Research Network
From: Science Magazine, March 7, 1997
By Jeffrey Mervis
For computer scientist Ken Kennedy, the search for high-end performance never ends. As director of the Center for Research on Parallel Computation at Rice University, one of the National Science Foundation's (NSF's) showcase science and technology centers, he has direct access to powerful IBM, Cray, and Intel supercomputers. But he also needs high-speed connections to researchers at other institutions. So last summer, Kennedy's center joined with two local universities in a successful bid to hook up to NSF's very high speed Backbone Network Service (vBNS), a network initially created to link five NSF-supported supercomputing facilities. The center also is competing for a major role in the next iteration of NSF's supercomputing program (see sidebar).
These programs are allowing researchers like Kennedy to redefine what it means to be connected. Their scientific needs have long since exceeded the capabilities of the Internet, the once-proud federal creation that has become a victim of its own popularity, leading them to hook onto more capable networks like vBNS to share and manipulate vast amounts of data. But even these high-speed networks have their limitations: Access is costly and limited (see table), and in general, the networks don't connect to each other. Fortunately for Kennedy and other data-hungry researchers, two separate new initiatives have emerged to help lead the way toward an even more connected future.
One is a 5-year, $500 million program, called the Next Generation Internet (NGI), that President Clinton announced with great fanfare last October in the heat of the election campaign (Science, 18 October 1996, p. 335). The second is a loose-knit university initiative, called Internet-2, to upgrade campus networks and develop educational applications that make use of these improved links.
Although these initiatives and existing high-speed agency networks have sprung up independently, all are integral to creating the next U.S. information highway. Larry Smarr, director of the NSF-funded National Center for Supercomputing Applications, likens NGI to the top of a three-layer cake. Internet-2 provides the foundation for universities to take advantage of improved networking, he says, while individual agency programs like vBNS sit in the middle, generating the applications that will create demand for the high-end systems. "The president has said that NGI is part of his bridge to the 21st century," says Smarr. "And the middle layer will produce the success stories to justify the cost of Internet-2."
The White House's direct involvement with NGI is likely to make it the most visible of these efforts. In the 5 months since Clinton announced it, federal officials have been beavering away at a plan. They are talking about giving 100 sites around the United States a connection 100 times more powerful than what is now available on the Internet and wiring 10 sites with 1000 times the present capacity. The Defense Advanced Research Projects Agency, with $40 million requested for 1998, will get the biggest share of NGI money, followed by a proposed $35 million for the Department of Energy (DOE) and $25 million divided among NASA, NSF, and the National Institute of Standards and Technology. Despite its name, NGI will serve more as a showcase for new technology than an early vision of the next Internet. "It's not the start of a commercial network," says DOE's Dave Nelson. "It's a test-bed for what can be accomplished with greater capacity and innovative uses. Then it's up to industry to make available what the customer wants."
The task of sketching out a new information superhighway will fall to a presidential panel of 22 university and industry bigwigs, which met for the first time last week. A group led by Carnegie Mellon University computer scientist Raj Reddy and Microsoft's Jim Gray is hoping to complete a report on NGI by June, while two other subcommittees - one to examine high-performance computing and the other to address information management - are being formed to develop recommendations before the panel goes out of business in 2 years. Not surprisingly, most of the people at the table are already involved in such efforts, including the man running the meeting: Kennedy.
The panel's official name - the Advisory Committee on High-Performance Computing and Communications, Information Technology, and the Next Generation Internet - reflects how much ground it has been asked to cover. One major challenge is to meet the needs of society as well as research institutions, its initial focus. "If only the top researchers have a clear pipe and the rest of society has clogged pipes, there will be a revolt," says Smarr, a panel member.
The panel will also focus on building a more robust and useful network than the old Internet. "In 20 years, the average person will be able to access a petabyte (10^15 bytes) of information for $100," predicts Reddy. "That's equivalent to all the printed material that's ever been created. But what will they do with all that information? And will it work reliably, without rebooting?"
Out of the dirt. While the panel deliberates, select groups of researchers have already found relief from what Smarr calls the "one-lane dirt road" and "cyber-sewer" that the Internet has become as a result of its exponential growth. Help has come in the form of specialized networks set up in the past few years by the four federal agencies with the largest stake in high-end computing - NASA, NSF, and the Departments of Defense and Energy. But while they give government researchers and outside scientists funded by those agencies high-speed access to facilities within the network, they typically rely on the clogged Internet to connect to other networks. The result is enhanced computing for a relative handful of researchers working on selected projects.
"Before we connected to vBNS, we had a standard Internet connection," says University of Pennsylvania physicist Robert Hollebeek, co-director of the NSF-funded National Scalable Cluster Project at Penn, the University of Maryland, and the University of Illinois, Chicago. The 2-year-old project, which hopes to provide researchers at the three institutions with access to high-end computing from their desktops, was one of 13 projects chosen by NSF in a first round of competition last summer to expand vBNS. "You can't do high-volume, high-speed work without it," says Hollebeek. NSF expects to announce another round of winners next month from a pool of 50 proposals, on its way toward linking 100 institutions.
But even vBNS has its limitations. One is its high cost: Hollebeek's $350,000 grant pays for the special equipment needed to connect to the nearest node, which is 65 miles away, and for the monthly long-distance phone bills. It's also not for everyone: Researchers on the three campuses must apply for the chance to be hooked up and show a "legitimate reason" to use the greater bandwidth and speed, he says.
Penn is also participating in Internet-2, which was begun last fall. The consortium, which has tripled in size since 34 universities founded it, has pledged to spend about $500 million over the next 3 to 5 years to upgrade campus networks and to develop applications that make use of broadband capacity. Members also hope to link to each other through one or more of the existing high-end networks.
Mike Roberts of the Washington-based EDUCOM, who is directing the project, says some applications - multimedia, interactivity, and real-time collaborations, for example - would be useful to all researchers, while others, such as distance education and lifelong learning, are particularly important to universities. "We're focusing on the average faculty member who says [the current network] is too hard to use or too slow for research and teaching," says Roberts.
The new presidential panel is expected to make recommendations on achieving a seamless fit among these emerging networks. "We want to help them achieve a balanced portfolio among seeding new hardware, developing the appropriate software, and conducting research on new applications," says Kennedy. "We're looking 10 to 15 years down the road." If they succeed, thousands of researchers may find themselves in the driver's seat as they head down the next information highway.