Mastering the Challenge of High-Performance Computing

But there are perils in ignoring developments in the rising field, scholars say
By Ronald Roach

Not long after her appointment as Hampton University’s first chief information officer in 1999, Debra S. White, a former IBM executive, found that in addition to managing a campus IT network upgrade she had to contend with demanding scientists in programs such as physics and atmospheric sciences who were conducting nationally acclaimed and highly advanced research.

“We spent time listening to them, and I became aware that we had to provide better research tools in terms of our computing infrastructure,” she says.

In time, Hampton officials made even more campus network upgrades, and the school gained a level of high-speed Internet connectivity, known as DS-3 or 45 megabits per second, which qualified it to be considered part of the elite Internet2 community of research schools. “You have to provide researchers the best tools,” White says.

Over the past several years, college and university administrators have laboriously tried to keep up with information technology innovations in administrative systems, teaching and learning practices, online education, and fast and convenient campus Internet access. In the midst of rapid change, institutions have coped by placing central authority in the hands of chief information officers, and have recognized the strategic importance of information technology in making a school attractive to students and faculty.

Equally demanding, if not more so, than the administrative and the teaching and learning applications of computing has been science and technology research. Just as all of higher education got serious about wiring individual campuses for the Internet, the nation’s leading research universities, in association with national supercomputing centers, began generating an entirely new set of tools and functions for research computing.

The major schools and research centers have set the stage for what is known as “high-performance computing,” which refers to computing tools and processes capable of generating knowledge at the frontiers of science. “The demand for sophisticated cyberinfrastructure is exploding in every field of science and engineering. Teams of researchers working within and across disciplines are coming together to lay the foundations for a cyberinfrastructure revolution,” said Dr. Rita Colwell, director of the National Science Foundation, in a recent speech at a supercomputing conference.

“I believe we stand on the threshold of a new age of scientific exploration, one that will give us a deeper understanding of our planet and allow us to improve the quality of people’s lives worldwide…. For decades, NSF has been steadily crystallizing the idea of a center that brings together diverse skills, tools and perspectives to focus laserlike on scientific and technological problems. From this (came) the original science and technology centers, the engineering research centers and the supercomputing centers,” she added.

For smaller colleges and universities that are less research-driven and more teaching-oriented than bigger schools, the move into high-performance computing represents a considerable challenge in how these institutions will develop their information technology infrastructures. Academic leaders say there is peril in ignoring developments in the high-performance computing arena. One of the biggest worries is that faculty will miss out on opportunities to improve their teaching if they fail to learn innovative research techniques afforded by advanced computing.

“We have to provide the benefits of high-performance computing in the curriculum,” says Dr. Joyce F. Williams-Green, the chief information officer at Winston-Salem State University (WSSU) in Winston-Salem, N.C.

For institutions that are determined not to fall behind, there has been some effort by the supercomputing community and the large research schools to reach out and share resources and knowledge. In fact, some of that outreach activity has targeted minority-serving institutions, and is helping a few historically Black schools develop highly competitive research and academic programs.

“It’s a door the (minority-serving institutions) can walk through,” says Dr. Allison Clark, assistant director of digital equity initiatives at the National Center for Supercomputing Applications in Champaign, Ill.

Technologies in Wide Use

At WSSU, Williams-Green has led efforts to bring high-performance computing technologies and practices into wide use at her school. While taking a leadership role in shaping the Winston-Salem community into a major biotechnology research and business center, the historically Black liberal arts institution has aggressively sought training for its faculty as well as inclusion in supercomputing and high-performance computing activities.

At Clark Atlanta University, Dr. John S. Hurley, an electrical engineering professor and former chief information officer, says that by the late 1990s, high-performance computing represented a natural step for the university since researchers there had experience working on complex computing projects. Since the early 1990s, researchers had been doing considerable work in association with the U.S. departments of Defense and Energy, and the National Aeronautics and Space Administration (NASA). He says that top administrators were willing to move further into high-performance computing when it was shown campus network improvements would prove beneficial to administrative systems.

To make high-performance computing a reality, Williams-Green, Hurley and other administrators at historically Black schools have taken advantage of the outreach of a joint effort of two NSF-funded partnerships: the National Partnership for Advanced Computational Infrastructure in San Diego and the National Computational Science Alliance in Champaign, Ill. The effort is known as the Education Outreach and Training Partnerships for Advanced Computational Infrastructure (EOT-PACI).

Since 1998, EOT-PACI has been spreading the benefits of supercomputing and high-performance computing practices to historically Black schools, Hispanic-serving institutions and tribal colleges. In late 1999, EOT-PACI got a boost when it was awarded $1 million by the NSF to participate in the four-year $6 million EDUCAUSE-administered Advanced Networking Project with Minority-Serving Institutions (AN-MSI). The award tied EOT-PACI’s activities in a partnership with EDUCAUSE, the nation’s leading higher education information technology association, and minority organizations, such as the Executive Leadership Foundation, the National Association for Equal Opportunity in Higher Education, the Hispanic Association of Colleges and Universities and the American Indian Higher Education Consortium.

“The goal of EOT-PACI was to reach out to more institutions and people to help build infrastructure at places, such as the minority-serving schools. The AN-MSI outreach by EOT-PACI has focused specifically on getting the minority-serving institutions to participate in high-performance computing activities,” says Stephenie McLean, the program manager of the EOT-PACI AN-MSI project.

High-Performance Computing Fundamentals

For the past three summers, Dr. Jill Harp, a chemistry professor at WSSU, has attended workshops to learn a new science — computational science. The workshops have been preparing Harp to use computers and computer modeling in her teaching and research. Harp recognizes that computational science is becoming a necessary part of her discipline.

“When you go into pharmaceutical companies you’re expected to know computational science. It’s becoming a standard part of the chemistry textbooks,” she says.

The capacity to conduct advanced computational work, which involves creating mathematical models to simulate scientific processes, often requires powerful computers, access to large databases and Internet connections that link researchers from one institution to another. The growing practice of computational science in science and technology research is a driving force behind higher education’s push into high-performance computing.

“Computational science is going to be ubiquitous. Computing is becoming more and more integrated into the practice of science and engineering,” says Dr. Roscoe Giles, the co-chair of EOT-PACI and a Boston University computer engineering professor.
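The modeling that computational science entails can be illustrated in a few lines of code. The sketch below is a generic example, not drawn from any of the research described in this article: it simulates radioactive decay numerically with Euler’s method, the simplest time-stepping scheme, and checks the result against the closed-form solution.

```python
import math

def simulate_decay(n0, k, t_end, dt):
    """Step the decay law dN/dt = -k*N forward in time with Euler's method."""
    n = n0
    for _ in range(round(t_end / dt)):
        n -= k * n * dt  # each step, N loses a fraction k*dt of itself
    return n

# Compare the simulation with the exact solution N(t) = N0 * exp(-k*t)
numeric = simulate_decay(n0=1000.0, k=0.1, t_end=10.0, dt=0.001)
exact = 1000.0 * math.exp(-0.1 * 10.0)
print(f"simulated: {numeric:.1f}, exact: {exact:.1f}")
```

Real computational-science models follow the same pattern at vastly larger scale, which is why they demand the powerful computers and high-speed networks discussed below.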

Other basic ideas and tools that define high-performance computing are:

• Cluster computing — The practice of stringing together many small computers to create the capacity of an expensive supercomputer. EOT-PACI has conducted several workshops to train faculty and researchers in the AN-MSI project on how to set up cluster computing networks. The practice is seen as a way resource-strapped schools can participate in projects that would have been too expensive in the past.

• Grid computing — A network of linked computers that share power, storage, applications and other resources over the Internet. A number of historically Black schools have acquired what is known as Access Grid node capacity. Having an Access Grid node allows an institution to have an interactive and collaborative teleconference system with multiple simultaneous connections to other sites. It also facilitates high-level interaction and research collaboration with research institutions.

• High-speed connectivity — The ability of a school to carry out high-performance activities often relies upon the speed and the bandwidth of its campus network. The minimum standard for connectivity associated with high-performance computing is DS-3, or 45 megabits per second.
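The cluster idea in the first bullet above, many modest machines sharing one large job, can be sketched on a single computer with Python’s standard multiprocessing module. This is an illustrative assumption, not material from any EOT-PACI workshop; a real cluster would spread the workers across separate machines with a message-passing library such as MPI, but the division of labor is the same.

```python
from multiprocessing import Pool

def partial_sum(bounds):
    """Each worker (standing in for one cluster node) sums its own slice."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n, workers = 1_000_000, 4
    step = n // workers
    # Split the range [0, n) into one contiguous chunk per worker
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, chunks))
    # The parallel result matches the single-machine computation
    assert total == sum(i * i for i in range(n))
    print(total)
```

Because each chunk is independent, adding more workers (or more machines) speeds the job up without changing the answer, which is what lets resource-strapped schools approximate supercomputer capacity with commodity hardware.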

Currently, the membership of the Internet2 consortium, which is led by 202 universities working in partnership with industry and government to develop and deploy advanced network applications and technologies, represents the leading schools in high-performance computing. Of the 202 schools considered full members of the Internet2 consortium, only Jackson State University and Florida A&M University are historically Black. Several other HBCUs, including WSSU, Hampton and Clark Atlanta, are part of the Internet2 community as members whose high-speed access is sponsored by a full Internet2 member.

“We partnered with Virginia Tech and came in as an associate member. We get full access and we’re fully engaged with other Internet2 schools,” says Hampton University’s White.

Future Focus

There’s been some discussion over whether schools that don’t acquire and develop high-performance computing capabilities will be consigned to being a digital divide “have not” in higher education. So far, a number of officials are optimistic that advanced computing tools and capabilities will see wide dispersion across higher education. Much of that dispersion will require the leading schools to share knowledge and resources with others.

“We’re hoping the torch will be passed on through autonomous efforts by the individual schools,” says Giles of EOT-PACI.

Clark Atlanta’s Hurley says that the HBCUs that have taken a lead in the high-performance computing arena have a responsibility to help other HBCUs.

“We owe it to the HBCU community to take the lead on this issue. We wouldn’t have gotten to the table were it not for the large research schools and the supercomputing community reaching out to us,” he says.

© Copyright 2005 by