Mastering the Challenge of High-Performance Computing
But there are perils in ignoring developments in the rising field, scholars say
By Ronald Roach
Not long after her appointment as Hampton University’s first chief information officer in 1999, Debra S. White, a former IBM executive, found that in addition to managing a campus IT network upgrade, she had to contend with demanding scientists in programs such as physics and atmospheric sciences who were conducting nationally acclaimed, highly advanced research.
“We spent time listening to them, and I became aware that we had to provide better research tools in terms of our computing infrastructure,” she says.
In time, Hampton officials made further campus network upgrades, and the school gained a level of high-speed Internet connectivity, known as DS-3 or 45 megabits per second, that qualified it for the elite Internet2 community of research schools. “You have to provide researchers the best tools,” White says.
Over the past several years, college and university administrators have laboriously tried to keep up with information technology innovations in administrative systems, teaching and learning practices, online education, and fast and convenient campus Internet access. In the midst of rapid change, institutions have coped by placing central authority in the hands of chief information officers, and have recognized the strategic importance of information technology in making a school attractive to students and faculty.
Equally demanding as the administrative and the teaching and learning applications of computing, if not more so, has been science and technology research. Just as all of higher education got serious about wiring individual campuses for the Internet, the nation’s leading research universities, in association with national supercomputing centers, began generating an entirely new set of computing tools and functions for research.