Fairbanks, Alaska - The Arctic Region Supercomputing Center (ARSC) is gearing up for its 10th anniversary with significant upgrades to all of its systems. Using these new visualization, storage and computational resources, scientists will be able to move through a tsunami and experience its run-up, quickly store and retrieve massive amounts of data, and run computations faster than ever before.

Visualization Upgrades

In spring of 2003, ARSC staff will unveil the center’s first fully immersive virtual environment, a Mechdyne MD Flex™ system. The system’s three wall displays and single floor display form a three-dimensional environment that allows scientists to explore data without the constraints of traditional two-dimensional displays. This ARSC Discovery Lab is currently being configured in the University of Alaska Fairbanks (UAF) Rasmuson Library, located on the main UAF campus. It will be available to researchers, scientists and students at the university to explore virtual environments and conduct research in areas like computer-user interfaces, tsunami inundation and the aurora borealis, as well as to work in other forms of creative expression such as three-dimensional animation. The Discovery Lab will supplement existing visualization systems at ARSC, including the virtual reality ImmersaDesk. UAF students will be able to explore the Discovery Lab through classes in virtual reality programming, which will be taught during upcoming semesters by ARSC/Computer Science faculty.

Storage Upgrades

The center recently installed two Sun Fire™ 6800 systems, each with eight 900 MHz UltraSPARC III™ processors and 10.5 terabytes (TB) of raw disk. The Sun systems will provide ready access to data on ARSC supercomputers, visualization resources and workstations, and will ease the burden of working with the large data volumes that many computations produce.

These systems will be connected to the center’s existing StorageTek™ data silos, which have been upgraded to include, for each Sun Fire™, six STK 9840B tape drives (20 gigabyte (GB) cartridge capacity, 19 megabytes (MB) per second transfer rate) and four STK 9940B tape drives (200 GB cartridge capacity, 30 MB per second transfer rate). The robotic tape silos store the massive volume of data generated by ARSC researchers. This more general solution will provide better access and retrieval and can be expanded to accommodate many different computational platforms and data sources.

“These drives will provide a significant boost in total transfer rate to and from the silos as well as a huge increase in capacity,” said ARSC storage specialist Gene McGill.
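The boost in total transfer rate can be sketched with a quick back-of-envelope calculation from the drive counts and native rates quoted above (this is an illustrative aggregate only; sustained throughput in practice depends on workload and silo configuration):

```python
# Aggregate native tape bandwidth attached to each Sun Fire 6800,
# using the drive counts and per-drive rates from the press release.
drives = [
    # (model, number of drives, native transfer rate in MB/s)
    ("STK 9840B", 6, 19),
    ("STK 9940B", 4, 30),
]

total_mb_s = sum(count * rate for _, count, rate in drives)
print(f"{total_mb_s} MB/s per Sun Fire")        # 234 MB/s per system
print(f"{2 * total_mb_s} MB/s across both")     # 468 MB/s across both systems
```

With ten drives per Sun Fire, the theoretical aggregate is 234 MB per second per system, or 468 MB per second across the pair.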

Computational Upgrades

In May, the center will install the first phase of a 128-processor high-efficiency Cray X1™ parallel vector system, which will be available to users later in the year. The system will be delivered with 512 GB of memory, with an option to upgrade when higher density memory becomes available. The X1 will boast a peak performance of 1.6 trillion floating-point calculations per second (1.6 teraflops) and will eventually replace the center’s current Cray T3E™ and Cray SV1ex™ systems.
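Dividing the stated system peak by the processor count gives a rough sense of the per-processor performance (a nominal figure derived only from the numbers above; the actual per-processor peak depends on clock rate and vector configuration):

```python
# Implied per-processor peak for the Cray X1, from the stated figures.
peak_tflops = 1.6    # stated system peak, in teraflops
processors = 128     # stated processor count

per_proc_gflops = peak_tflops * 1000 / processors
print(f"{per_proc_gflops} GFLOPS per processor")  # 12.5 GFLOPS (nominal)
```

That works out to roughly 12.5 gigaflops per processor, an order of magnitude more than typical commodity processors of the day.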

The center will also be adding an integrated-architecture IBM supercomputer system, composed of two IBM eServer p690 systems, each with 32 processors and 256 GB of memory, plus IBM eServer p655 systems with 2 GB of memory per processor. This system will be expanded with additional next-generation IBM eServer p655 systems and integrated with IBM's next-generation clustering technology to bring the overall peak theoretical performance of the supercomputer to five teraflops. The last phase of the installation will be in the fall of 2003.

The Cray and the IBM systems will each provide unique hardware and software capabilities that will allow ARSC users to tackle complex problems in a variety of fields. Currently, researchers use ARSC resources to solve problems in bioinformatics, global climate change, space physics, ocean circulation, galactic formation, computational fluid dynamics and arctic engineering.

“ARSC continues to move forward in providing its users with the best possible resources with which to explore data and create new understanding,” said ARSC director Frank Williams. “These tools will allow our researchers to continue finding solutions to the important questions of today and the future. We are anxious to launch into the next 10 years of computational science.”

About ARSC

The Arctic Region Supercomputing Center, located on the campus of the University of Alaska Fairbanks, supports computational research in science and engineering with emphasis on high latitudes and the Arctic. The center provides high performance computational, visualization, networking and data storage resources for researchers within the University of Alaska, other academic institutions, the Department of Defense and other government agencies. ARSC is a Shared Resource Center in the Department of Defense's High Performance Computing Modernization Program.

Events

ARSC Office has Relocated

As of November 3, 2014, the ARSC office has moved to the Elvey Building, suite 508, on the UAF campus. ARSC staff and User Support are available Monday through Friday, 8am to 5pm, in Elvey 508. The physical address for the new location is 903 Koyukuk Drive. Phone numbers, email addresses, and all other ARSC services have remained the same. The HPC clusters and archival storage silo will remain in the Butrovich Computing Facility.

The University of Alaska Fairbanks is an affirmative action/equal opportunity employer and educational institution and is a part of the University of Alaska system.
Arctic Region Supercomputing Center (ARSC) | PO Box 756020, Fairbanks, AK 99775 | voice: 907-450-8602 | fax: 907-450-8601 | Supporting high performance computational research in science and engineering with emphasis on high latitudes and the Arctic.
For questions or comments regarding this website, contact info@arsc.edu