The University of Tasmania is investing in a new high-performance computing (HPC) cluster to meet the needs of a research effort that is becoming increasingly data-intensive.

University Acting Deputy Vice-Chancellor (Research) Professor Clive Baldock said the investment – which will include hardware and data centre facilities – will strengthen the University’s research effort to solve urgent scientific problems in areas of existing strength, including climate and ocean sciences.

It will also enable the University to meet emerging demand in fields such as engineering, health and medical research, with a focus on genomics.

The HPC system will be housed in a new purpose-built research data centre at the University’s Sandy Bay Campus, with supplier Huawei to provide all equipment, racks, cables, and associated accessories.

The facility will support applications using more than 7,000 Central Processing Unit (CPU) cores, exposing researchers to the scaling problems of large core-count jobs and giving them essential experience should they want to migrate workloads to larger facilities such as the supercomputer at the National Computational Infrastructure (NCI) facility in Canberra.
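The "scaling problems" mentioned above are often summarised by Amdahl's law: any serial fraction of a job caps the speedup extra cores can deliver. A minimal sketch of the idea (the 95% parallel fraction and core counts are illustrative numbers, not measurements of this system):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Ideal speedup for a job whose serial fraction cannot be parallelised."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Even a job that is 95% parallel tops out near a 20x speedup,
# no matter how many of the cluster's 7,000+ cores it is given.
for cores in (64, 512, 7000):
    print(cores, round(amdahl_speedup(0.95, cores), 1))
```

This is why experience with large core-count jobs matters before moving to a facility like NCI: adding cores only pays off once the serial portions of a workload have been driven down.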

“The traditionally complex HPC facilities will now be significantly more accessible to a wider group of researchers due to many new available technologies, such as hybrid HPC-cloud and next generation machine-learning applications,” Professor Baldock said.

This new HPC purchase is part of a rolling program by the University of Tasmania to round out a portfolio of eResearch services providing cloud compute, HPC compute, and research data storage for Tasmanian researchers and their collaborators.

The overall design can accommodate a tripling in size of the HPC environment when required.

TPAC recently launched four compute nodes with the new Intel Xeon Phi processor, code-named ‘Knights Landing’ (KNL). This gives local researchers access to 256 cores of the new architecture.

Intel Xeon Phi KNL processors feature the second generation of Intel’s Many Integrated Core (MIC) architecture. Each processor features integrated on-package memory for higher memory bandwidth and includes up to 72 cores based on the Intel x86 architecture.

Each CPU core has two AVX-512 vector units (512-bit), so it can perform a large number of parallel operations each cycle. KNL also has 16 GB of MCDRAM (multi-channel DRAM), which delivers over 400 GB/s of bandwidth (around 4-5 times faster than the main memory of an equivalent Xeon CPU).

A single-socket KNL CPU easily delivers around 2 TFLOPS (or 3 TFLOPS for the higher-clocked variants), so they are quite fast. Each core supports 4 hardware threads, so a 64-core socket delivers 256 threads of execution. KNL CPUs achieve this at clock speeds well below traditional Xeon frequencies.
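The peak figure quoted above follows from the vector-unit arithmetic. A back-of-the-envelope sketch (the 64-core count and 1.3 GHz AVX clock are assumptions for a typical KNL part, not confirmed specifications of the TPAC nodes):

```python
cores = 64               # assumed core count for a single KNL socket
vpus_per_core = 2        # two AVX-512 vector units per core
dp_lanes = 512 // 64     # 8 double-precision lanes per 512-bit register
fma_ops = 2              # a fused multiply-add counts as two FLOPs
clock_ghz = 1.3          # assumed AVX clock frequency

peak_tflops = cores * vpus_per_core * dp_lanes * fma_ops * clock_ghz / 1000
print(f"peak ~= {peak_tflops:.1f} TFLOPS")  # consistent with the ~2 TFLOPS quoted
```

Higher-clocked or higher-core-count variants push the same arithmetic toward the 3 TFLOPS end of the range.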

The KNL processors will be critical in taking research that requires fast throughput to RAM to the next level, and TPAC will be assisting researchers with code adjustments to take advantage of these new CPUs.

TPAC can’t wait to see what this new architecture will allow researchers to achieve.

Cloud computing is fast becoming the most effective way for researchers to use “commodity” computing and storage resources for their eResearch needs. The NeCTAR Research Cloud (built on OpenStack) delivers these resources freely to researchers throughout Australia. This environment allows research to grow beyond desktop and local infrastructure, enabling truly global research collaboration that can expand and adapt to meet research needs.

To assist researchers in making the most of the Research Cloud, TPAC holds regular one-day Nectar Cloud Starter workshops, delivered as a series of tutorials. These workshops have enabled researchers to use this resource wherever it can assist their research.

TPAC has already enabled hundreds of users to gain access to the cloud and holds workshop sessions regularly; a number of workshops with vacancies are scheduled over the next few months. If you would like to find out more, check out our Training Page.

Researchers are often time-poor, and TPAC understands how important it is to offer services that are familiar, pre-configured with research tools and quick to deploy. With this in mind, TPAC has created a number of cloud-based Virtual Machines (VMs) that run within the National Research Cloud (NeCTAR).

By using the NeCTAR Cloud, researchers can quickly deploy VMs right around the country, wherever demand arises, and quickly link each VM to local services and data. Deployment is managed via a web browser, and researchers can use a VM as a virtual desktop on any PC. It is like having an additional PC anywhere in the country, always on, that can be shared with colleagues. This sharing aspect is critical to working in the collaborative world of science.

VMs offer many advantages over traditional stand-alone PCs or servers, and these advantages suit research workflows very well. VMs can be quickly deployed, copied, moved, shared and deleted, which makes them great for collaboration, testing and proof-of-concept work. VMs are not one-size-fits-all: they can be scaled and customised to suit requirements, or modified as needs change.

The response to these VMs has been very positive:

‘We used the x2go server whilst our visiting academic was in Australia, and it was AWESOME. It just worked wonderfully. You’ve done a great job setting up the system. From install, setup and subsequent usability it was great. It allowed the use of an interactive data analysis language and GUI they had experience with, but with direct access to the vast datasets we have on-site. They were up and running within half a day, which was great, as they were only here for 4 weeks. On top of this, it is all within the ‘NECTAR cloud’ so they can access it on their return home so we can continue our collaboration. Truly wonderful’ – Dr Tom Remenyi (ACE CRC).

TPAC has created setup and support documentation for new and existing NeCTAR Cloud users, covering our general remote desktop image as well as two popular environments, MATLAB and RStudio. Please note that due to licensing restrictions the MATLAB VM will not work beyond the UTAS infrastructure.

TPAC will also be providing free How-To workshops over the next few months (initially offered to researchers based at the University of Tasmania, the CSIRO Hobart offices and the Australian Antarctic Division). If you are interested in learning more about the NeCTAR Cloud and what it can offer your research, or in placing your name down for a spot in a workshop, please contact us.

One of the many services that TPAC offers to the research community is MARVL, the Marine Virtual Laboratory. The following is a brief introduction to MARVL, how it can be accessed, and what you can do with it.

Tasmania continues a 20-year tradition at the crest of the nation’s research computing capacity with the launch today of the Tasmanian eResearch Cloud.

The cloud will store important scientific data collections and also allow researchers to access enormous computing power from a standard desktop PC.

The $8.75 million project is the result of a collaboration between the University of Tasmania, CSIRO and Australian Antarctic Division. It is supported by both State and Federal governments.

It includes the Tasmanian node of the National eResearch Collaboration Tools and Resources Project (NeCTAR) research cloud, one of eight nodes nationally, federally funded by the Education Investment Fund.

University Vice-Chancellor Professor Peter Rathjen said the cloud was important in many ways, including its demonstration of the power of partnerships.

“None of the partners in this collaboration could have achieved this in isolation, but together we have accomplished something quite remarkable,” Professor Rathjen said.

“Infrastructure such as this is central to the University’s mission of being a research intensive institution and one of global significance.”

Professor Nathan Bindoff is a leading academic at the University of Tasmania’s Institute for Marine and Antarctic Studies. He is Director for the Tasmanian Partnership for Advanced Computing and led the development of the research cloud proposal.

The NeCTAR cloud comprises thousands of processors, each of which can run data analysis and simulations in parallel.

“Simulations and analysis of big data can take a long time,” Professor Bindoff said.

“Using the NeCTAR cloud, if you want to analyse something in 100 different ways, you can do that simultaneously.

“This will allow a significant acceleration in the amount of research we can conduct and the time it will take to do it.”
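The "100 different ways" pattern Professor Bindoff describes is embarrassingly parallel: each variant of the analysis is independent, so all of them can run at once. A minimal sketch of the idea (the toy `analyse` function is purely illustrative; on the research cloud each variant would typically run on its own virtual machine rather than a local thread):

```python
from concurrent.futures import ThreadPoolExecutor

def analyse(variant):
    # Stand-in for a real simulation or data analysis parameterised by `variant`.
    return variant ** 2

# Run 100 independent variants of the same analysis concurrently.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(analyse, range(100)))

print(len(results))  # 100 independent answers, computed side by side
```

Because no variant depends on another, the wall-clock time is set by a single run, not by the number of variants, which is where the acceleration comes from.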

Investment in the Tasmanian node of NeCTAR was about $1 million which is, coincidentally, about the purchase price for a supercomputer the University bought in collaboration with the CSIRO in 1995.

At the time of purchase that computer was ranked in the world’s top 500, with 12 processors delivering 1.2 gigaflops (a measure of the speed at which the computer operates). Those specifications pale in comparison to the Tasmanian node, which has 2,688 processors capable of more than 13,000 gigaflops.
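The comparison works out to a striking factor; a quick check of the arithmetic using the figures above:

```python
old_gflops, old_procs = 1.2, 12       # 1995 supercomputer
new_gflops, new_procs = 13000, 2688   # Tasmanian NeCTAR node

print(f"raw speed ratio: {new_gflops / old_gflops:,.0f}x")
print(f"per-processor ratio: {(new_gflops / new_procs) / (old_gflops / old_procs):.0f}x")
```

The aggregate machine is over ten thousand times faster for roughly the same price, and even each individual processor is dozens of times faster than one of the 1995 machine's.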

“The facility allows researchers to scale their analysis up as required, all on the fly. That’s the dynamic world we have entered with the research cloud,” Professor Bindoff said.

“What it means is that whether we are talking chemical science, or climate work or oceanography, we have the capacity to be competitive.”

NeCTAR is an Australian Government project established as part of the Super Science initiative by the Commonwealth of Australia, Department of Education and financed by the Education Investment Fund.

The Climate Futures Project, supported by the NERP Landscapes and Policy hub (http://www.nerplandscapes.edu.au/), has developed fine-scale regional climate projections for the Australian Alps (Climate Futures for the Alps, or CFA). These unique regional climate projections will help researchers and land managers with the conservation of biodiversity.

The projections have been generated on TPAC’s high-performance computing facilities and are now available via TPAC’s data service facilities at the University of Tasmania’s data centre, with the support of RDSI. For access to the CFA data, go to the CFA data access service here.

In consultation with land managers and hub researchers, the Climate Futures Project has generated ecological indices that can be used with multi-model projections of future climate in the Australian Alps. Using the new climate projections, researchers, planners and managers will have a greater capacity to explore the likely implications of climate change for priority species, communities and threatening processes.

Any Australian with a home computer and an internet connection can now power up a climate model and help scientists find the causes of record high temperatures and drought that hit Australia and New Zealand in 2013.

The online climate experiment, Weather@Home has been created by a group of scientists from the University of Melbourne, the ARC Centre of Excellence for Climate System Science, University of Oxford, the UK Met Office, the University of Tasmania, and the National Institute of Water and Atmospheric Research in NZ.

By signing up to Weather@Home, computer users can create climate model simulations that produce 3D representations of weather for 2013. They can watch these evolve in real time or let them run quietly in the background.

TPAC is currently hosting the services that home computers communicate with; it will provide new data to thousands of home users for processing, as well as store the resulting simulations returned from their computers.

TPAC, with funding provided by the Research Data Storage Infrastructure (RDSI) project, will store the processed simulations for use by the research community. The data could be used to assess the possible role of climate change in Australia’s Black Saturday bushfires in 2009, the record rain events in New Zealand in 2011, and the record rain events in eastern Australia in 2010 and 2011.

MARVLIS (the Marine Virtual Laboratory Information System) has now been completed. MARVLIS is an “add-on”, a value-added software package that provides a number of tools for the marine research community.

If you would like to learn more about MARVLIS, please visit the MARVLIS blog. MARVLIS has been produced with help from ANDS (the Australian National Data Service) and the CSIRO.