New computing cluster boosts UTM research power

A new high-powered computing cluster is expanding data-handling capabilities for researchers at U of T Mississauga. Launched in February, the UTM Research Cluster offers high-power data processing, improved security, data storage and backup, and centralized software and hardware administration, along with substantial cost and time savings for researchers and departments analyzing large volumes of data.

“Almost all facets of university research require computation,” says Professor Bryan Stewart, Vice-Principal, Research at UTM. “Whether it be scholarship in digital humanities, crunching big sets of economic or census data, image processing of satellite and drone-acquired data, structural data of proteins or bioinformatics analysis, computational power is required in all disciplines.”

The Information & Instructional Technology Services initiative began two years ago at the request of researchers who sought increased capabilities in handling large-scale computations. Systems analyst Brian Novogradac worked closely with administrative teams at SciNet and Dell to create the cluster, which boasts five servers and has the capacity for future expansion. “It has 340 cores with close to three terabytes of RAM, amounting to 1.3 CPU teraflops of computing power,” he says. “It’s super-fast.”

The research cluster runs Linux-based software programs such as Python, R and Trinity, with the ability to run multiple versions of the programs or other software as needed. “There’s a lot of testing in the back end, which we handle, so researchers can just go ahead and use it,” Novogradac says. “This takes the work off their desktop computers and moves the data into a central, up-to-date system with secure storage and increased power. It’s also a shared resource, so the system doesn’t sit idle.”
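As an illustration, the kind of parallel workload that moves well from a desktop to a many-core cluster node can be sketched in Python, one of the languages the cluster supports, using the standard library's `multiprocessing` module. The `analyze` function and the sample data here are hypothetical placeholders, not part of the cluster's actual software:

```python
from multiprocessing import Pool

def analyze(sample):
    # Placeholder for a CPU-bound analysis step, e.g. scoring
    # one sequence, one image tile, or one census record batch.
    return sum(x * x for x in sample)

if __name__ == "__main__":
    # Hypothetical dataset: eight "samples" of numeric readings.
    samples = [list(range(n, n + 5)) for n in range(8)]

    # Pool spreads the calls across available CPU cores; the same
    # script scales from a laptop to a node with hundreds of cores.
    with Pool() as pool:
        results = pool.map(analyze, samples)

    print(results)
```

On a desktop this runs on a handful of cores; on a shared cluster node the identical code can fan out across many more workers, which is the scaling benefit the article describes.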

The system, which can be accessed by UTM researchers and graduate students with supervisor permissions, provides a middle ground between desktop systems and large supercomputing facilities such as SciNet and its Niagara cluster, which serve regional computing demands for data processing. “We are using the same software and similar procedures as SciNet, so researchers have the option to scale up if they need more power,” Novogradac says.

For researchers, it’s a boon to have computational power and administrative support on campus. “What we do is dynamic,” says UTM cell and systems biologist Adriano Senatore. “It can be challenging to access supercomputer clusters—you submit a job to the queue and wait for days or weeks for your turn, only to have analysis halted because of an error. It can waste a lot of time. With the new cluster, if we want to try a new algorithm, we now have expertise on site to implement it into the system right away. This is a lot more efficient.”

“This is a very attractive feature to new faculty recruits,” Stewart adds. “Having easy access to a mid-level cluster that is customizable, expandable and with a high level of expert support is a great advantage for anyone, not just new faculty, looking to improve their computations.”

Over the next six weeks, the I&ITS team will hold information and training sessions to introduce the cluster to departments across the campus. Staff and faculty are also invited to enter the naming contest for the computing cluster. The contest closes March 29, 2018.