Canada’s most powerful research supercomputer was unveiled today at the University of Toronto – where it will be available to researchers of all disciplines across the country.
Niagara is a massive network of 60,000 cores – the equivalent of roughly 60,000 powerful desktop computers – that can be tasked to work together simultaneously on a single, humongous problem. It is funded by the Canada Foundation for Innovation, the Government of Ontario and U of T.
“Niagara is a major leap forward in advanced computing power for scholars engaged in cutting-edge, big data research from aerospace and astrophysics to health research and machine learning, and increasingly the social sciences and humanities,” said Vivek Goel, U of T’s vice-president of research and innovation.
The new system is the first major upgrade at SciNet, U of T’s high-performance computing division, since 2008 – and it is 10 times more powerful than its predecessor. Niagara’s 1,500 servers, or nodes, each containing 40 cores, provide more than three petaflops of processing power, supported by 12 petabytes (12 million gigabytes) of storage. Connecting the servers is a high-speed highway of more than 40 kilometres of fibre optic network cables. It’s this network that differentiates Niagara from other systems in Canada and makes it possible to use all 60,000 cores together to solve a single problem simultaneously.
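The headline figures above can be checked with simple arithmetic. The short sketch below uses only the numbers quoted in this article (node count, cores per node, storage in petabytes); it is an illustrative back-of-the-envelope check, not an official specification.

```python
# Figures as quoted in the article (assumed values, not a spec sheet)
nodes = 1_500          # servers, or nodes
cores_per_node = 40    # cores in each node

total_cores = nodes * cores_per_node
print(total_cores)     # 60000 – matches the 60,000 cores cited above

storage_pb = 12                         # petabytes of storage
storage_gb = storage_pb * 1_000_000     # 1 PB = 1,000,000 GB (decimal units)
print(storage_gb)                       # 12000000 – i.e. 12 million gigabytes
```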
“Advanced research computing is the backbone of innovation,” Goel said. “Niagara will give researchers the compute power they need to study and find solutions to some of the world's biggest challenges.
“Researchers will scale research projects to the magnitude of compute power available, and Niagara will be a resource for Canada’s brightest minds, not just at the University of Toronto, but across the country."
While more powerful than its predecessor, Niagara also requires less power to operate – savings roughly equivalent to the electricity used by 300 average family homes.
Niagara, the only large parallel system of its kind in Canada, is housed in a secure, nondescript facility in Vaughan, Ont.
“Access to such a powerful system, believed to rank among the top 50 supercomputers in the world, is also crucial to help us train – and retain – much-needed, highly qualified personnel,” Goel said.
For Niagara’s first test, U of T’s Richard Peltier is running a “heroic calculation” – one that is expected to shed new light on how the world’s oceans physically function.
It’s unknown how long it will take the more than $18-million machine to crunch the millions of gigabytes of real-time data now streaming to it from the Pacific Ocean floor.
“It’s never been done before,” said Peltier, a globally renowned climate change expert. “It could be days or even a week depending on the spatial resolution we decide to work at.”
The SciNet team has already begun feeding Peltier’s data into Niagara. He came up with the idea of running a heroic calculation after discussing with colleagues how best to stress-test the new large parallel system.
“By devoting the entire machine, not only a portion of it, to this one calculation – that’s why it’s ‘heroic,’” said Peltier, a U of T University Professor of physics and scientific director of SciNet. “This is pure, curiosity-driven research. We hope the results will warrant publication and be a major coup for Niagara.”
The calculation will be done in partnership with researchers at the University of Michigan and the Jet Propulsion Laboratory at Caltech.
Running a similar calculation on the old SciNet supercomputer would have taken roughly 20 times longer.
“The U of T supercomputer is extremely important to this work,” said University of Michigan oceanographer Brian Arbic. “It is a very large and cutting-edge machine. We would not be able to do this calculation right now without access to it.”