To break in Canada’s newest, most powerful research supercomputer, the University of Toronto’s Richard Peltier is running a “heroic calculation” – one that is expected to shed new light on how the world’s oceans physically function.
It’s unknown how long it will take the more than $18-million machine known as Niagara to crunch the millions of gigabytes of real-time data now streaming to it from the ocean bottom of the Pacific.
“It’s never been done before,” said Peltier, a globally renowned climate change expert. “It could be days or even a week depending on the spatial resolution we decide to work at.”
Unveiled today, Niagara is a massive network of 60,000 cores – the equivalent of roughly 60,000 powerful desktop computers – that can be tasked to work together simultaneously on a single, humongous problem.
Known as a large parallel system, it is the only machine of its kind in Canada and is housed in a secure, nondescript facility in Vaughan, Ont. It’s open to all Canadian university researchers and is part of a national network of research computing infrastructure.
With Niagara up and running for a week, the SciNet team has started feeding Peltier’s data into it. He came up with the idea of running a heroic calculation after discussing with colleagues how best to strenuously test the power of the large parallel system.
“By devoting the entire machine, not only a portion of it, to this one calculation – that’s why it’s ‘heroic,’” said Peltier, a U of T University Professor of physics and scientific director of SciNet. “This is pure, curiosity-driven research. We hope the results will warrant publication and be a major coup for Niagara.”
Running a similar calculation on the old SciNet supercomputer would have taken roughly 20 times longer.
The calculation, done in partnership with researchers at the University of Michigan and the Jet Propulsion Laboratory at Caltech, is attempting to answer a fundamental research question that holds great interest for researchers in a number of fields.
In the 1970s, oceanographers Chris Garrett and Walter Munk famously theorized the world’s oceans are filled with internal waves ricocheting back and forth from the ocean bottom to the surface and predicted the shape of the power spectrum that should be observed in these waves.
The waves are generated by the barotropic tide causing the water in the oceans to slosh back and forth horizontally in response to the gravitational pull of the sun and moon. Their intensity is magnified by bumps along the ocean floor – the bumpier the bottom, the stronger the wave, Peltier explained. When waves break, turbulence is generated and causes friction, which makes the ocean dissipative and “sticky.”
But for more than four decades, scientists have lacked an accurate, high-resolution model of the detailed physics of this interaction, leaving them unable to see whether the theoretical arguments are correct, he said.
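The shape Garrett and Munk predicted can be illustrated schematically. Below is a minimal sketch – not the team’s actual model – using the standard textbook frequency dependence of the GM76 spectrum, B(ω) ∝ f/(ω√(ω² − f²)) for frequencies between the local inertial frequency f and the buoyancy frequency N; the latitude and buoyancy-frequency values are illustrative assumptions, not values from the Niagara calculation.

```python
import math

# Schematic Garrett-Munk (GM76) frequency dependence for the internal-wave
# spectrum: B(omega) = (2/pi) * f / (omega * sqrt(omega**2 - f**2)),
# valid only for f < omega < N, where f is the local inertial (Coriolis)
# frequency and N is the buoyancy frequency.

OMEGA_EARTH = 7.2921e-5                # Earth's rotation rate, rad/s
LATITUDE_DEG = 21.0                    # illustrative: roughly the latitude of Hawaii
f = 2 * OMEGA_EARTH * math.sin(math.radians(LATITUDE_DEG))  # inertial frequency, rad/s
N = 5.2e-3                             # illustrative buoyancy frequency, rad/s

def gm_shape(omega):
    """Unnormalized GM76 frequency dependence; defined for f < omega < N."""
    if not (f < omega < N):
        raise ValueError("omega must lie between f and N")
    return (2 / math.pi) * f / (omega * math.sqrt(omega**2 - f**2))

# The spectrum is steeply red: energy piles up near the inertial frequency f
# and falls off sharply toward the buoyancy frequency N.
near_f = gm_shape(1.01 * f)
near_N = gm_shape(0.5 * N)
print(f"shape near f: {near_f:.3e}")
print(f"shape near N/2: {near_N:.3e}")
```

A high-resolution simulation like the Niagara run would, in effect, test whether the wiggles measured in the real ocean fall along a curve of this predicted shape.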
To conduct the Niagara calculation, his team of collaborators is using data from ocean sensors called McLane profilers in selected patches of the Pacific Ocean – one near the Hawaiian Islands, where the ocean bottom is very bumpy, and one in the open ocean of the central-west Pacific, where the floor is smoother.
This information will then be coupled with atmospheric data to model the formation, intensity and life span of these waves as they dissipate over time.
“I’d like to think that we’ll be able to verify at very high spatial resolution the internal wave spectrum,” said Peltier. “Hopefully we’ll be able to shout, ‘Eureka, we’ve not only seen wiggles, we’ve seen wiggles of the right set of [wave] phase speeds that the ocean should be filled with’” – as predicted by Garrett and Munk.
This calculation will “assure us that when we do put an ocean model to work in the context of a global warming calculation, for example, that we can feel secure that the physical process is properly represented,” he added.
University of Michigan oceanographer Brian Arbic said understanding the actions of internal waves more fully will also have a profound impact on the study of ocean temperatures, salinity, circulation and marine biology, which are “crucial for Earth’s climate, marine resources and uptake of carbon and heat by the Earth’s oceans.”
“This is a first for our community and implies that we have the potential for modelling internal gravity waves more realistically than ever before,” he said.
“The U of T supercomputer is extremely important to this work. It is a very large and cutting-edge machine. We would not be able to do this calculation right now without access to it.”
Internal waves breaking in flow over bumps – whether mountain tops on the surface of a continent or features on the ocean floor at great depth – have been an area of intense interest for Peltier since the beginning of his career.
He’s already planning how he’ll apply the results of Niagara’s heroic calculation to Ice Age conditions, when sea levels were much lower and waves broke farther offshore, away from the continental slopes.
“The spectrum of waves in the oceans and the dissipation of waves should be dramatically different,” he said. “I’m expecting the stickiness of the ocean will change dramatically.”