It’s impossible to model the physical world exactly, whether with clay, with digital simulation, or even with a supercomputer. And it certainly can’t be done for an aircraft engine that has not yet been perfected.
The subject comes up as a team of researchers headquartered at Stanford University tries to quantify the uncertainty inherent in their supercomputer computations so that engineers can build appropriate tolerances into their designs.
The group is investigating hypersonic aircraft, which would fly through the atmosphere at six to twelve times the speed of sound—and in particular the engine that might power such flight. Hypersonic passenger flight, if ever realized, could change the way we traverse the globe—from Chicago to Tokyo in under two hours, for example.
In the process of conducting their research, group members have created a new computer language that will make it easier to program and run complex simulations on the supercomputers of the future.
Measuring what the engineers call epistemic uncertainty is common in real-world experimental work, where researchers typically note uncertainties in their measurements in the form of error bars representing ranges of upper and lower limits.
But hard numbers generated by computer models don’t have uncertainty bars and therefore can take on an unwarranted, potentially dangerous air of certainty, said Parviz Moin, a professor of mechanical engineering and director of Stanford’s Center for Turbulence Research.
“We’ve asked experimentalists for a long time to give us uncertainty bars on their measurements,” he said. “But I think the time is right to ask the computationalists to do the same.”
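The experimental convention Moin describes can be sketched in a few lines: repeated measurements of one quantity yield a mean and a standard error, which become the reported value and its error bar. The numbers below are invented purely for illustration, not taken from the Stanford experiments:

```python
import statistics

# Hypothetical repeated measurements of a single quantity
# (illustrative values only).
measurements = [101.2, 99.8, 100.5, 100.1, 99.4, 100.9, 100.3, 99.7]

mean = statistics.mean(measurements)
# Standard error of the mean: sample standard deviation / sqrt(n).
sem = statistics.stdev(measurements) / len(measurements) ** 0.5

# Report the value with an uncertainty bar of +/- 2 standard errors.
lower, upper = mean - 2 * sem, mean + 2 * sem
print(f"{mean:.2f} (error bar: {lower:.2f} to {upper:.2f})")
```

Asking "computationalists" to do the same means producing an analogous interval around a simulated quantity, which is harder because a simulation's errors come from modeling choices rather than instrument noise.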
And that uncertainty is important as team members call upon supercomputer simulations to understand hypersonic flight, which so far has been achieved only in brief, often troubled test flights.
What You Don’t Know
The first step toward quantifying computational uncertainty—which is the range of uncertainty presented by the error bars—is acknowledging that uncertainty is inherent in computational simulations, just as it is in real-world experiments, said Juan Alonso, associate professor of aeronautics and astronautics at Stanford.
“When you base decisions on computations that are in some way imperfect, you make errors,” he said.
The key to understanding that imperfection is in knowing what you don’t know in order to quantify computational error, Alonso added. With that knowledge, scientists can then build computational uncertainty into a supercomputer simulation. The resulting simulation will give a more accurate depiction of reality than a simulation that doesn’t correct for computational uncertainty.
Alonso is part of Stanford’s Predictive Science Academic Alliance Program, which is attempting to quantify the uncertainty inherent in the numerical predictions that drive supercomputer simulations. PSAAP is a collaboration of the departments of mechanical engineering, aeronautics and astronautics, computer science, and mathematics, plus Stanford’s Institute for Computational and Mathematical Engineering.
PSAAP researchers use supercomputers to model the physical complexities of the hypersonic environment—specifically how fuel and air flow through a hypersonic aircraft engine known as a scramjet engine. The project has received a five-year, $20 million grant from the U.S. Department of Energy, Moin said. He’s also faculty director of PSAAP.
A hypersonic aircraft would fly at speeds up to Mach 12, or around 9,200 miles per hour, allowing transit between any two points on the globe in less than two hours. The aircraft would need to be powered by a scramjet engine, in which combustion takes place in airflow above the speed of sound.
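The Mach 12 figure can be sanity-checked with a one-line conversion, assuming the sea-level speed of sound of roughly 343 metres per second (the true value varies with altitude and temperature):

```python
# Rough sanity check of the Mach 12 figure, assuming the sea-level
# speed of sound; the actual value falls with altitude.
speed_of_sound_m_s = 343.0   # dry air at about 20 C, sea level
m_s_to_mph = 2.23694         # metres per second to miles per hour

mach_12_mph = 12 * speed_of_sound_m_s * m_s_to_mph
print(f"Mach 12 is about {mach_12_mph:,.0f} mph")  # about 9,200 mph
```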
Hypersonic flight is a particularly intense engineering challenge because engineers must account for the mechanical forces placed on the structure of the plane and the physics of the sophisticated engines that must operate in the extremes of the upper atmosphere where the planes would fly, Alonso said.
“When you want to fly at six to 15 times the speed of sound, the supersonic combustion ramjet, or scramjet, is an option,” Alonso said. “Essentially it’s a duct going at high speeds; you inject fuel in the duct and burn it.”
The scramjet relies on high vehicle speed to compress and decelerate the incoming air before combustion. Airflow throughout the entire engine remains supersonic, meaning the air travels through the engine faster than the speed of sound. This allows the scramjet to operate efficiently at extremely high speeds.
Because they lack mechanical compressors, scramjets require the high kinetic energy of a hypersonic flow to compress the incoming air to operational conditions.
But for the most part the engines have been subject to few flight tests, Alonso said, and all have been problematic. Hypersonic flight within the atmosphere generates immense drag, and temperatures found on the aircraft and within the engine can be much greater than those of the surrounding air.
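That heating follows directly from compressible-flow physics: air brought to rest against the vehicle approaches its stagnation temperature. A minimal ideal-gas estimate is sketched below; real hypersonic flows involve high-temperature chemistry that this simple formula ignores:

```python
# Stagnation temperature ratio for an ideal gas:
#   T0 / T = 1 + (gamma - 1) / 2 * M^2
# A rough indication of aerodynamic heating at hypersonic speeds.
gamma = 1.4  # ratio of specific heats for air

def stagnation_ratio(mach):
    return 1 + (gamma - 1) / 2 * mach ** 2

for mach in (6, 12):
    print(f"Mach {mach}: air brought to rest heats by a factor of "
          f"about {stagnation_ratio(mach):.1f}")
```

At Mach 12 the ratio is nearly 30, which is why materials and engine components face temperatures far beyond those of the surrounding air.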
Real-world laboratories can only go so far in reproducing such conditions, and test vehicles are rendered extraordinarily vulnerable. Many hypersonic aircraft tests have ended in vehicle failure, Alonso said.
“We do wind tunnels on the ground as we can,” Alonso said. “But there’s always surprises when we make guesses that we can’t afford to simulate.”
All that said, there have been two successful U.S. government tests of the engine.
In 2004, NASA’s X-43A scramjet-powered unmanned experimental hypersonic research aircraft made two flights, one at Mach 7 and the other at Mach 10. It was launched from a carrier plane and used a booster rocket to reach target speed and altitude. The booster was discarded, and the X-43 flew free using its own scramjet engine. Booster rockets couldn’t work for passenger hypersonic flight, as they couldn’t be routinely discarded.
Then in May 2010, the Boeing X-51 WaveRider completed a flight. The aircraft is an unmanned scramjet demonstration aircraft for hypersonic flight testing at speeds around Mach 6. This aircraft also uses a rocket booster.
“Even in the successful tests, they’ve had some incidents, like unstart,” Alonso said. “So instead of having air go through in the proper direction, the flow reverses, and the aircraft falls out of the sky.”
PSAAP researchers are specifically focusing on the unstart problem, Moin said.
“If you put too much fuel in the engine when you try to start it, you get a phenomenon called thermal choking, where shock waves propagate back through the engine,” Moin said. “Essentially, the engine doesn’t get enough oxygen and it dies. It’s like trying to light a match in a hurricane.”
A simulation of temperature fluctuations from a scramjet engine’s exhaust recently run at Stanford is one of the largest engineering calculations ever undertaken, he added.
The group has been running simulations, but researchers realize these simulations can never perfectly represent reality, Alonso said.
“When we do computer simulations we try to simulate the real world as best we can,” Alonso said. “But with the limitation of cost and power available, we have to make choices as to how we model the real world and to make tradeoffs that result in a certain number of errors that creep into our computations.
“A major problem is that we can’t afford to model the most complete physics of an object. We’d need to run that on computers 100 to 1,000 times larger than we have available today,” he said. “So we do modeling of it in certain areas, like combustion pressure. And we get something close. We get good answers when we do that, but with some errors.”
In addition to computational errors, the hypersonic vehicles are themselves subject to uncertainties in how they behave in the air, Alonso said.
Because so few physical models exist for hypersonic aircraft and scramjet engines, researchers can have a hard time telling how accurate their simulations truly are. They don’t have successful measurements from physical prototypes to compare against, Alonso said.
That’s why the researchers are looking for ways to quantify the uncertainty inherent in the simulations, he added.
“We’re still learning about scramjet simulations because there’s only a few of them, unlike with commercial air flight,” he said. “But without the error bars we’ll keep doing this and it’ll keep failing.”
“We don’t want to just predict the solution; we want to predict the error bars on the solution,” he added.
The error bars will help the scramjet engineers build appropriate tolerances into their designs.
To get there, researchers have tested their computational work against physical experiments run in wind tunnels, Alonso said.
They’ve also partnered with the Stanford High Temperature Gasdynamics Lab and the Flow Physics and Computational Engineering group to develop a number of new techniques to validate how they build physical models into the PSAAP computer code, Moin added.
They also run smaller simulations, quantify the uncertainty found within those simulations, and then roll the uncertainty up within progressively larger simulations, Alonso said.
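One common way to roll input uncertainties through a model is brute-force Monte Carlo sampling: perturb the inputs, rerun the model many times, and read the error bar off the spread of the outputs. The sketch below is illustrative only; the toy model and its input distributions are invented for this example, not drawn from PSAAP’s actual codes:

```python
import random
import statistics

random.seed(0)

# Toy stand-in for an expensive simulation: "thrust" as a made-up
# function of fuel rate and inlet temperature (illustrative only).
def toy_model(fuel_rate, inlet_temp):
    return 100 * fuel_rate - 0.05 * inlet_temp

# Input uncertainties from smaller studies, expressed as mean and
# standard deviation (assumed values).
samples = []
for _ in range(10_000):
    fuel = random.gauss(1.0, 0.05)    # ~5% fuel-rate uncertainty
    temp = random.gauss(250.0, 10.0)  # ~10-unit temperature uncertainty
    samples.append(toy_model(fuel, temp))

mean = statistics.mean(samples)
spread = 2 * statistics.stdev(samples)  # roughly a 95% error bar
print(f"thrust estimate: {mean:.1f} +/- {spread:.1f}")
```

The catch, as Alonso notes later, is cost: every point on the error bar requires another full run of the model, which is exactly why larger machines and smarter sampling matter.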
Collectively, these insights will enable the design of safer, more reliable hypersonic engines. But PSAAP’s advances in quantifying uncertainty have other, far broader implications, Alonso said.
“These same technologies can be used to quantify flow of air around wind farms, for example, or for complex global climate models,” he said. “I was in Los Alamos talking with people who are interested in global climate, and guess what? Just like the models for the scramjet, right now their climate models are far from perfect, but it doesn’t stop them from pushing ahead.”
PSAAP researchers run their simulations on supercomputers at Lawrence Livermore National Laboratory in California and Sandia National Laboratories in Albuquerque, N.M.
To run their simulations, the PSAAP researchers use an open-source dataset called HyShot 2. Researchers in the Centre for Hypersonics at the University of Queensland in Brisbane, Australia, developed the original dataset for their HyShot project, which aimed to demonstrate the possibility of supersonic combustion under flight conditions and to compare flight results with shock-tunnel experiments.
The project has involved one successful launch of a scramjet designed by the British company QinetiQ and the successful launch of one engine designed by the University of Queensland. Both scramjets were accelerated by a rocket.
For PSAAP, modeling the unstart phenomenon requires a clear understanding of the physics and then reproducing mathematically the immensely complex interactions that occur at hypersonic speeds, which explains why researchers from various backgrounds are working together on the project.
One reason computational uncertainty quantification is a relatively new science is that, until recently, the necessary computer resources simply didn’t exist.
“Some of our latest calculations run on 163,000 processors simultaneously,” Moin said. “I think they’re some of the largest calculations ever undertaken.”
It takes specialized knowledge to get computers of this scale to perform effectively, and that’s not something the scientists and engineers themselves should have to worry about, Alonso said.
Right now, professors like Alonso and Moin are doing their own programming and coding, based on their needs, he added.
“Mechanical engineers and those of us in aeronautics and astronautics understand the flow and combustion physics of scramjet engines and the predictive tools. We need the computer scientists to help us figure out how to run these tests on these large computers,” Alonso said.
That need will only increase over the next decade as supercomputers move toward the exascale—computers with a million or more processors able to execute a quintillion calculations in a single second, said Patrick Hanrahan, a Stanford computer science professor who’s also at work on the PSAAP project.
“From a computer scientist’s point of view, the interest in exascale computing is to build exaflop machines that use only 20 megawatts of power to run data centers and cell phones,” Hanrahan said.
Today’s most powerful supercomputers—like those that run the PSAAP simulations—are in the 10 to 20 petaflop range, he added. Very large, but still small when compared to today’s simulation and computational power needs.
Hanrahan predicts a 1,000-fold increase in computational power by 2018.
“That’ll put us in the range we need, but no one knows what those computers will look like,” he said. “They’ll likely have millions of processors and independent computational units. And no one knows how they’ll be programmed.”
To help with that future programming problem, Hanrahan and several colleagues have created Liszt, a programming language named after the 19th-century Hungarian composer Franz Liszt and designed for computers that don’t yet exist.
This programming language of the future separates the computation from the coding, he said.
“Liszt can solve partial differential equations—like for fluid flow and turbulence—and then it automatically writes low-level code that can target any one of these future machines without engineers having to rewrite code for these new computers that will come along,” Hanrahan said.
“This is a domain-specific language that can target any machine that will come along,” he added. “But we also want to build a system that’s easy to use. We don’t want everyone to turn into a computer scientist, so we’re trying to make this a productive programming environment.”
Liszt will also help quantify uncertainty within simulations run on these future machines, Alonso said.
“All the techniques we’re developing now to put error bars on simulations mean you have to run the simulations multiple times to create the bars,” he said. “But Liszt will allow us to run on exascale machines without the need to run as many simulations to find the uncertainty.”
And, in the future, that uncertainty will be easily included in all simulations. So does that mean we’ll be flying faster to Tokyo from Chicago? Maybe.