The wildfire burned through pine trees only a few miles away from Martin City, Montana, just outside of Glacier National Park. It was growing steadily, but fire managers had reason to think the mile-wide Hungry Horse Reservoir would act as a buffer and protect the town. Still, they sent a team of responders to the other side, just in case.
Soon enough, a thunderstorm intensified the winds and sent firebrands flying across the northern tip of the lake, setting a new blaze. Firefighters responded immediately to protect a campground and homes before it could spread to the town.
The decision to send a crew across the reservoir in advance of the flames wasn’t just a lucky guess. Software helped responders see that strong winds could spread the fire. Then, when those conditions kicked in, they were ready. Property, trees and, most important, lives were saved.
Mark Finney, a researcher with the US Forest Service, analyzed the projections for the 2003 blaze near Hungry Horse with FarSite, a fire prediction program he wrote in 1992 that’s still used today. The software doesn’t turn fire analysts into fortune tellers — Finney says he didn’t know for sure the fire would jump the lake — but it lets them prepare for possibilities.
“That wasn’t a forecast that it would happen,” he says. “It was a scenario that showed what could happen.”
Programmers have been using software to analyze wildland fires, and eventually to project where they might spread next, since computers came into existence. But following the fire at Hungry Horse, which was part of the larger Blackfoot Lake Complex Fire, the software programs written by government agencies and private companies for fire response teams have become more efficient and precise. Researchers are now creating systems that will more accurately predict fire movement, sometimes several days into the future, while computing labs are streamlining the way crucial information about fires is shared in real time. First responders can then adjust their projections within minutes — rather than hours — giving firefighters more time to respond to a blaze and stop it from spreading.
The improvements are needed because fire seasons in places like the western United States, Canada and Australia are getting longer and more destructive. The problem was clear in Northern California in August, when nearly 12,000 lightning strikes over a week sparked the second and third largest fires in the state’s history. As responders deal with several fire complexes that continue to burn near cities and towns and in rural communities, they’re relying on the fast-growing field of fire science and advances in software programming to handle the challenge.
From a base camp in California’s Napa County, outside the LNU Lightning Complex Fire, fire behavior analyst Robert Clark says he’s making projections using three different programs that help predict what the fire could do next. Stretching across five counties in the state’s wine country and redwood forests, the blaze, which began Aug. 17, has burned more than 375,000 acres. While no program can provide a perfect prediction, the software gives experts like Clark an idea of what might be coming. One of the programs, Wildfire Analyst, comes from Spanish software maker Technosylva. The company began partnering with California earlier this year and aims to clear up the chaos of information available to analysts like Clark.
“You have to be able to provide the precise amount of information that is meaningful,” says Technosylva founder Joaquin Ramirez.
More fire in the future
The 2020 fires are the latest in a series of unprecedented infernos locally and around the world. In California, they follow the Camp Fire of 2018, the deadliest and most destructive in the state’s history, burning 153,336 acres and devastating the town of Paradise in the Sierra Nevada foothills. At least 85 people were killed, and millions in the Bay Area 150 miles away were forced to shelter in place to avoid hazardous levels of air pollution. In Australia, a destructive wildfire season in 2019 and 2020 burned homes and businesses across a staggering 46.3 million acres, killing 35 people. An estimated 1 billion animals also died, leaving scientists to fear some vulnerable species like the Kangaroo Island dunnart are on the verge of extinction.
Andrew Sullivan, a fire research team leader for the Commonwealth Scientific and Industrial Research Organisation, an Australian government research agency, says the work of modeling massive fires isn’t easy.
“We’re trying to understand one of the most complex natural phenomena that anyone is likely to experience,” he says.
There are two reasons why wildfire emergencies are becoming more common: population and climate.
“People are living more in places prone to fire,” Sullivan says. “But changes to the climate are exposing more areas to the likelihood of fire.”
Climate change and fires are now caught in a feedback loop. Rising global temperatures make fires more likely because they extend dry seasons and create drier plant life that’s more likely to burn in hotter weather. Fires in turn release more carbon dioxide into the atmosphere and remove carbon-absorbing trees from the environment.
Software can’t stop either of these factors, but it can make fire responders more nimble and help moderate the damage.
Getting ahead of fires
Humans began trying to model active wildfires in the early 20th century using analog tools. Radios, paper maps and tables of data guided fire responders, including my own grandfather.
In 1947 at age 18, Wilbur got a job in a lookout tower in the Kootenai National Forest of Montana. His charge was to call in any fires sparked in the wildland valley below, not far from where the Blackfoot Lake Complex Fire burned nearly 60 years later.
Teenagers in towers are no longer the height of fire intelligence, which now comes from drones, satellites and infrared cameras. But it took a lot of experimenting and improvements in computing power to create software that could run faster than fire.
In the days of mainframes and punch cards, researchers ran fire modeling software written in Fortran IV, an early programming language, and projected a fire’s spread forward as a one-dimensional line. Researchers could only see if their algorithms were correct after the fire, and there was little chance of projecting how a fire might move while it was still in progress.
Soon, faster supercomputers showed the potential to model fires in real time. But these room-sized, specialized and expensive machines weren’t available in the offices of fire response agencies around the country. Fire-modeling software had to work within the constraints of your typical government-budget PC. So programmers came up with workarounds.
Predicting the spread
First, they took what scientists already knew affected fire behavior: weather, wind speed, the kinds of plant life (or fuel type) in the region and how dry that fuel was. Then, after analyzing that information, they created tables to show how fast the fire would spread. The next step was to take a one-dimensional movement of fire, which only gave a sense of the fire’s direction, and translate it to a two-dimensional map to show how a fire would grow in the next several hours or days.
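The table-lookup idea can be sketched in a few lines of Python. Everything below — the fuel names, base rates and adjustment factors — is illustrative and made up for demonstration; real fire models derive these numbers from measured fuel properties.

```python
# Illustrative sketch of a rate-of-spread table lookup.
# Fuel types and all numbers are invented for demonstration only.

BASE_SPREAD = {        # base rate of spread, meters per minute
    "grass": 20.0,
    "brush": 8.0,
    "timber": 2.0,
}

def rate_of_spread(fuel, wind_kph, fuel_moisture):
    """Crude 1D spread rate: base rate scaled up by wind, damped by moisture."""
    base = BASE_SPREAD[fuel]
    wind_factor = 1.0 + 0.1 * wind_kph               # faster wind, faster fire
    moisture_factor = max(0.1, 1.0 - fuel_moisture)  # wetter fuel, slower fire
    return base * wind_factor * moisture_factor

# A dry, windy grass fire spreads far faster than damp, calm timber:
print(rate_of_spread("grass", wind_kph=30, fuel_moisture=0.05))
print(rate_of_spread("timber", wind_kph=5, fuel_moisture=0.3))
```

The output of such a lookup is still one-dimensional — a single speed in the fire’s direction of travel — which is why the next step, mapping it onto a two-dimensional perimeter, was needed.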
This required a bit of “tricky geometry,” Sullivan says. What programmers landed on, he says, was a way to make a crude approximation of a fire perimeter.
They needed a simple rule for calculating how a fire’s perimeter spreads. So they borrowed a formula from a different area of science: the movement of waves. It happened to be accurate enough to make predictions about wildfires, but also simple enough not to crash the computer in a fire response center.
Using waves as a stand-in for fire makes a certain amount of sense, if you picture the perimeter of a fire pulsing forward into the surrounding landscape like waves rippling out from a stone dropped in a pond. To be sure, fires are controlled by very different physical processes than waves are, but it works as an approximation. What mattered most was that the programs were small and nimble enough to work on regular PCs in the 1990s.
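The wave analogy can be sketched as code: treat every point on the fire’s perimeter as the source of a small outward ripple, and move each point along its outward direction, nudged by the wind. This is a toy version under assumed numbers, not FarSite’s actual math.

```python
import math

def propagate_perimeter(points, spread_rate, wind_dx, wind_dy, dt):
    """Move each point of a closed polygon outward along its normal,
    wave-style, then nudge it downwind. points is a list of (x, y)."""
    n = len(points)
    new_points = []
    for i, (x, y) in enumerate(points):
        # Approximate the outward normal from the two neighboring points.
        x_prev, y_prev = points[i - 1]
        x_next, y_next = points[(i + 1) % n]
        tx, ty = x_next - x_prev, y_next - y_prev   # tangent along perimeter
        length = math.hypot(tx, ty) or 1.0
        nx, ny = ty / length, -tx / length          # tangent rotated 90 degrees
        new_points.append((x + (spread_rate * nx + wind_dx) * dt,
                           y + (spread_rate * ny + wind_dy) * dt))
    return new_points

# Start with a small circular fire perimeter and push it outward one step.
circle = [(math.cos(2 * math.pi * k / 32), math.sin(2 * math.pi * k / 32))
          for k in range(32)]
grown = propagate_perimeter(circle, spread_rate=0.5,
                            wind_dx=0.2, wind_dy=0.0, dt=1.0)
```

Running the step repeatedly turns the circle into an egg shape stretched downwind — the same rough picture a ripple in a pond would make if the water were flowing.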
Updating the program
Fire scientists are now working on programs that predict the spread of fires based on the principles of computational fluid dynamics. This area of physics looks at how atmospheric forces play off one another at a fine physical scale, pushing at each other while transferring heat and physical matter around the environment. Unlike waves, these are the real physical forces that make fires burn, grow and move.
But since heavy computing power is required to run these physics-based programs, they still aren’t ready for prime time. As a result, fire scientists have looked to new programming techniques to get faster and more precise predictions from programs like FarSite, or the Australian equivalent, Phoenix RapidFire. Now that video and infrared images can stream in real time, for example, programmers can get fire data into the software faster than the days when it had to be transferred on memory cards — or film. And with better computing power, PCs can now run more complex, nimble software.
At Sullivan’s research agency in the Black Mountain Nature Reserve outside of Canberra, computer scientists have built a program that aims to be more adaptable and precise than Phoenix RapidFire. The resulting program for fire responders’ PCs, Spark, has made it easier to change out different types of data, including fuel type. That’s crucial, Sullivan says, because like all wildfires, Australia’s blazes behave very differently depending on what’s burning, whether it’s eucalypt forests (the oil inside the trees is incredibly flammable) or scrubbier bushland.
Spark gives scientists a new understanding of the way fire perimeters move. For example, it can more accurately portray how the edge of a fire will move when the curled, dry bark of the eucalyptus tree turns to embers, blowing more than 18 miles ahead of a blaze to set new fires. These far-flung embers are what most often put homes in danger, Sullivan says.
Juicing up the algorithm
Wildfire can move incredibly fast — at one point, the 2018 Camp Fire spread at the equivalent of one football field every second — so it’s also critical that computers can analyze all of the data about a blaze quickly. Fire scientists at the Wifire lab in San Diego are developing a program that can digest real-time information about a fire’s location, along with weather conditions and other data. The program, run out of the San Diego Supercomputer Center in partnership with UC San Diego, can feed this information into FarSite or any other fire-modeling program.
It could eventually feed the data into the physics-based programs being run out of supercomputers, says Wifire founder and director Ilkay Altintas.
“When it comes to fire modeling, I don’t think one size fits all,” Altintas says. Using a variety of different programs, she adds, can “help us use the right program for the right problem.”
The speed at which Wifire can digest information is helpful in two ways. First, the fast delivery of data allows fire modeling programs to make more precise predictions, creating new models within minutes based on real-time data. Second, Wifire’s program creates a feedback loop, comparing how fire-modeling software predicted a fire would move with what actually happened. The program can then update the underlying modeling algorithm, making it better at projecting how this specific fire will behave — all while the fire is still burning.
That’s drawn the interest of fire departments in California, including the Orange County Fire Authority, which partnered with the Wifire lab to take infrared images of wildfires from a plane and feed the data into the Wifire system.
And despite its name, Wifire isn’t just for blazes. Altintas says the goal is to use it for other disasters, like mapping the spread of floods or of smoke plumes from fires.
“We need to go beyond fire modeling,” she says. “So everything can advance together.”