
What the complex math of fire modeling tells us about the future of California’s forests


At the height of California’s worst wildfire season on record, Geoff Marshall looked down at his computer and realized that an enormous blaze was about to take firefighters by surprise.

Marshall runs the fire prediction team at the California Department of Forestry and Fire Protection (known as Cal Fire), headquartered in Sacramento, which gives him an increasingly difficult job: anticipating the behavior of wildfires that become less predictable every year.

The problem was obvious from where Marshall sat: California’s forests were caught between a management regime devoted to growing thick stands of trees—and eradicating the low-intensity fire that had once cleared them—and a rapidly warming, increasingly unstable climate.

As a result, more and more fires were crossing a poorly understood threshold from typical wildfires—part of a normal burn cycle for a landscape like California’s—to monstrous, highly destructive blazes. Sometimes called “megafires” (a scientifically meaningless term that loosely refers to fires that burn more than 100,000 acres), these massive blazes are occurring more often around the world, blasting across huge swaths of California, Chile, Australia, the Amazon, and the Mediterranean region.

At that particular moment in California last September, several unprecedented fires were burning simultaneously. Together, they would double the record-setting acreage of the 2018 wildfire season in less than a month. But just as concerning to Marshall as their size was that the biggest fires often behaved in unexpected ways, making it harder to forecast their movements.

To face this new era, Marshall had a new tool at his disposal: Wildfire Analyst, a real-time fire prediction and modeling program that Cal Fire first licensed from a California-based firm called TechnoSylva in 2019.

The work of predicting how fires spread had long been a matter of hand-drawn ellipses and models so slow that analysts set them running before bed and hoped they would be finished by morning. Wildfire Analyst, by contrast, funnels in data from dozens of distinct feeds: weather forecasts, satellite images, and measures of moisture in a given area. Then it projects all of that onto an elegant graphic overlay of fires burning across California.

A modeling tool called Wildfire Analyst shows how a blaze in California might spread over a period of eight hours. The red objects are buildings.

Every night, while fire crews sleep, Wildfire Analyst seeds those digital forests with millions of test burns, pre-calculating their spread so that human analysts like Marshall can do simulations in a matter of seconds, creating “runs” they can port to Google Maps to show their superiors where the biggest risks are. But this particular risk, Marshall suddenly realized, had slipped past the program.
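The overnight pre-calculation described above is essentially a caching strategy: run every expensive simulation while no one is waiting, then answer daytime queries with a lookup. A minimal sketch of the idea, with an invented stand-in for the simulation itself (nothing here reflects TechnoSylva's actual implementation):

```python
def simulate_spread(cell, hours):
    """Stand-in for an expensive fire-spread simulation. A real run
    would take minutes; here it is a trivial formula over toy inputs."""
    fuel_load, wind_speed = cell
    return fuel_load * wind_speed * hours

def precompute(grid, hours=24):
    """Overnight pass: one simulated burn per landscape cell,
    stored so daytime queries become dictionary lookups."""
    return {pos: simulate_spread(cell, hours) for pos, cell in grid.items()}

# A two-cell toy landscape: (fuel load, wind speed) per cell.
grid = {(0, 0): (1.2, 5.0), (0, 1): (0.4, 5.0)}
cache = precompute(grid)

# Daytime query: an instant lookup instead of a fresh simulation.
acres_at_risk = cache[(0, 0)]  # 144.0 for this toy cell
```

The trade-off is the one the article implies: the cached answers are only as fresh as last night's inputs, which is exactly what failed at Big Creek.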

The display now showed a cluster of bright pink and green polygons creeping over the east flank of the Sierras, near the town of Big Creek. The polygons came from FireGuard, a real-time feed from the US Department of Defense that estimates the current locations of active wildfires, and one of the many feeds ported directly into Wildfire Analyst. They were spreading up the Big Creek drainage, far faster than they should have been.

In its calculations, Wildfire Analyst had made a number of assumptions. It “saw,” on the other side of Big Creek, a dense stand of heavy timber. Such stands were traditionally thought to impede the rapid spread of fire, which models attribute largely to fine fuels like pine straw.

But Marshall suddenly realized, as the algorithms driving Wildfire Analyst had not, that the drainage held all the ingredients for a perfect firestorm. That “heavy timber,” he knew, was in fact a huge patch of dead trees weakened by beetles, killed by drought, and baked by two weeks of 100 °F heat into picture-perfect firewood. And the Big Creek valley would focus the wind onto the fire like a bellows. With no weather station at the mouth of the creek, the program couldn’t see all that.

Marshall went back to his computer and re-ran some numbers with the new variables factored in. He watched on his screen as the fire spread at frightening speed across the Sierra. “I went to the operation trailer and told my uppers: I think it’s going to jump the San Joaquin River,” he recalls. “And if it does, it’s going to run big.”

This was, at that moment, a far-fetched claim—no California fire had ever made a nine-mile run in heavy timber, no matter how dry. But in this case, the trees’ combustion created powerful plumes of superheated air that drove the fire on. It jumped the river and raced through the timber to a reservoir known as Mammoth Pool, where a last-minute airlift saved 200 campers from fiery death.

The Creek Fire was a case study in the challenge facing today’s fire analysts, who are trying to predict the movements of fires that are far more severe than those seen just a decade ago. Since we understand so little about how fire works, they’re using mathematical tools built on outdated assumptions, as well as technological platforms that fail to capture the uncertainty in their work. Programs like Wildfire Analyst, while useful, give an impression of precision and accuracy that can be misleading.

Getting ahead of the most destructive fires will require not simply new computational tools but a sweeping change in how forests are managed. Along with climate change, generations of land and environmental management decisions—intended to preserve the forests that many Californians feel a duty to protect—have inadvertently created this new age of hyper-destructive fire.

But if these massive fires continue, California could see the forests of the Sierra erased as thoroughly as those of Australia’s Blue Mountains. Avoiding this nightmare scenario will require a paradigm shift. Residents, fire commanders, and political leaders must switch from a mindset of preventing or controlling wildfire to learning to live with it. That will mean embracing fire management techniques that encourage more frequent burns—and ultimately allowing fires to forever transform the landscapes that they love.

Shaky assumptions

In late October, Marshall shared his screen and took me on a tour of Wildfire Analyst. We watched the fluorescent FireGuard polygons of a new flame “finger” break out from the smoldering August Complex. With a few clicks, he laid four tiny virtual fires along the real fire’s edge, on the far side of the fire line that had blocked its progress. A few seconds later, fire blossomed across the simulated landscape. Under current conditions, the model estimated, a fire that broke out at those points could “blow out” to 8,000 acres—a nearly three-mile run—within 24 hours.

For Marshall and the rest of Cal Fire’s analysts, Wildfire Analyst provides a standardized platform on which to share data from fires they’re watching, projections about the runs they might make, and hacks to make a simulated fire approximate the behavior of a real one. With that information, they try to anticipate where a fire is going to go next, which in theory can drive decisions about where to send crews or which regions to evacuate.      

Like any model, Wildfire Analyst is only as good as the data that feeds it—and that data is only as good as our scientific understanding of the phenomenon in question. When it comes to the mechanics of wildland fire, that understanding is “medieval,” says Mark Finney, director of the US Forest Service’s Missoula Fire Lab.

Our current approach to fire modeling, which powers every real-time analytic platform including TechnoSylva’s Wildfire Analyst, is built on a particular set of equations that a researcher named Richard Rothermel derived at the Fire Lab nearly half a century ago to calculate how fast fire would move, with given wind conditions, through given fuels.

Rothermel’s key assumption—perhaps a necessary one, given the computational tools available at the time, but one we now know to be false—was that fires spread only through radiation as the front of the flame catches fine fuels (pine straw, leaf litter, twigs) on the ground.

That spread, Rothermel found, drove outward in a thin, expanding edge along an ellipse. To figure out how a fire would grow, firefighters in the field used “nomograms”: premade graphs that assigned specific values for wind speed, slope, and fuel conditions to reveal an average speed of spread.
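In code, the nomogram workflow amounts to looking up a base rate of spread for the fuel, scaling it for wind and slope, and stretching the result into an ellipse. A toy sketch of that shape of calculation; the additive form and every number below are illustrative, not Rothermel's actual equations or coefficients:

```python
def rate_of_spread(base_ros, wind_factor, slope_factor):
    """Toy rate of spread: a no-wind, no-slope base rate scaled up
    by wind and slope multipliers (illustrative only)."""
    return base_ros * (1 + wind_factor + slope_factor)

def ellipse_after(hours, head_ros, length_to_breadth=2.0):
    """Project an ignition point forward as an ellipse: fast spread
    at the head of the fire, slower spread on the flanks."""
    run = head_ros * hours           # forward run at the head
    flank = run / length_to_breadth  # narrower spread to the sides
    return run, flank

# e.g. a 0.1 mph base rate, doubled by wind and raised 50% by slope
head = rate_of_spread(0.1, 1.0, 0.5)  # 0.25 mph at the head
run, flank = ellipse_after(3, head)   # projected shape after 3 hours
```

A pencil, a topo map, and a folder of nomograms did the same arithmetic; the computer just repeats it across every cell of the landscape.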

fire behavior chart

US DEPARTMENT OF AGRICULTURE

In his early days in the field, Finney says, “you would spread your folder of nomograms on the hood of your pickup and make your projections in thick pencil,” charting on a topo map where the fire would be in an hour, or two, or three. Rothermel’s equations allowed analysts to model fire like a game of Go, across homogeneous cells of a two-dimensional landscape.

This is where things have stood for decades. Wildfire Analyst and similar tools represent a repackaging of this approach more than a fundamental improvement on it. (TechnoSylva did not respond to multiple interview requests.) What’s needed now is less a technique for real-time prediction than a fundamental reappraisal of how fire works—and a concerted effort to restore California’s landscapes to something approaching a natural equilibrium.

Complications

The problem for products like Wildfire Analyst, and for analysts like Marshall, is easy to state and hard to solve. A fire is not a linear system, proceeding from cause to effect. It is a “coupled” system in which cause and effect are tangled up. Even on the scale of a candle, ignition kicks off a self-sustaining reaction that deforms the environment around it, changing the entire system further—fuel decaying into flame, sucking in more wind, which stokes the fire further and breaks down more fuel.

Such systems are notoriously sensitive to even small changes, which makes them fiendishly difficult to model. A small variance in the starting data can lead, as with the Creek Fire calculations, to an answer that is exponentially wrong. In terms of this kind of nonlinear complexity, fire is a lot like weather—but the computational fluid dynamic models that are used to build forecasts for, say, the National Weather Service require supercomputers. The models that try to capture the complexity of a wildland blaze are typically hundreds of times simpler.

Pioneering scientists like Rothermel dealt with this intractable problem by ignoring it. Instead, they searched for factors, such as wind speed and slope, that could help them predict a fire’s next move in real time.

Looking back, Finney says, it’s a miracle that Rothermel’s equations work for wildfires at all. There’s the sheer difference in scale—Rothermel derived his equations from tiny, controlled fires set in 18-inch fuel beds. But there are also more fundamental errors. Most glaring was Rothermel’s assumption that fire spreads only by radiation, instead of through the convection currents that you see when a campfire flickers.

This assumption isn’t true, and yet for some fires, even huge ones like 2017’s Northwest Oklahoma Complex, which burned more than 780,000 acres, Rothermel’s spread equations still seem to work. But at certain scales, and under certain conditions, fire creates a new kind of system that defies any such attempt to describe it.

The Creek Fire in California, for example, didn’t just go big. It created a plume of hot air that pooled under the stratosphere, like steam against the lid of a pressure cooker. Then it popped through to 50,000 feet, sucking in air from below that drove the flames on, creating a storm system—complete with lightning and fire tornadoes—where no storm should have been.

Other huge, destructive fires appear to ricochet off the weather, or each other, in chaotic ways. Fires usually quiet down at night, but in 2020, two of the biggest runs in California broke out at night. Since heat rises, fires usually burn uphill, but in the Bear Fire, two enormous flame heads raced 22 miles downhill, a line of tornadic plumes spinning between them.

Finney says we don’t know if the intensity caused the strange behaviors or vice versa, or if both rose from some deeper dynamic. One measure of our ignorance, in his view, is that we can’t even map its limits: “It would be really nice to know when our current models will work and when they won’t,” he says.

Illusions

To Finney and other fire scientists, the danger with products like Wildfire Analyst is not necessarily that they’re inaccurate. All models are. It’s that they hide solutions inside a black box, and—far more important—focus on the wrong problem.

Unlike Wildfire Analyst, the older generation of tools required analysts to know precisely what hedges and assumptions they were making. The new tools leave all that to the computer. Such products play into the field’s obsession with modeling, scientist after scientist told me, despite the fact that no model can predict what fire will do.

“You can always calibrate the system afterward to match your observations,” says Brandon Collins, a wildfire research scientist at UC Berkeley. “But can you predict it beforehand?”

Doing so is a question of science rather than technology: it would require primary research to develop and test a new theory of flame. But such work is expensive, and most wildfire research money is awarded to solve specific technical problems. The Missoula Fire Lab survives on the remnants of a Great Society–era budget; its sister facility, the Macon Fire Lab in Georgia, was shut down in the 1990s.

Collins and Finney are doing what they can with the funds available to them. They’re both part of a public-private fire science working group called Pyregence that’s converting a grain silo into a furnace to see how large logs, like the fallen timber on Big Creek, spread fire.

Meanwhile, Finney’s team at the Missoula Fire Lab is working to develop a data set that answers fundamental questions about fire—a potential basis for new models. They aim to describe how wind on smoldering logs drives new flame fronts; quantify the likelihood that embers cast by a flame will “spot,” or ignite, new fires; and study the role that pine forests seem to play in encouraging their own burning.

The point of those models is less to see where a particular fire will go once it’s broken out, and more to serve as a planning tool to help Californians better manage the fire-prone, fire-suppressed landscape they live in.

Like ecosystems in Chile, Portugal, Greece, and Australia—all regions that have recently seen more megafires—California’s conifer forests evolved over thousands of years in which natural and human-caused fires periodically cleared out excess fuel and created the space and nutrients for new growth.

Before the 19th century, Native Americans are thought to have deliberately burned about as much of California every year as burned there in 2020. Similar practices survived until as recently as the 1970s—ranchers in the Sierra foothills would burn brush to encourage new growth for their animals to eat. Loggers pulled tons of timber from forests groomed to produce huge volumes of it, burning the debris in place.

controlled burn technique

JOSH BERENDES / UNSPLASH

Then, as ranchers went bust and sold their land to developers, pastureland became residential communities. Clean-air regulations discouraged the remaining ranchers from burning. And decades of conflict between environmental organizations and logging companies ended, in the 1990s, with loggers deserting the forests they had once clear-cut.

In the Sierra—as in these other regions now prone to huge, destructive fires—a heavily altered landscape that was long ago torn from any natural equilibrium was largely abandoned. Millions of acres of pine grew in, packed and thirsty. Eventually many were killed by drought and bark beetles, accumulating into a preponderance of fuel. Fires that could have cleared the land and reset the forest were extinguished by the US Forest Service and Cal Fire, whose primary objective had become wholesale fire suppression.

Breaking free of this legacy won’t be easy. The future Finney is working toward is one where people can compare various models and decide which will work best for a given situation. He and his team hope better data will lead to better planning models that, he says, “could give us the confidence to let some fires burn and do our work for us.”

Still, he says, focusing too much on models risks missing a more important question: “What if we are ignoring the basic aspect of wildfire—that we need more fire, proper fire, so that we don’t let wildfire surprise and destroy us?”

Living with wildfires

In 2014, the King Fire raged across the California Sierra, leaving a burn scar where trees have still not regrown. Instead, says Forest Service silviculturist Dana Walsh, they’ve been replaced by thick mats of chaparral, fire-prone shrubland that has squeezed out the forest’s return.

“People ask what happens if we just let nature take its course after a big fire,” Walsh says. “You get 30,000 acres of chaparral.”

This is the danger that landscapes from the Pyrenees to the California Sierra to Australia’s Blue Mountains now face, says Marc Castellnou, a Catalan fire scientist who is a consultant to TechnoSylva. Over the last two decades, he’s studied the rise of megafires around the world, watching as they smashed records for length or speed of runs.

For too long, he says, California’s fire and forest policy has resisted an inevitable change in the landscape. The state doesn’t need flawless predictive tools to see where its forests are headed, he says: “The fuel is building up, the energy is building up, the atmosphere is getting hotter.” The landscape will rebalance itself.

California’s choice—as in Catalonia, where Castellnou is chief scientist for the autonomous region’s 4,000-person fire corps—is to either move with that change and have some chance of influencing it, or be bowled over by megafires.

The goal is less to regenerate native forests in these areas—which Castellnou believes have been made obsolete by climate change—than to work with the landscape to develop a new type of forest where wildfires are less likely to blow out into massive blazes.

In large measure, his approach lies in returning to old land management techniques. Rural people in his region once controlled destructive fires by starting or allowing frequent, low-intensity fires, and using livestock to eat down brush in the interim. They planted stands of fire-resistant hardwood species that stood like sentinels, blocking waves of flame.

For Castellnou, though, this also means making politically difficult choices. In July 2019, just outside of Tivissa, Spain, I watched him explain to a group of rural Catalan mayors and olive farmers why he had let the area around their towns burn.

He’d worried that if crews slowed the Catalan fires, they might cause the fire to form a pyrocumulonimbus—a violent cloud of fire, thunder, and wind like the one that formed over the Creek Fire. Such a phenomenon could have spurred the fire on until it took the towns anyway. Now, he says, gesturing to the burn scar, the towns had a fire defense in place of a liability. It was another tile in a mosaic landscape of pasture, forest, and old fire scars that could interrupt wildfire.

As tough as planned burns are for many to swallow, letting wildfires burn through towns—even evacuated ones—is an even tougher sell. And replacing pristine Sierra Nevada forests with a landscape able to survive both drought and the most destructive fires—say, open stands of ponderosa pine punctuated by fields of grass, picked over by goats or cattle—might feel like a loss.

Doing any of this well means adopting a change in philosophy as big as any change in predictive tech or science—one that would welcome fire back as a natural part of the environment. “We are not trying to save the landscape,” Castellnou says. “We are trying to help create the next landscape. We are not here to fight flames. We are here to make sure we have a forest tomorrow.”


Freemium isn’t a trend — it’s the future of SaaS


As the COVID-19 lockdowns cascaded around the world last spring, companies large and small saw demand slow to a halt seemingly overnight. Enterprises weren’t comfortable making big, long-term commitments when they had no clue what the future would hold.

Innovative SaaS companies responded quickly by making their products available for free or at a steep discount to boost demand.

While Zoom gets all the attention, there were hundreds of free SaaS tools to help folks through the pandemic. Pluralsight ran a #FreeApril campaign, offering free access to its platform for all of April. Cloudflare made its Teams product free from March until September 1, 2020. GitHub went free for teams in April and slashed the price of its paid Team plan.

A selection of new free, free trial and low-priced offerings from leading SaaS companies. Image Credits: Kyle Poyar/OpenView.

The free products were aimed squarely at end users — whether it be a developer, individual marketer, sales rep or someone else at the edge of an organization. These end users were stuck at home during the pandemic, yet they desperately needed software to power their working lives.

End users prefer to do the vast majority of their research online before ever talking to a sales rep, making free products the ideal way to reach them. Many end users want to jump straight into a product, no hassle or credit card or budget approval required.

After they’ve set up an account and customized it for their workflow, end users have essentially already made a purchase decision with their time — all without ever feeling like they were in an active buying cycle.

An end user-focused free offering became an essential SaaS survival strategy in 2020.

But these free offerings didn’t go away as lockdowns loosened up. SaaS companies instead doubled down on freemium because they realized that doing so had a real and positive impact on their business. In doing so, they busted the outdated myths that have held 82% of SaaS companies back from offering their own free plan.

Myth: A free offering will cannibalize paying customers

GoDaddy is a digital behemoth, known for being a ’90s-era pioneer in web domains as well as for its controversial Super Bowl ads. The company has steadily diversified into business software, now generating roughly $700 million in ARR from its business applications segment and reaching millions of paying customers. There are very few businesses that would see greater potential revenue cannibalization from launching a free product than GoDaddy.

But GoDaddy didn’t let fear stop them from testing freemium when lockdowns set in. Freemium started out as a small-scale experiment in spring 2020 for the websites and marketing product. GoDaddy has since increased the experiment to 50% of U.S. website traffic, with plans to scale to 100% of U.S. traffic and open availability to other markets in 2021.


Metafy adds $5.5M to its seed round as the market for games coaching grows


This morning Metafy, a distributed startup building a marketplace to match gamers with instructors, announced that it has closed an additional $5.5 million to its $3.15 million seed round. Call it a seed-2, seed-extension or merely a baby Series A; Forerunner Ventures, DCM and Seven Seven Six led the round as a trio.

Metafy’s model is catching on with its market. According to its CEO Josh Fabian, the company has grown from incorporation to gross merchandise volume (GMV) of $76,000 in around nine months. That’s quick.

The startup is building in public, so we have its raw data to share. Via Fabian, here’s how Metafy has grown since its birth:

From the company. As a small tip, if you want the media to care about your startup’s growth rate, share like this!

When TechCrunch first caught wind of Metafy via prior seed investor M25, we presumed that it was a marketplace built to let esports pros and other highly capable gamers teach esports hopefuls how to get better at their chosen title. That’s not the case.

Don’t think of Metafy as a marketplace where you can hire a former professional League of Legends player to help improve your laning-phase AD carry mechanics. Though that might come in time. Today a full 0% of the company’s current GMV comes from esports titles. Instead, the company is pursuing games with strong niche followings, what Fabian described as “vibrant, loyal communities.” Like Super Smash Brothers, its leading game today in terms of GMV generated.

Why pursue those titles instead of the most competitive games? Metafy’s CEO explained that his startup has a particular take on its market — that it focuses on coaches as its core customer, over trainees. This allows the startup to focus on its mission of making coaching a full-time gig, or at least one that pays well enough to matter. By doing so, Metafy has cut its need for marketing spend, because the coaches it onboards bring their own audience. This is why the company is targeting games with super-dedicated user bases, like Smash. They fit neatly into its model: build for coaches, onboard coaches, coaches bring their fans, GMV gets generated.

Metafy has big plans, which brings us back to its recent raise. Fabian told TechCrunch any game with a skill curve could wind up on Metafy. Think chess, poker or other games that can be played digitally. To build toward that future, Metafy decided to take on more capital so that it could grow its team.

So what does its $5.5 million unlock for the startup? Per its CEO, Metafy is currently a team of 18 with a monthly burn rate of around $80,000. He wants it to grow to 30 folks, with nearly all of its new hires going into its product org, broadly.

TechCrunch’s perspective is that gaming is not becoming mainstream, but that it has already done so. Building for the gaming world, then, makes good sense, as tools like Metafy won’t suffer from the same boom/bust cycles that can plague game developers. Especially as the startup becomes more diversified in its title base.

Normally we’d close by noting that we’ll get back in touch with the company in a few quarters to see how it’s getting on in growth terms. But because it’s sharing that data publicly, we’ll simply keep reading. More when we have a few months’ more data to chew on.


Snap to launch a new Creator Marketplace this month, initially focused on Lens Creators


Snap on Wednesday announced its plan to soon launch a Creator Marketplace, which will make it easier for businesses to find and partner with Snapchat creators, including Lens creators, AR creators and later, prominent Snapchat creators known as Snap Stars. At launch, the marketplace will focus on connecting brands and AR creators for AR ads. It will then expand to support all Snap Creators by 2022.

The company had previously helped connect its creator community with advertisers through its Snapchat Storytellers program, which first launched into pilot testing in 2018 — already a late arrival to the space. However, that program’s focus was similar to Facebook’s Brand Collabs Manager, as it focused on helping businesses find Snap creators who could produce video content.

Snap’s new marketplace, meanwhile, has a broader focus in terms of connecting all sorts of creators with the Snap advertising ecosystem. This includes Lens Creators, Developers and Partners, and then later, Snap’s popular creators with public profiles.

Snap says the Creator Marketplace will open to businesses later this month to help them partner with a select group of AR Creators in Snap’s Lens Network. These creators can help businesses build AR experiences without the need for extensive creative resources, which makes access to Snap’s AR ads more accessible to businesses, including smaller businesses without in-house developer talent.

Lens creators have already found opportunity working for businesses that want to grow their Snapchat presence — even allowing some creators to quit their day jobs and just build Lenses for a living. Snap has been further investing in this area of its business, having announced in December a $3.5 million fund directed toward AR Lens creation. The company said at the time there were tens of thousands of Lens creators who had collectively made over 1.5 million Lenses to date.

Using Lenses has grown more popular, too, the company had noted, saying that more than 180 million people interact with a Snapchat Lens every day — up from 70 million daily active users of Lenses when the Lens Explorer section first launched in the app in 2018.

Now, Snap says that over 200 million Snapchat users interact with augmented reality on a daily basis, on average, out of its 280 million daily users. The majority (over 90%) of these users are 13- to 25-year-olds. In total, users are posting over 5 billion Snaps per day.

Snap says the Creator Marketplace will remain focused on connecting businesses with AR Lens Creators throughout 2021.

The following year, it will expand to include the community of professional creators and storytellers who understand the current trends and interests of the Snap user base and can help businesses with their ad campaigns. The company will not take a cut of the deals facilitated through the Marketplace, it says.

This would include the creators making content for Snap’s new TikTok rival, Spotlight, which launched in November 2020. Snap encouraged adoption of the feature by shelling out $1 million per day to creators of top videos. In March 2021, over 125 million Snapchat users watched Spotlight, it says.

Image Credits: Snapchat

Spotlight isn’t the only way Snap is challenging TikTok.

The company also on Wednesday announced it’s snagging two of TikTok’s biggest stars for its upcoming Snap Originals lineup: Charli and Dixie D’Amelio. The siblings, who have gained over 20 million follows on Snapchat this past year, will star in the series “Charli vs. Dixie.” Other new Originals will feature names like artist Megan Thee Stallion, actor Ryan Reynolds, twins and influencers Niki and Gabi DeMartino, and YouTube beauty vlogger Manny Mua, among others.

Snap’s shows were watched by over 400 million people in 2020, including 93% of the Gen Z population in the U.S., it noted.
