Gartner: Q3 smartphone sales down 5.7% to 366M, slicing Covid-19 declines in Q1, Q2

We are now into the all-important holiday sales period, and new numbers from Gartner point to some recovery underway for the smartphone market as vendors roll out a raft of new 5G handsets.

Gartner's Q3 figures, published today, put smartphone sales at 366 million units, a decline of 5.7% globally compared to the same period last year. Yes, it's a drop, but it is still a clear improvement on the first half of this year, when sales slumped by 20% in each quarter, due largely to the effects of Covid-19 on spending and consumer confidence overall.

That confidence is being further bolstered by some other signals. We are coming out of a relatively strong string of sales days over the Thanksgiving weekend, traditionally the “opening” of the holiday sales cycle. While sales on Thursday and Black Friday were at the lower end of predicted estimates, they still set records over previous years. With a lot of tech like smartphones often bought online, this could point to stronger numbers for smartphone sales as well.

On top of that, last week IDC — which also tracks and analyses smartphone sales — published a report predicting that sales would grow 2.4% in Q4 compared to 2019’s Q4. Its take is that while 5G smartphones will drive buying, prices still need to come down on these newer-generation handsets for them to really land with wider audiences. The average selling price for a 5G-enabled smartphone in 2020 is $611, said IDC, but it thinks that by 2024 that will come down to $453, likely driven by Android-powered handsets, which have collectively dominated smartphone sales for years.

Indeed, in terms of brands, Samsung, with its Android devices, continued to lead the pack in overall units, with 80.8 million sold and a 22% market share. In fact, the Korean handset maker and China’s Xiaomi were the only two in the top five to see their sales grow in the quarter, at 2.2% and 34.9% respectively. Xiaomi’s numbers were strong enough for it to overtake Apple and take the number-three slot in the overall sales rankings for the quarter. Huawei just about held on to number two. See the full chart further down in this story for more detail.

Also worth noting: overall mobile sales — a figure that includes both smartphones and feature phones — were down 8.7% to 401 million units. That underscores not just how few feature phones are selling at the moment (smartphones can often even be cheaper to buy, depending on the brands involved or the carrier bundles), but also that those less sophisticated devices are seeing even more sales pressure than more advanced models.

Smartphone slump: it’s not just Covid-19

It’s worth remembering that even before the global health pandemic, smartphone sales were facing slowing growth. The reasons: after a period of huge enthusiasm from consumers to pick up devices, many markets reached saturation. And then, the latest features were too incremental to spur people to trade up and pay a premium for newer models.

In that context, the big hope from the industry has been 5G, which has been marketed by both carriers and handset makers as having more data efficiency and speed than older technologies. Yet when you look at the wider roadmap for 5G, rollout has remained patchy, and consumers by and large are still not fully convinced they need it.

Notably, in this past quarter there was still some evidence that emerging and developing markets continue to drive growth, in contrast to saturated markets, where new features are the main lever.

“Early signs of recovery can be seen in a few markets, including parts of mature Asia/Pacific and Latin America. Near normal conditions in China improved smartphone production to fill in the supply gap in the third quarter which benefited sales to some extent,” said Anshul Gupta, senior research director at Gartner, in a statement. “For the first time this year, smartphone sales to end users in three of the top five markets i.e., India, Indonesia and Brazil increased, growing 9.3%, 8.5% and 3.3%, respectively.”

The more positive Q3 figures coincide with a period this summer when new Covid-19 cases were slowing in many places and many restrictions were being relaxed. All eyes are now on the coming holiday period, at a time when Covid-19 cases have picked up with a vengeance and there is no rollout (yet) of large-scale vaccination or therapeutic programs. That is putting an inevitable drag on the economy.

“Consumers are limiting their discretionary spend even as some lockdown conditions have started to improve,” said Gupta of the Q3 numbers. “Global smartphone sales experienced moderate growth from the second quarter of 2020 to the third quarter. This was due to pent-up demand from previous quarters.”

Digging into the numbers, Samsung has held on to its top spot, although its growth in the quarter was modest. Even so, Samsung is still a long way ahead of the rest of the pack.

That is in part because number-two Huawei, with 51.8 million units sold, was down by more than 21% on last year. It has been having a hard time in the wake of a public relations crisis and sanctions in the US and UK, stemming from accusations that its equipment is used by China for spying. (Indeed, the timing of those UK sanctions was moved up just last night.)

That also led Huawei earlier this month to confirm the long-rumored plan to sell off its Honor smartphone division. That deal will involve selling the division, reportedly valued at around $15 billion, to a consortium of companies.

It will be interesting to see how Apple’s position — a small decline of 0.6% to 40.6 million units, against Xiaomi’s 44.4 million — will shift in the next quarter, on the back of the company launching its new raft of iPhone 12 devices.

“Apple sold 40.5 million units in the third quarter of 2020, a decline of 0.6% as compared to 2019,” said Annette Zimmermann, research vice president at Gartner, in a statement. “The slight decrease was mainly due to Apple’s delayed shipment start of its new 2020 iPhone generation, which in previous years would always start mid/end September. This year, the launch event and shipment start began 4 weeks later than usual.”

Oppo, which is still not available through carriers or retail partners in the US, rounded out the top five sellers with just under 30 million phones sold. The fact that it and Xiaomi do so well despite not really having a phone presence in the US is an interesting testament to what kind of role the US plays in the global smartphone market: huge in terms of perception, but perhaps less so when the chips are down.

“Others,” the category that takes in the long tail of smaller players who make phones, continues to be a huge force, accounting for more sales than any one of the top five. That underscores the fragmentation of the Android-based smartphone industry. All the same, its collective numbers were in decline, a sign that consumers are indeed slowly consolidating around a smaller group of trusted brands.

 

| Vendor  | 3Q20 Units (thousands) | 3Q20 Market Share (%) | 3Q19 Units (thousands) | 3Q19 Market Share (%) | 3Q20-3Q19 Growth (%) |
|---------|------------------------|-----------------------|------------------------|-----------------------|----------------------|
| Samsung | 80,816.0               | 22.0                  | 79,056.7               | 20.3                  | 2.2                  |
| Huawei  | 51,830.9               | 14.1                  | 65,822.0               | 16.9                  | -21.3                |
| Xiaomi  | 44,405.4               | 12.1                  | 32,927.9               | 8.5                   | 34.9                 |
| Apple   | 40,598.4               | 11.1                  | 40,833.0               | 10.5                  | -0.6                 |
| OPPO    | 29,890.4               | 8.2                   | 30,581.4               | 7.9                   | -2.3                 |
| Others  | 119,117.4              | 32.5                  | 139,586.7              | 35.9                  | -14.7                |
| Total   | 366,658.6              | 100.0                 | 388,807.7              | 100.0                 | -5.7                 |

Source: Gartner (November 2020)
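
For anyone who wants to sanity-check the percentages, the share and growth columns follow directly from the unit counts, which are reported in thousands. Here is a minimal sketch in Python using only the figures from the table above:

```python
# Sanity-check the market-share and growth percentages from the unit
# counts in the table above (figures are thousands of units).
units_3q20 = {
    "Samsung": 80_816.0, "Huawei": 51_830.9, "Xiaomi": 44_405.4,
    "Apple": 40_598.4, "OPPO": 29_890.4, "Others": 119_117.4,
}
units_3q19 = {
    "Samsung": 79_056.7, "Huawei": 65_822.0, "Xiaomi": 32_927.9,
    "Apple": 40_833.0, "OPPO": 30_581.4, "Others": 139_586.7,
}

total_3q20 = sum(units_3q20.values())  # ~366,658.6 thousand, i.e. 366.7M handsets
total_3q19 = sum(units_3q19.values())  # ~388,807.7 thousand, i.e. 388.8M handsets

for vendor, units in units_3q20.items():
    share = 100 * units / total_3q20
    growth = 100 * (units - units_3q19[vendor]) / units_3q19[vendor]
    print(f"{vendor:8s} share {share:5.1f}%  YoY growth {growth:+6.1f}%")

# Overall market: roughly the -5.7% decline reported in the story.
print(f"Total YoY: {100 * (total_3q20 - total_3q19) / total_3q19:+.1f}%")
```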

 

 

Watch Virgin Orbit launch a rocket to space from a modified 747 for the first time

Virgin Orbit scored a major success on Sunday, with a test flight that not only achieved its goals of reaching space and orbit, but also delivered the payloads on board for NASA, marking its first commercial mission, too. The launch was a success in every possible regard, which puts Virgin Orbit on track to becoming an active launch provider for small payloads for both commercial and defense customers.

Above, you can watch the actual launch itself – the moment the LauncherOne rocket detaches from ‘Cosmic Girl,’ a modified Boeing 747 airliner that takes off normally from a standard aircraft runway, and then climbs to a cruising altitude to release the rocket, which then ignites its own engines and flies the rest of the way to space. Virgin Orbit’s launch model was designed to reduce the barriers to carrying small payloads to orbit vs. traditional vertical take-off vehicles, and this successful test flight proves the model works.

Virgin Orbit now joins a small but growing group of private launch companies that have actually reached space and made it to orbit. That should be great news for the small satellite launch market, which still has much more demand than supply. Virgin Orbit also offers something very different from current launch providers like SpaceX, which typically serves larger payloads or offers rideshare missions for those with smaller spacecraft. The LauncherOne design potentially means more on-demand, responsive, quick-turnaround launch services for satellite operators.

WhatsApp-Facebook data-sharing transparency under review by EU DPAs after Ireland sends draft decision

A long-running investigation in the European Union focused on the transparency of data-sharing between Facebook and WhatsApp has taken the first major step towards a resolution. Ireland’s Data Protection Commission (DPC) confirmed Saturday it sent a draft decision to fellow EU DPAs towards the back end of last year.

This will trigger a review process of the draft by other DPAs. Majority backing for Facebook’s lead EU data supervisor’s proposed settlement is required under the bloc’s General Data Protection Regulation (GDPR) before a decision can be finalized.

The DPC’s draft WhatsApp decision, which it told us was sent to the other supervisors for review on December 24, is only the second such draft the Irish watchdog has issued to-date in cross-border GDPR cases.

The first case to go through the process was an investigation into a Twitter security breach — which led to the company being issued with a $550,000 fine last month.

The WhatsApp case may look very timely, given the recent backlash over an update to its T&Cs, but it actually dates back to 2018, the year the GDPR began being applied — and relates to WhatsApp Ireland’s compliance with Articles 12-14 of the GDPR (which set out how information must be provided to data subjects whose information is being processed, so that they are able to exercise their rights).

In a statement, the DPC said:

“As you are aware, the DPC has been conducting an investigation into WhatsApp Ireland’s compliance with Articles 12-14 of the GDPR in terms of transparency, including in relation to transparency around what information is shared with Facebook, since 2018. The DPC has provisionally concluded this investigation and we sent a draft decision to our fellow EU Data Protection Authorities on December 24, 2020 (in accordance with Article 60 of the GDPR in order to commence the co-decision-making process) and we are waiting to receive their comments on this draft decision.

“When the process is completed and a final decision issues, it will make clear the standard of transparency to which WhatsApp is expected to adhere as articulated by EU Data Protection Authorities,” it added.

Ireland has additional ongoing GDPR investigations into other aspects of the tech giant’s business, including one related to complaints filed back in May 2018 by the EU privacy rights not-for-profit noyb (over so-called ‘forced consent’). In May 2020 the DPC said that separate investigation was at the decision-making phase — but so far it has not confirmed sending a draft decision for review.

It’s also notable that the time between the DPC’s Twitter draft and the final decision being issued — after gaining majority backing from other EU DPAs — was almost seven months.

The Twitter case was relatively straightforward (a data breach) compared with the more complex business of assessing ‘transparency’, so a final decision on WhatsApp seems unlikely to come any more swiftly. There are clearly substantial differences of opinion between DPAs on how the GDPR should be enforced across the bloc. (In the Twitter case, for example, German DPAs suggested a fine of up to $22M vs Ireland’s initial proposal of a maximum of $300k.) There is some hope, though, that GDPR enforcement of cross-border cases will speed up as DPAs gain experience of the various mechanisms and processes involved in making these co-decisions (even if major ideological gaps remain).

Returning to WhatsApp, the messaging platform has had plenty of problems with transparency in recent weeks — garnering lots of unwelcome attention and concern over the privacy implications of a confusing mandatory update to its T&Cs which has contributed to a major migration of users to alternative chat platforms, such as Signal and Telegram.

The backlash led WhatsApp to announce last week that it was delaying enforcement of the new terms by three months. Last week Italy’s data protection agency also issued a warning over a lack of clarity in the T&Cs — saying it could intervene using an emergency process allowed for by EU law (which would be in addition to the ongoing DPC procedure).

 

On the WhatsApp T&Cs controversy, the DPC’s deputy commissioner Graham Doyle told us the regulator had received “numerous queries” from confused and concerned stakeholders which he said led it to re-engage with the company. The regulator previously obtained a commitment from WhatsApp that there is “no change to data-sharing practices either in the European Region or the rest of the world”. But it subsequently confirmed it would delay enforcement of the new terms.

“The updates made by WhatsApp last week are about providing clearer, more detailed information to users on how and why they use data. WhatsApp have confirmed to us that there is no change to data-sharing practices either in the European Region or the rest of the world arising from these updates. However, the DPC has received numerous queries from stakeholders who are confused and concerned about these updates,” Doyle said.

“We engaged with WhatsApp on the matter and they confirmed to us that they will delay the date by which people will be asked to review and accept the terms from February 8th to May 15th. In the meantime, WhatsApp will launch information campaigns to provide further clarity about how privacy and security works on the platform. We will continue to engage with WhatsApp on these updates.”

While there’s no doubt that Europe’s record of enforcing its much-vaunted data protection laws against tech giants remains a major weak point of the regulation, there are signs that increased user awareness of rights and, more broadly, concern for privacy are causing a shift in the balance of power in favor of users.

Proper privacy enforcement is still sorely lacking, but Facebook being forced to put a T&Cs update on ice for three months — as its business is subject to ongoing regulatory scrutiny — suggests the days of platform giants being able to move fast and break things are firmly on the wane.

Similarly, for example, Facebook recently had to delay the launch of a dating feature in Europe while it consulted with the DPC. It also remains limited in the data it can share between WhatsApp and Facebook because of the existence of the GDPR — so still can’t share data for ad targeting and product enhancement purposes, even under the new terms.

Europe, meanwhile, is drawing up ex ante rules for platform giants that will place further obligations on how they can operate — with the aim of counteracting abusive or unfair business behaviors and bolstering competition in digital markets.

 

What the complex math of fire modeling tells us about the future of California’s forests

At the height of California’s worst wildfire season on record, Geoff Marshall looked down at his computer and realized that an enormous blaze was about to take firefighters by surprise.

Marshall runs the fire prediction team at the California Department of Forestry and Fire Protection (known as Cal Fire), headquartered in Sacramento, which gives him an increasingly difficult job: anticipating the behavior of wildfires that become less predictable every year.

The problem was obvious from where Marshall sat: California’s forests were caught between a management regime devoted to growing thick stands of trees—and eradicating the low-intensity fire that had once cleared them—and a rapidly warming, increasingly unstable climate.

As a result, more and more fires were crossing a poorly understood threshold from typical wildfires—part of a normal burn cycle for a landscape like California’s—to monstrous, highly destructive blazes. Sometimes called “megafires” (a scientifically meaningless term that loosely refers to fires that burn more than 100,000 acres), these massive blazes are occurring more often around the world, blasting across huge swaths of California, Chile, Australia, the Amazon, and the Mediterranean region.

At that particular moment in California last September, several unprecedented fires were burning simultaneously. Together, they would double the record-setting acreage of the 2018 wildfire season in less than a month. But just as concerning to Marshall as their size was that the biggest fires often behaved in unexpected ways, making it harder to forecast their movements.

To face this new era, Marshall had a new tool at his disposal: Wildfire Analyst, a real-time fire prediction and modeling program that Cal Fire first licensed from a California-based firm called TechnoSylva in 2019.

The work of predicting how fires spread had long been a matter of hand-drawn ellipses and models so slow that analysts set them running before bed and hoped they were done in the morning. Wildfire Analyst, on the other hand, funnels data from dozens of distinct feeds: weather forecasts, satellite images, and measures of moisture in a given area. Then it projects all that on an elegant graphic overlay of fires burning across California.

A modeling tool called Wildfire Analyst shows how a blaze in California might spread over a period of eight hours. The red objects are buildings.

Every night, while fire crews sleep, Wildfire Analyst seeds those digital forests with millions of test burns, pre-calculating their spread so that human analysts like Marshall can do simulations in a matter of seconds, creating “runs” they can port to Google Maps to show their superiors where the biggest risks are. But this particular risk, Marshall suddenly realized, had slipped past the program.
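
As a rough illustration of that precompute-then-query pattern (this is a toy sketch, not TechnoSylva's implementation; the grid size, spread function, and cache structure here are invented for the example), overnight batch runs can be banked so that a daytime query is just a lookup:

```python
# Illustrative only: precompute overnight fire-spread "runs" for every
# candidate ignition cell, then answer analyst queries from the cache.
# The grid, spread model, and keys are hypothetical, not Wildfire Analyst's.
from itertools import product

GRID = 50  # a toy 50x50 landscape

def simulate_spread(ignition, hours=24):
    """Stand-in for an expensive spread simulation: returns the set of
    cells reached from `ignition` within `hours` (here, a simple radius)."""
    r, c = ignition
    reach = hours // 4  # toy spread rate
    return {
        (i, j)
        for i, j in product(range(GRID), repeat=2)
        if abs(i - r) + abs(j - c) <= reach
    }

# "Overnight" batch: run the simulation once per possible ignition point.
precomputed = {
    cell: simulate_spread(cell)
    for cell in product(range(GRID), repeat=2)
}

# "Daytime" query: an analyst drops a virtual fire and gets the footprint
# back instantly, because the heavy work was done in the batch pass.
footprint = precomputed[(10, 32)]
print(f"Projected 24h footprint: {len(footprint)} cells")
```

The trade-off is the classic one: the slow, heavy simulation is paid for once in the overnight pass, so interactive queries stay fast even when the underlying model is expensive.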

The display now showed a cluster of bright pink and green polygons creeping over the east flank of the Sierras, near the town of Big Creek. The polygons, one of the many feeds ported directly into Wildfire Analyst, were from FireGuard, a real-time feed from the US Department of Defense that estimates all wildfires’ current locations. They were spreading, far faster than they should have been, up the Big Creek drainage.

In its calculations, Wildfire Analyst had made a number of assumptions. It “saw,” on the other side of Big Creek, a dense stand of heavy timber. Such stands were traditionally thought to impede the rapid spread of fire, which models attribute largely to fine fuels like pine straw.

But Marshall suddenly realized, as the algorithms driving Wildfire Analyst had not, that the drainage held all the ingredients for a perfect firestorm. That “heavy timber,” he knew, was in fact a huge patch of dead trees weakened by beetles, killed by drought, and baked by two weeks of 100 °F heat into picture-perfect firewood. And the Big Creek valley would focus the wind onto the fire like a bellows. With no weather station at the mouth of the creek, the program couldn’t see all that.

Marshall went back to his computer and re-ran some numbers with the new variables factored in. He watched on his screen as the fire spread at frightening speed across the Sierra. “I went to the operation trailer and told my uppers: I think it’s going to jump the San Joaquin River,” he recalls. “And if it does, it’s going to run big.”

This was, at that moment, a far-fetched claim—no California fire had ever made a nine-mile run in heavy timber, no matter how dry. But in this case, the trees’ combustion created powerful plumes of superheated air that drove the fire on. It jumped the river and raced through the timber to a reservoir known as Mammoth Pool, where a last-minute airlift saved 200 campers from fiery death.

The Creek Fire was a case study in the challenge facing today’s fire analysts, who are trying to predict the movements of fires that are far more severe than those seen just a decade ago. Since we understand so little about how fire works, they’re using mathematical tools built on outdated assumptions, as well as technological platforms that fail to capture the uncertainty in their work. Programs like Wildfire Analyst, while useful, give an impression of precision and accuracy that can be misleading.

Getting ahead of the most destructive fires will require not simply new computational tools but a sweeping change in how forests are managed. Along with climate change, generations of land and environmental management decisions—intended to preserve the forests that many Californians feel a duty to protect—have inadvertently created this new age of hyper-destructive fire.

But if these massive fires continue, California could see the forests of the Sierra erased as thoroughly as those of Australia’s Blue Mountains. Avoiding this nightmare scenario will require a paradigm shift. Residents, fire commanders, and political leaders must switch from a mindset of preventing or controlling wildfire to learning to live with it. That will mean embracing fire management techniques that encourage more frequent burns—and ultimately allowing fires to forever transform the landscapes that they love.

Shaky assumptions

In late October, Marshall shared his screen and took me on a tour in Wildfire Analyst. We watched the fluorescent FireGuard polygons of a new flame “finger” break out from the smoldering August Complex. With a few clicks, he laid four tiny virtual fires along the real fire’s edge, on the far side of the fire line that had blocked its progress. A few seconds later, fire blossomed across the simulated landscape. Under current conditions, the model estimated, a fire that broke out at those points could “blow out” to 8,000 acres—a nearly three-mile run—within 24 hours.

For Marshall and the rest of Cal Fire’s analysts, Wildfire Analyst provides a standardized platform on which to share data from fires they’re watching, projections about the runs they might make, and hacks to make a simulated fire approximate the behavior of a real one. With that information, they try to anticipate where a fire is going to go next, which in theory can drive decisions about where to send crews or which regions to evacuate.      

Like any model, Wildfire Analyst is only as good as the data that feeds it—and that data is only as good as our scientific understanding of the phenomenon in question. When it comes to the mechanics of wildland fire, that understanding is “medieval,” says Mark Finney, director of the US Forest Service’s Missoula Fire Lab.

Our current approach to fire modeling, which powers every real-time analytic platform including TechnoSylva’s Wildfire Analyst, is built on a particular set of equations that a researcher named Richard Rothermel derived at the Fire Lab nearly half a century ago to calculate how fast fire would move, with given wind conditions, through given fuels.
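
For reference, the quantity these tools ultimately compute is a surface rate of spread. The commonly cited form of Rothermel's equation (symbols follow the usual fire-behavior literature; specific implementations add further sub-models for each term) is:

```latex
% Rothermel (1972) surface fire rate of spread, in its commonly cited form.
% R: rate of spread; I_R: reaction intensity; \xi: propagating flux ratio;
% \phi_w, \phi_s: wind and slope factors; \rho_b: fuel-bed bulk density;
% \varepsilon: effective heating number; Q_{ig}: heat of preignition.
R \;=\; \frac{I_R \, \xi \, (1 + \phi_w + \phi_s)}{\rho_b \, \varepsilon \, Q_{ig}}
```

The wind and slope factors are where conditions like the Big Creek drainage enter the calculation; the remaining terms are properties of the fuel bed itself.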

Rothermel’s key assumption—perhaps a necessary one, given the computational tools available at the time, but one we now know to be false—was that fires spread only through radiation as the front of the flame catches fine fuels (pine straw, leaf litter, twigs) on the ground.

That spread, Rothermel found, drove outward in a thin, expanding edge along an ellipse. To figure out how a fire would grow, firefighters in the field used “nomograms”: premade graphs that assigned specific values for wind speed, slope, and fuel conditions to reveal an average speed of spread.

[Image: fire behavior chart. Source: US Department of Agriculture]

In his early days in the field, Finney says, “you would spread your folder of nomograms on the hood of your pickup and make your projections in thick pencil,” charting on a topo map where the fire would be in an hour, or two, or three. Rothermel’s equations allowed analysts to model fire like a game of Go, across homogeneous cells of a two-dimensional landscape.
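
A toy version of that cell-by-cell approach might look like the sketch below. It is purely illustrative: the fuel types, spread rates, and landscape are made up, and real tools plug Rothermel-style rates with wind and slope adjustments into each cell rather than the constants used here.

```python
# Toy cell-based fire-growth model in the spirit of the nomogram-era
# approach: each cell gets a spread rate from its fuel type, and fire
# arrival times propagate outward from the ignition point.
# Fuel codes and rates are invented for illustration only.
import heapq

FUEL_RATE = {"grass": 3.0, "brush": 1.5, "timber": 0.5}  # cells per hour

# A tiny 5x6 landscape of fuel types.
landscape = [
    ["grass", "grass", "brush", "timber", "timber", "timber"],
    ["grass", "brush", "brush", "timber", "timber", "timber"],
    ["grass", "grass", "brush", "brush",  "timber", "timber"],
    ["grass", "grass", "grass", "brush",  "brush",  "timber"],
    ["grass", "grass", "grass", "grass",  "brush",  "brush"],
]

def arrival_times(ignition):
    """Earliest fire-arrival time (hours) for every cell, Dijkstra-style."""
    rows, cols = len(landscape), len(landscape[0])
    times = {ignition: 0.0}
    frontier = [(0.0, ignition)]
    while frontier:
        t, (r, c) = heapq.heappop(frontier)
        if t > times.get((r, c), float("inf")):
            continue  # stale entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                # Time to cross into the neighbour depends on its fuel.
                nt = t + 1.0 / FUEL_RATE[landscape[nr][nc]]
                if nt < times.get((nr, nc), float("inf")):
                    times[(nr, nc)] = nt
                    heapq.heappush(frontier, (nt, (nr, nc)))
    return times

times = arrival_times((2, 0))
print(f"Fire reaches the far timber corner after ~{times[(0, 5)]:.1f} hours")
```

From an ignition point on the grassy western edge, the simulated fire races through grass and brush and slows sharply once it hits the timber cells, which is precisely the kind of fuel assumption that misled Wildfire Analyst above Big Creek.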

This is where things have stood for decades. Wildfire Analyst and similar tools represent a repackaging of this approach more than a fundamental improvement on it. (TechnoSylva did not respond to multiple interview requests.) What’s needed now is less a technique for real-time prediction than a fundamental reappraisal of how fire works—and a concerted effort to restore California’s landscapes to something approaching a natural equilibrium.

Complications

The problem for products like Wildfire Analyst, and for analysts like Marshall, is easy to state and hard to solve. A fire is not a linear system, proceeding from cause to effect. It is a “coupled” system in which cause and effect are tangled up. Even on the scale of a candle, ignition kicks off a self-sustaining reaction that deforms the environment around it, changing the entire system further—fuel decaying into flame, sucking in more wind, which stokes the fire further and breaks down more fuel.

Such systems are notoriously sensitive to even small changes, which makes them fiendishly difficult to model. A small variance in the starting data can lead, as with the Creek Fire calculations, to an answer that is exponentially wrong. In terms of this kind of nonlinear complexity, fire is a lot like weather—but the computational fluid dynamic models that are used to build forecasts for, say, the National Weather Service require supercomputers. The models that try to capture the complexity of a wildland blaze are typically hundreds of times simpler.

Pioneering scientists like Rothermel dealt with this intractable problem by ignoring it. Instead, they searched for factors, such as wind speed and slope, that could help them predict a fire’s next move in real time.

Looking back, Finney says, it’s a miracle that Rothermel’s equations work for wildfires at all. There’s the sheer difference in scale—Rothermel derived his equations from tiny, controlled fires set in 18-inch fuel beds. But there are also more fundamental errors. Most glaring was Rothermel’s assumption that fire spreads only by radiation, instead of through the convection currents that you see when a campfire flickers.

This assumption isn’t true, and yet for some fires, even huge ones like 2017’s Northwest Oklahoma Complex, which burned more than 780,000 acres, Rothermel’s spread equations still seem to work. But at certain scales, and under certain conditions, fire creates a new kind of system that defies any such attempt to describe it.

The Creek Fire in California, for example, didn’t just go big. It created a plume of hot air that pooled under the stratosphere, like steam against the lid of a pressure cooker. Then it popped through to 50,000 feet, sucking in air from below that drove the flames on, creating a storm system—complete with lightning and fire tornadoes—where no storm should have been.

Other huge, destructive fires appear to ricochet off the weather, or each other, in chaotic ways. Fires usually quiet down at night, but in 2020, two of the biggest runs in California broke out at night. Since heat rises, fires usually burn uphill, but in the Bear Fire, two enormous flame heads raced 22 miles downhill, a line of tornadic plumes spinning between them.

Finney says we don’t know if the intensity caused the strange behaviors or vice versa, or if both rose from some deeper dynamic. One measure of our ignorance, in his view, is that we can’t even say when the models apply: “It would be really nice to know when our current models will work and when they won’t,” he says.

Illusions

To Finney and other fire scientists, the danger with products like Wildfire Analyst is not necessarily that they’re inaccurate. All models are. It’s that they hide solutions inside a black box, and—far more important—focus on the wrong problem.

Unlike Wildfire Analyst, the older generation of tools required analysts to know precisely what hedges and assumptions they were making. The new tools leave all that to the computer. Such products play into the field’s obsession with modeling, scientist after scientist told me, despite the fact that no model can predict what fire will do.

“You can always calibrate the system afterward to match your observations,” says Brandon Collins, a wildfire research scientist at UC Berkeley. “But can you predict it beforehand?”

Doing so is a question of science rather than technology: it would require primary research to develop and test a new theory of flame. But such work is expensive, and most wildfire research money is awarded to solve specific technical problems. The Missoula Fire Lab survives on the remnants of a Great Society–era budget; its sister facility, the Macon Fire Lab in Georgia, was shut down in the 1990s.

Collins and Finney are doing what they can with the funds available to them. They’re both part of a public-private fire science working group called Pyregence that’s converting a grain silo into a furnace to see how large logs, like the fallen timber on Big Creek, spread fire.

Meanwhile, Finney’s team at the Missoula Fire Lab is working to develop a data set that answers fundamental questions about fire—a potential basis for new models. They aim to describe how wind on smoldering logs drives new flame fronts; quantify the likelihood that embers cast by a flame will “spot,” or ignite, new fires; and study the role that pine forests seem to play in encouraging their own burning.

The point of those models is less to see where a particular fire will go once it’s broken out, and more to serve as a planning tool to help Californians better manage the fire-prone, fire-suppressed landscape they live in.

Like ecosystems in Chile, Portugal, Greece, and Australia—all regions that have recently seen more megafires—California’s conifer forests evolved over thousands of years in which natural and human-caused fires periodically cleared out excess fuel and created the space and nutrients for new growth.

Before the 19th century, Native Americans are thought to have deliberately burned about as much of California every year as burned there in 2020. Similar practices survived until as recently as the 1970s—ranchers in the Sierra foothills would burn brush to encourage new growth for their animals to eat. Loggers pulled tons of timber from forests groomed to produce huge volumes of it, burning the debris in place.

[Image: controlled burn technique. Photo: Josh Berendes / Unsplash]

Then, as ranchers went bust and sold their land to developers, pastureland became residential communities. Clean-air regulations discouraged the remaining ranchers from burning. And decades of conflict between environmental organizations and logging companies ended, in the 1990s, with loggers deserting the forests they had once clear-cut.

In the Sierra—as in these other regions now prone to huge, destructive fires—a heavily altered landscape that was long ago torn from any natural equilibrium was largely abandoned. Millions of acres of pine grew in, packed and thirsty. Eventually many were killed by drought and bark beetles, accumulating into a preponderance of fuel. Fires that could have cleared the land and reset the forest were extinguished by the US Forest Service and Cal Fire, whose primary objective had become wholesale fire suppression.

Breaking free of this legacy won’t be easy. The future Finney is working toward is one where people can compare various models and decide which will work best for a given situation. He and his team hope better data will lead to better planning models that, he says, “could give us the confidence to let some fires burn and do our work for us.”

Still, he says, focusing too much on models risks missing a more important question: “What if we are ignoring the basic aspect of wildfire—that we need more fire, proper fire, so that we don’t let wildfire surprise and destroy us?”

Living with wildfires

In 2014, the King Fire raged across the California Sierra, leaving a burn scar where trees have still not regrown. Instead, says Forest Service silviculturist Dana Walsh, they’ve been replaced by thick mats of chaparral, a fire-prone shrub that has squeezed out the forest’s return.

“People ask what happens if we just let nature take its course after a big fire,” Walsh says. “You get 30,000 acres of chaparral.”

This is the danger that landscapes from the Pyrenees to the California Sierra to Australia’s Blue Mountains now face, says Marc Castellnou, a Catalan fire scientist who is a consultant to TechnoSylva. Over the last two decades, he’s studied the rise of megafires around the world, watching as they smashed records for the length and speed of their runs.

For too long, he says, California’s fire and forest policy has resisted an inevitable change in the landscape. The state doesn’t need flawless predictive tools to see where its forests are headed, he says: “The fuel is building up, the energy is building up, the atmosphere is getting hotter.” The landscape will rebalance itself.

California’s choice—as in Catalonia, where Castellnou is chief scientist for the autonomous province’s 4,000-person fire corps—is to either move with that change and have some chance of influencing it, or be bowled over by megafires.

The goal is less to regenerate native forests in these areas—which Castellnou believes have been made obsolete by climate change—than to work with the landscape to develop a new type of forest where wildfires are less likely to blow out into massive blazes.

In large measure, his approach lies in returning to old land management techniques. Rural people in his region once controlled destructive fires by starting or allowing frequent, low-intensity fires, and using livestock to eat down brush in the interim. They planted stands of fire-resistant hardwood species that stood like sentinels, blocking waves of flame.

For Castellnou, though, this also means making politically difficult choices. In July 2019, just outside of Tivissa, Spain, I watched him explain to a group of rural Catalan mayors and olive farmers why he had let the area around their towns burn.

He’d worried that if crews slowed the Catalan fire, they might cause it to form a pyrocumulonimbus — a violent cloud of fire, thunder, and wind like the one that formed over the Creek Fire. Such a phenomenon could have spurred the fire on until it took the towns anyway. Now, he said, gesturing to the burn scar, the towns had a fire defense in place of a liability. It was another tile in a mosaic landscape of pasture, forest, and old fire scars that could interrupt wildfire.

As tough as planned burns are for many to swallow, letting wildfires burn through towns—even evacuated ones—is an even tougher sell. And replacing pristine Sierra Nevada forests with a landscape able to survive both drought and the most destructive fires—say, open stands of ponderosa pine punctuated by fields of grass, picked over by goats or cattle—might feel like a loss.

Doing any of this well means adopting a change in philosophy as big as any change in predictive tech or science—one that would welcome fire back as a natural part of the environment. “We are not trying to save the landscape,” Castellnou says. “We are trying to help create the next landscape. We are not here to fight flames. We are here to make sure we have a forest tomorrow.”
