Wildfires Are Not Just a West Coast Problem

July 27, 2021

The Bootleg Fire (New York Times, July 23, A11) is a public health concern for much of the country. While the devastation from burning property is local, the emissions can travel thousands of miles. These emissions primarily comprise particles of solid carbon (soot) coated with organic molecules. Particles under 2.5 microns in diameter (a micron is one millionth of a meter; a human hair is about 70 microns in diameter) are designated PM2.5 and are more toxic than the larger PM10 because they can penetrate deep into the lungs. Regulations worldwide accordingly focus on PM2.5.

Smaller particles travel farther because they are lighter. The total concentration of particles distant from the fire will be lower than near the source, but those particles will largely be PM2.5, also known as fine particles. Even smaller particles, known as ultrafine particles or nanoparticles, are a subset of these; scientists refer to them as PM0.1, indicating particles smaller than 0.1 microns. Strong evidence points to these particles being even more toxic than the rest of the PM2.5, for two reasons. One is that their size allows them to penetrate individual cells in the organs, carrying their toxic organic payload. The other is that these tiny particles are much more likely than larger ones to pick up toxins from the atmosphere, making them more harmful. Finally, ultrafine particles travel even farther from the site of the fires than the fine particles.
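
The size cutoffs above nest inside one another: every ultrafine particle is also a fine particle, and every fine particle also counts toward PM10. A minimal sketch of that classification (the function name and labels are my own, purely for illustration):

```python
def classify_particle(diameter_um: float) -> list[str]:
    """Return the regulatory size classes a particle falls into.

    The classes are nested: every PM0.1 particle is also PM2.5,
    and every PM2.5 particle is also PM10 (cutoffs in microns).
    """
    classes = []
    if diameter_um <= 10:
        classes.append("PM10")
    if diameter_um <= 2.5:
        classes.append("PM2.5 (fine)")
    if diameter_um <= 0.1:
        classes.append("PM0.1 (ultrafine)")
    return classes

# A human hair (~70 microns) falls into no PM class; wildfire soot often does.
print(classify_particle(70))    # []
print(classify_particle(1.0))   # ['PM10', 'PM2.5 (fine)']
print(classify_particle(0.05))  # ['PM10', 'PM2.5 (fine)', 'PM0.1 (ultrafine)']
```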

For any given size of particle from a wildfire, toxicity of the particle is related to the type of wood and the conditions under which the burn occurs. A recent study concludes that combustion of eucalyptus, ubiquitous in Australia and common in northern California, results in particulate matter that is more toxic than from pine, peat, and red oak. The measures of toxicity in the study were markers for inflammation in the lungs of mice (1).

Fierce, hot burns produce particles with lower toxicity than smoldering burns. The natural reaction from a property protection standpoint is to douse the flames. But a doused fire left to smolder shifts the detriment to public health. A smolder is relatively oxygen starved and produces more unburnt organic molecules (mostly from the outermost layers of the limb, which are rich in aromatic compounds). These volatile organics then attach to the carbon particles emitted during combustion as soot, making them more toxic. The lung toxicity study cited above noted a striking increase in toxicity in the smoldering condition (1).

The Bootleg Fire is the third largest fire in Oregon history, with a burn area close to 400,000 acres. For context, statisticians classify large fires as those burning more than 1000 acres. Scientists use the metric of area burned rather than numbers of fires, and privation is certainly proportionate to the area of devastation.

Courtesy US EPA 2019 (Wildland Fire Research Framework 2019-2022)

EPA data show that over the three decades preceding 2018, the total numbers of fires have remained essentially constant. Not surprising, since most are caused by human behavior, which, absent intervention, tends to remain constant. Strikingly, however, over the same period, the area burned has nearly doubled (3). This statistic does not include the devastation of the 2020 fire season. Five of the ten largest fires ever in California were in that year. The largest was the August Complex fire, which burned just over a million acres.

The Bootleg Fire is believed to have been triggered by lightning strikes on dry underbrush. The mammoth August Complex fire in 2020 was believed to have been caused by 38 lightning strikes. Lightning has not been among the top four causes of wildfires, judged by area burned (2). This is because lightning is ordinarily associated with a thunderstorm, and even 0.1 inches of rain is sufficient to materially reduce the ignition of underbrush. However, increasingly high ambient temperatures can produce “dry lightning,” in which the associated rain evaporates before hitting the ground.

Until now the dogma has been that the causes of most wildfires are anthropogenic. In California, judged by area burned, the four principal causes of ignition were arson, campfires, equipment (such as chain saws), and falling power lines (2). All being related to human endeavor, amelioration was possible. But if the world is moving toward being hotter, the incidence of dry lightning could go up. Death Valley, CA recorded a high temperature of 130 F on August 16, 2020, reported as the highest reliably measured temperature on the planet in over a century. August 16 was also the day of the first dry lightning strike associated with the August Complex fire. I am not suggesting causality, but the facts urgently suggest study*.

The literature does not offer promise for direct intervention in dry lightning caused large area devastation, other than controlled burns to clear out the under-canopy vegetation, which is already practiced to a degree. This stark reality applies not just to the fire prone western US, but also to the rest of the country.

Vikram Rao

* “Lightning’s striking again” in Lightnin’ Strikes, by Lou Christie (1965), written by Lou Christie and Twyla Herbert

References

1. Kim YH, Warren SH, Krantz QT, et al. Mutagenicity and lung toxicity of smoldering vs. flaming emissions from various biomass fuels: implications for health effects from wildland fires. Environ Health Perspect. 2018;126(1).

2. Syphard, A. D., & Keeley, J. E. (2015). Location, timing and extent of wildfire vary by cause of ignition. International Journal of Wildland Fire, 24(1), 37. https://doi.org/10.1071/WF14024

3. US EPA 2019 (Wildland Fire Research Framework 2019-2022)


WILDFIRES: THE WORST IS YET TO COME

September 21, 2020

California is ablaze. So are Oregon and Washington. The tally to date is 5 million acres burned, about halfway through the fire season, and well on its way to record territory. Putting that in perspective, the east coast of Australia, devastated similarly earlier this year in the Southern Hemisphere summer, closed the season with 46 million acres burned.

The statistic of greatest concern is that the intensity and scale of the fires are getting worse. Over the last thirty years, the number of fires annually has shown no discernible trend; it certainly has not gone up. But the acreage burned has, decisively. Both patterns are evident in the figure below. Five of the ten largest fires ever in California are currently active. The largest of these, the August Complex, is already at 839,000 acres and still going. The next largest ever was the 459,000-acre Mendocino Complex in 2018. Labeling any of this chance, or poor forestry management, evokes imagery of the proverbial ostrich and the placement of its head.

Courtesy US EPA 2019 (Wildland Fire Research Framework 2019-2022)

The average area burned (the figure reports hectares; a hectare is roughly 2.47 acres) has nearly doubled over this three-decade period. Nine of the ten largest fires have occurred since the year 2000. Note that this does not include the ongoing five, which would certainly be in that group, making it 14 of the 15 since 2000. Although a regression line would have uncertainty due to big annual swings, an eyeball estimate indicates a strong upward slope. If this is a predictor of the future, that future is indeed bleak and warrants a study of causes.
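
The “eyeball estimate” of an upward slope can be made concrete with a simple least-squares fit. The acreage figures below are illustrative stand-ins, not the EPA series from the figure:

```python
# Hedged sketch: checking an "eyeball" trend with an ordinary
# least-squares slope. The data points are hypothetical, chosen only
# to mimic a rough doubling over three decades.
def slope(years, values):
    """Least-squares slope of values against years."""
    n = len(years)
    my, mv = sum(years) / n, sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    den = sum((y - my) ** 2 for y in years)
    return num / den

years = [1988, 1998, 2008, 2018]
acres_millions = [3.5, 4.5, 6.0, 7.5]  # hypothetical, not the EPA data
print(f"{slope(years, acres_millions):.3f} million acres/year")  # 0.135 million acres/year
```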

The recent EPA report, from which the figure was reproduced, ascribes the pattern of increased fire acreage to higher temperatures, drought, early snow melts, and historically high fuel loading (the fire prone vegetation, including underbrush). We will examine these separately, although they may not be disconnected. But first, a comment on the pattern of numbers of fires being essentially flat. Ignition events determine numbers of fires. In California, the principal ones are arson, campfires, power lines, and equipment. The equipment category comprises items such as power saws, mowers, and other operated machinery. Human behavior, absent intervention, can be expected to be constant, so the flat profile in numbers of fires is to be expected. Interestingly, the incidences are seasonal; counter-intuitively, even arson follows the pattern.

Climate change is implicated in many of the causes of increasing severity over the years. While the term has many interpretations, one generally accepted aspect is temperature rise in the atmosphere and in the oceans. The debate is not whether this happens, but how fast it does. Also generally accepted (to the extent any climate change causality is generally accepted) is that oceanic temperature rise causes increased severity in the El Niño phenomenon in the Pacific Ocean, which is responsible for catastrophic droughts. These are accompanied by drenching rains in other parts of the world in the same year. Both disturbances are extreme deviations from the norm, with resultant impact on vegetation and the way of life.

Atmospheric temperature rise can also be expected to change the proportion of rain and snow in precipitation. Lighter snowfall can be a result, as can earlier snow melts. Both are on the EPA list noted above.

California, being generally arid, gets most of its water supply from melting snow. While less snow in a given year is certainly a drought indicator, the phenomenon most studied is the timing of the snow melt. Data from four decades commencing in 1973 conclusively demonstrated that burn acreage was strongly correlated with the earliness of the snow melt (Westerling 2016). Decadal comparisons show that the fire seasons in 2003-2012 averaged 40 more days than the seasons in 1973-1982. Fires in the later decade were more severe. Large fires, defined as covering greater than 400 hectares, burned for 6 days on average in the early decade and for more than 50 days in 2003-2012.

Power lines deserve special mention. Falling power lines were blamed for several fires in 2017 and 2018; the utility has accepted blame and is in bankruptcy. Trees falling on power lines snapped the poles. The tree roots, finding uncertain purchase in drought conditions, were no match for the Santa Ana winds or other storm-sourced shoves. Those same drought conditions left the underbrush dry. Power lines are usually not insulated. Sparking wires fell on dry underbrush, and the rest is incendiary history. A poster child for distributed power.

The wildfire future is indeed bleak. Slowing climate change is necessary, but it may not be sufficient in the shorter term. We need a reincarnation of Smokey Bear to change human behavior and minimize ignition events.

Westerling, A. L. (2016) ‘Increasing western US forest wildfire activity: sensitivity to changes in the timing of spring’, Philosophical Transactions of the Royal Society B: Biological Sciences, 371: 20150178. http://dx.doi.org/10.1098/rstb.2015.0178

Vikram Rao September 21, 2020

The Devil and the Deep Blue Sea

September 4, 2022

California’s recent decarbonization legislation includes extending the life of the Diablo Canyon nuclear reactors in the face of environmentalist opposition. The concern has been for marine creatures potentially killed during cooling water uptake from the ocean. The dilemma posed in the title, akin to being between a rock and a hard place, applies to the Diablo Canyon decision. A recent paper from Stanford and MIT details the issues and lands in the extended-life camp, with some twists discussed later here.

Back to the dilemma. No form of energy, clean or otherwise, comes without baggage, so it comes down to compromises. Wind has avian mortality and visual pollution. Solar may carry the least baggage, but recent events pose a unique twist. The price of natural gas going up 5- and 6-fold in Europe due to climate change and Russian aggression shows that reliance on a global supply chain can be fraught. For context, over 60% of solar panel components originate in China. Sabers are rattling in the Taiwan Strait. There is no telling what happens to solar panel costs if things escalate.

More dilemma: opponents of the decision want to simply build more solar and wind capacity. Even Senator Dianne Feinstein weighed in with the opinion that absent the Diablo decision there would be more natural gas usage. Exactly right, especially if the course of action proposed by opponents, more solar and wind, is followed. This is because solar and wind have low capacity utilization due to diurnal and seasonal gaps in output. At this time those gaps are filled dominantly by natural gas power generation. In other words, more solar and wind means more natural gas burned until carbon-free gap fillers, such as advanced geothermal systems and small modular (nuclear) reactors, hit their stride, and that will take a decade. In the meantime, Diablo Canyon’s 24/7 output notwithstanding, natural gas use will grow in step with additions of solar and wind capacity. A mitigative measure on the associated CO2 production would be carbon capture and storage attached to the natural gas power plants; best-in-class technology achieves this for USD 40 per tonne of CO2. One of the new California bills encourages this direction. It is opposed on the grounds that it encourages more fossil fuel production. True. But, as noted above, until carbon-free gap fillers are at scale, natural gas is the only practical alternative. Rock and a hard place.
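
The capacity-utilization argument above can be put in rough numbers. A sketch, with the capacity factor and demand figure chosen purely for illustration, not taken from the sources cited:

```python
# Hedged sketch of the capacity-factor argument: renewables with low
# capacity utilization leave an energy gap that something dispatchable
# (today, mostly natural gas) must fill. All numbers are illustrative.
def gas_fill_needed(demand_mwh, renewable_capacity_mw, capacity_factor, hours=8760):
    """Annual energy (MWh) left for gap fillers after renewables serve demand."""
    renewable_mwh = renewable_capacity_mw * capacity_factor * hours
    return max(demand_mwh - renewable_mwh, 0.0)

# 1000 MW of solar at an assumed ~25% capacity factor, serving a
# round-the-clock 1000 MW average load:
demand = 1000 * 8760  # MWh per year
gap = gas_fill_needed(demand, renewable_capacity_mw=1000, capacity_factor=0.25)
print(f"{gap / demand:.0%} of annual energy still needs a gap filler")  # 75%
```

The point of the sketch: adding nameplate renewable capacity equal to the load still leaves most of the annual energy to be supplied by something else until storage or other carbon-free fillers scale up.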

The two units at Diablo Canyon account for 9% of the electricity and 16% of the carbon-free electricity in the fifth largest economy in the world. Removing them would make already tough zero emission goals almost unattainable, certainly the 2030 ones. The state is currently in an epic heat wave causing power demand spikes. It is also the state most vulnerable to climate change driven forest fires. It can ill afford to take out any carbon-free capacity, especially if the concerns expressed about Diablo Canyon continuance can be met by other means.

Diablo Canyon nuclear facility at Avila Beach, CA. Source: NY Times. Credit: Michael Mariant/Associated Press

Enter the Stanford/MIT paper. It offers explicit engineered solutions to minimize marine life mortality during water procurement. It also has two other interesting suggestions to maximize the environmental value of Diablo Canyon. One is to use part of the output to desalinate seawater. The measures taken to protect marine life would apply here as well during water acquisition. Since reverse osmosis produces a highly saline wastewater, disposal in the ocean would need to follow methods that minimize damage to sea bottom species. These are known methods and simply need adoption.

The other suggestion is to electrolyze water to produce hydrogen. This would be considered green hydrogen because the electricity is carbon-free. Power is employed in this way in Europe during periods of low demand, and pilots there are adding a 20% hydrogen cut to natural gas pipelines to reduce fossil fuel use. A point of note is that the electrolytic process requires 9 kg of fresh water for each kg of hydrogen produced. While green electrolytic hydrogen is seductive, especially when using electricity during periods of low demand, fresh water is in short supply in many areas, especially South/Central California. That could be a reason for the Stanford/MIT report’s suggestion regarding desalination at Diablo Canyon.
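
The 9 kg figure is essentially stoichiometry: water is about 18 g/mol and hydrogen gas about 2 g/mol, so splitting water consumes roughly nine times the mass of the hydrogen it yields. A quick check:

```python
# The ~9:1 water-to-hydrogen mass ratio follows from 2 H2O -> 2 H2 + O2:
# one mole of water (18.015 g) yields one mole of hydrogen gas (2.016 g).
M_H2O, M_H2 = 18.015, 2.016  # molar masses, g/mol

def water_needed_kg(hydrogen_kg: float) -> float:
    """Minimum fresh water (kg) to electrolyze `hydrogen_kg` of H2."""
    return hydrogen_kg * M_H2O / M_H2

print(f"{water_needed_kg(1.0):.1f} kg water per kg H2")  # 8.9 kg water per kg H2
```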

Aggressive decarbonization strategies will come with tough choices. An easy one is to target “carbon-free” rather than “renewable” energy. A harder one is to tolerate bridging methods, such as natural gas power with carbon capture and storage. The trick is to ensure that the bridges* are to definite targets. With sunset clauses.

Vikram Rao

September 4, 2022

*A bridge over troubled water, from Bridge Over Troubled Water, Simon and Garfunkel (1970)

WHAT IS HAPPENING IN THE ENERGY CAPITAL?

February 18, 2021

Texas prides itself on being the energy capital. The capital (as opposed to the Capitol of the infamous January 6 insurrection) is under siege.  Nature is asserting its might. Unpreparedness sure helps. 

Few know that Texas has its own grid. The country is divided into three grids: the Eastern Interconnection, the Western Interconnection, and, drum roll here, Texas. Conspiracy theorists may connect this to secessionist tendencies. Certainly, recent utterances attributed to the former governor Rick Perry don’t help. He is quoted as saying, “Texans would be without electricity for longer than three days to keep the federal government out of their business.” He is referring to the fact that because the Texas grid does not conduct interstate commerce, it is not governed by the rules of the Federal Energy Regulatory Commission. This from a guy who just a month ago held federal office as the US Secretary of Energy.

In a Fox channel interview, Governor Abbott of Texas blamed solar and wind for the problem. Small problem: solar is just 1-3% of the total and wind is around 20%. Then his own folks at ERCOT, which stands for Electric Reliability Council of Texas (the reliability in the name is ironic), said it was primarily due to a natural gas supply drop. This makes more sense because gas generators produce 47% of the electricity. Abbott later walked back the claims and said he meant that renewables could not be the dominant source. Tell that to Germany, which gets 40% from renewables. Then Congresswoman AOC trolled Abbott by tweeting that Texas electricity was 80-90% from fossil fuel. That is not accurate either (coal plus gas come in at about 65%, according to ERCOT). Just when you think the election silly season is over, politicians use their favorite political scoring issue whenever there is a remote opening for it.
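
The competing percentage claims are easy to sanity check. The shares below follow the figures quoted in this post (gas 47%, wind around 20%, solar 1-3%, coal inferred from the roughly 65% coal-plus-gas figure); the nuclear share is my own rough assumption added to round out the mix:

```python
# Approximate ERCOT generation shares as quoted in the text; the coal
# figure is inferred (65% fossil minus 47% gas) and the nuclear figure
# is an assumption, not a sourced number.
mix = {"natural gas": 47, "coal": 18, "wind": 20, "nuclear": 11, "solar": 2}

fossil = mix["natural gas"] + mix["coal"]
print(f"fossil share ~{fossil}%")  # ~65%, well short of the tweeted 80-90%
```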

By all accounts, every source of electricity was hampered by the extreme cold, even the nuclear plants. But, according to the ERCOT leadership, the biggest culprit was natural gas. Delivered natural gas nearly halved at the most severe stages due to frozen lines. We know that methane (the dominant component of natural gas) does not freeze until a frigid -182 C. So why are natural gas pipelines (the main supply lines, not the little ones going to your house) freezing?

I was not able to find any explanation, so I will hazard a hypothesis based on other oilfield knowledge. Almost all supplies of natural gas are water wet to some degree. If films of water form, at pipeline pressures of 800 psi or so, temperatures approaching the freezing point of water can cause hydrate crystals to nucleate on the walls. Under the right conditions, these can grow to plug the line. This routinely happens in undersea gas pipelines. Those pipelines have a device known as a “pig” which can be made to traverse the line and mechanically clear out the growing crystals. The other remedy is to drizzle in methanol, which lowers the freezing point; it is basically an antifreeze, like the ethylene glycol in your car radiator (which can also be used in this application).

Gas hydrates are large crystals of ice with methane in the interstices.  The overall crystal structure looks like a soccer ball.  Richard Smalley, who co-discovered this structure in carbon (a sixty-atom molecule version), got the Nobel Prize for it, in part because finding a brand-new crystal structure of a common element is rare, and in part because carbon in this form has proven to have compelling value in the form of nano materials.  Gas hydrates in the subsurface were once believed to be the next big thing in natural gas sourcing because they are ubiquitous and, according to the US Geological Survey, the total resource exceeds all other carbonaceous fuels combined.  Some probably still are believers.  In my opinion plentiful shale gas has shot down those dreams.  Gas hydrates are also a neat party trick. Take a block of it in a shallow bowl and the seemingly innocuous ice will light up with a match. 

We can conclude from all that we have seen in Texas that industry, especially a loosely regulated one, operates on probabilities. ERCOT modeling probably predicted such freezes to be infrequent and more geographically scattered, allowing management with a minimum of disruption. It did not turn out that way. Last year a high proportion of the devastating wildfires in California were known to have been triggered by downed power lines. A cost-effective solution is yet to be identified. The Lone Star is not alone after all.

Vikram Rao

February 18, 2021
