January 12, 2021
January 6 will go down in infamy for the obvious, an extraordinary and unprecedented assault on the will of the people. But it was also a failed gambit. The failure of two other gambits was revealed on that day as well. Neither was in the same league, and with luck none ever will be. But they are certainly worthy of note because they are windows to human nature in the election season. In any other year, one of them would have been headline news. Until the storming of the Capitol, I used to refer to all that preceded every January certification day as the silly season. Levity does not wear well with the main event of that day. But the undercard events (minor apologies for the pugilistic analogy) can tolerate a lighter touch.
The New York Times headline on the front page of the January 7 issue assigns culpability for fomenting the mob. Here we do not do that, sticking with a long-standing practice of this blog site to attempt to steer clear of positions in political matters. We merely discuss the gambit, no matter who instigated it, which appears to have been storming the Capitol with intent to somehow overturn the results of the presidential election. Whether this was to be with the simple imposition of will or the actual destruction of the ballot boxes containing the electoral college votes, we may never discern. This had all the hallmarks of rabble rousing, as opposed to a carefully planned and executed campaign. Having said that, history is pocked with instances of rabble rousing that took on lives of their own, with more lasting effects.
The other gambit was played in Georgia in early November and the result was revealed on January 6. This was a decision to hang onto President Trump’s coattails despite his having lost the election in that state. Both Senate incumbents decided to do just that, although Kelly Loeffler was more strident, pointing out her perfect record of voting with the President. Many others across the country had ridden that strategy to victory. In fact, in something of an anomaly, the coattail grabbers performed better than the coat wearer. But neither Loeffler nor Perdue reckoned with the President litigating (both literally and figuratively) his election loss, asserting in essence that the outcome could be reversed. This took away the best argument for voting for Loeffler and Perdue, which was to preserve the Republican majority in a Senate with Biden as president, an essential means for tempering the new president’s agenda. As the weeks wore on, he attacked the governor and secretary of state of Georgia, both Republican, for somehow being responsible for his loss. The final straw was when he demanded support from both incumbent senators for a challenge of the electoral college votes in the January 6 certification in the joint houses of Congress. With the die already largely cast, they went along with the political version of a Hail Mary (touchdown) pass. Even that characterization is charitable given any reasonable interpretation of the Constitution. When the dust settled and the Georgia votes were counted on January 6, the same fateful day of the mob attack on the Capitol, both incumbents had been defeated in what most considered to be an upset verdict. The margins of victory were even wider than the 0.5% threshold that would have enabled a recount.
The third gambit pales in comparison and deserves discussion principally because of the temporal coincidence of it playing out on January 6. But the gambit was conceived and announced in early 2017, shortly after Trump was elected president. It was also shortly after President Obama, literally on his way out of the door, issued an order to “permanently” ban offshore Arctic oil and gas lease sales. My opinion of that play is in a 2017 blog. For President Trump this was an attempt at showing support for the oil and gas industry. The only problem was that the smart money has known, at least since the plummet in oil prices in 2015, that the Arctic was too risky on pure economics, let alone considering the environmental impact. I opined a few months ago that the interest in the leases would be very low. But the gambit had to be played out. Possibly nobody told the President that this could prove embarrassing; is it still a party if (almost) nobody comes?
And it was embarrassing indeed when the sealed bids were opened on that otherwise fateful January 6. Only 11 of the 22 tracts offered had bids. None of these were from supermajors, majors, or large independents. All were at or near the minimum allowed in the auction. Nine of the 11 were won by a division of the state of Alaska, not an oil drilling and production entity. The other two went to very small independent oil companies. All this in the name of supporting the oil and gas industry. If only they had been consulted.
December 6, 2020
Outgoing Presidents are making a habit of throwing long passes with the Arctic football. And like all long passes, the probability of being caught is low. But they differ in character. When President Obama issued an executive order about this time in 2016, it was to “permanently” ban future lease sales in US Arctic waters. At the time I wrote that this was purely symbolic, with no real impact, because even if leases were offered for sale, this would be a party to which hardly anybody would come. When President Trump assumed office, one of his first declarations was the intent to sell leases in the Alaska National Wildlife Refuge (ANWR). While this was not a challenge to the President Obama executive order per se, ANWR being on a coastal plain and not offshore, it too was symbolic, as demonstrated by the fact that nothing happened for the rest of his term. Until now.
The Bureau of Land Management (BLM) is rushing through the process of posting tracts for leasing*. The timeline they are following is much shorter than normal, reportedly to receive the bids prior to the presidential handover. A rushed process will not properly match up tracts with buyer priorities. This alone does not augur well for a heavily subscribed sale. But the primary reason for tepid interest is the market. Oil prices are low and uncertain. Gas prices are low and certain. This must be about oil alone. Anybody buying ANWR leases in gas prone areas would have a whole lot of explaining to do.
The 2017 Bureau of Ocean Energy Management lease sale in the Gulf of Mexico was strong. Oil prices were low then and nearly as uncertain as now, not counting the Covid-19 induced depression. An important measure of oil company stock value is reserves replacement. Loosely, it is the ratio of new reserves booked to oil produced. So, they are always looking to find new reserves. But the Gulf of Mexico is well understood, with considerable seismic data and production experience. In comparison, the Arctic has a paucity of seismic data. And the results to date have been daunting. Shell walked away from the Arctic after spending a reported USD 7 billion, with not much to show for it. Admittedly, that was offshore in the Chukchi Sea, and ANWR would be different. But the hurdles will remain numerous. Environmental sensitivity and wells made expensive in part by the limited window of time for operations are just two. Financing will be hard to come by and legal challenges are almost certain to at least delay operations. This leaves just the very large oil companies with strong balance sheets and equally strong stomachs. The European headquartered ones, especially BP and Royal Dutch Shell, are very unlikely to plan an Arctic venture.
The need to increase reserves could be addressed in an unconventional manner: increased focus on tertiary recovery of oil. Simply put, conventional means usually can extract only about a third of the oil in place in the reservoir. Tertiary recovery utilizes CO2 to wring more out. The gas is introduced into the formation in a supercritical state. Supercritical CO2 enters pores effectively as if in a gas state but reacts with the oil as if in a liquid state. Being twice as soluble in oil as in water causes preferential combination with oil. This mixture is pushed to the producing well due to high pressure in the injection well. The CO2 is separated and re-injected. This process is repeated and each time more of the CO2 remains in the ground. Eventually, about 95% of the CO2 is retained in the formation and subject to the same trap mechanisms that kept the original hydrocarbons in place.
Some estimates hold that by this method 50 billion barrels could be recoverable in the US at a USD 70 price per barrel of oil. By comparison, ANWR is estimated to hold 15 billion barrels, and I doubt those are economical at USD 70 oil price. If the CO2 were to be priced at USD 50 per tonne, the amount needed would roughly cost USD 11 per barrel of oil recovered. Since the finding, development, and much of the lifting costs have already been incurred, this ought to be affordable at USD 70 oil. Part of the current 45Q Federal Tax Credit, which is USD 35 per tonne CO2 for such sequestration, would also be available. This credit is paid to the CO2 producer, who, together with the USD 50 paid by the oil company, would have USD 85 per tonne available for capture. This is right in the range of feasibility of the latest capture technology.
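The arithmetic above can be checked with a short sketch. The implied CO2 intensity (tonnes of CO2 purchased per barrel recovered) is not stated in the text; it falls out of the USD 11 per barrel and USD 50 per tonne figures.

```python
# Back-of-envelope check of the CO2-EOR economics described above.
co2_price = 50.0          # USD per tonne, price paid by the oil company (from the text)
co2_cost_per_bbl = 11.0   # USD per barrel of oil recovered (from the text)

# Implied CO2 purchased per barrel; a derived number, not one quoted in the text
tonnes_per_bbl = co2_cost_per_bbl / co2_price

tax_credit_45q = 35.0     # USD per tonne 45Q credit for such sequestration
# Revenue available to the CO2 producer for capture, per tonne
available_for_capture = co2_price + tax_credit_45q

print(round(tonnes_per_bbl, 2))      # 0.22 tonnes CO2 per barrel
print(available_for_capture)         # 85.0 USD per tonne
```

At roughly 0.22 tonnes per barrel, the USD 11 cost is a small fraction of a USD 70 barrel, which is the heart of the affordability argument.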
A recent life cycle study concludes that a barrel of oil from tertiary recovery represents a 37% CO2 emissions reduction compared to oil from conventional production (see reference below), provided the CO2 is sourced industrially. Arctic operations will, of course, have an additional layer of environmental risk associated with their location.
In some senses, here then is the choice for oil company strategists. On the one hand ANWR, costly and environmentally risky due to the location. Data paucity adds a layer of exploration and development risk. Uncertainty in the price of oil renders profitability in doubt. On the other hand, tertiary recovery using industrially sourced CO2. Negligible operational risk and you already own the lease. At scale it makes a serious dent in industrial CO2 emissions, the marginal costs are low per barrel of oil produced, and the surface environmental footprint is much smaller compared to new areas explored and produced.
Reason dictates that the Arctic ought to remain icy cold.
*Dream on in “Dream On” by Aerosmith (1973) written by Steven Tyler
Núñez-López V and Moskal E (2019) Potential of CO2-EOR for Near-Term Decarbonization. Front. Clim. 1:5. doi: 10.3389/fclim.2019.00005
November 30, 2020
A recent New York Times story discusses an EPA report of diesel pickup truck owners disabling emission controls on their vehicles. The scale of the deception is reported as being the emissions equivalent of putting nine million extra trucks on the road. There appears to be a cottage industry of small companies devising schemes to manipulate the emission controls systems. The term cottage industry likely does not do it justice when one realizes that, according to the EPA, since 2009 more than half a million trucks have been involved. Yet, each of the providers is usually a small player. In 2019 the EPA acted against 48 companies, bringing their decadal total to 248.
The motivation for these actions is not to despoil the environment. It is to improve fuel efficiency and/or engine performance such as (higher) torque. Diesel vehicles are singularly (dis)credited with high production of NOx and particulate matter (compared to gasoline vehicles). In the case of particulate matter (PM), this comparison is strictly valid only when the measure is micrograms per cubic meter and not particle count. But that is a hair to be split another time. Mass based measurement is the only regulatory approach at present.
PM from diesel engines is captured in Diesel Particulate Filters (DPF). After a couple of hundred kilometers of operation, the filter starts clogging. This causes back pressure on the engine and reduces engine performance. Engine manufacturers set back pressure limits at which the filter must be regenerated. This is accomplished by running the engine hot, which produces more NO2, which catalytically oxidizes the carbon deposits. But if the engine duty cycle simply does not achieve these temperatures, the filter will remain clogged. Sometime after that, it will cease to function and will need to be replaced. This is a clear example where circumventing the DPF benefits the consumer in performance and avoidance of the cost of premature replacement of the DPF. Enter the small player, who replaces the DPF with a straight pipe for a fee.
NOx is short for some combination of NO2 and NO. It is harmful to human health directly, and indirectly through ozone formation. Ozone is formed when NOx reacts with organic molecules and sunlight in the atmosphere. Ozone, often erroneously equated to smog, is a component of smog and injurious to health, with children especially vulnerable. NOx is produced when nitrogen in the combustion air is oxidized. Higher engine temperatures create more NOx. The high temperatures needed to regenerate DPFs have this undesirable effect. This is an example of the balancing act required in emission controls. One remedy for this particular trade-off, present in some automobiles, is an auxiliary heating element in the DPF, which fires up at appropriate intervals.
NOx control is addressed in one of two ways. The more foolproof one is Selective Catalytic Reduction (SCR), in which urea is injected into the exhaust. Urea decomposes into ammonia (in two possible steps), which catalytically reacts with NOx to reduce it to nitrogen. Engine performance is unaffected. But the urea canister must be replaced periodically, which consumers see as a nuisance. The system occupies space, so this tends to be in bigger engines (over 3 L displacement).
The alternative, more targeted to smaller engines, is the Lean NOx Trap. Here, the NOx is captured by adsorption onto a surface. Desorption (removal from the surface) is achieved by chemical reduction by exhaust gas components such as uncombusted diesel, carbon monoxide and hydrogen. In an efficiently running engine, these will, by design, be in short supply. Accordingly, the control system deliberately causes the engine to run rich (oxygen starved) to produce these reducing gases. While this achieves the purpose of reactivating the NOx capture, during these intervals the engine runs with lower power and is fuel inefficient. The fuel efficiency reduction overall is in the vicinity of 3–5%. As a frame of reference, this is the reduction gasoline engines see by the addition of 10–15% ethanol in the gasoline, and there is no consumer pushback, perhaps because it is not accompanied by a performance reduction. Yet, in diesel pickup trucks, it appears some combination of that and loss of torque is responsible for the public buying avoidance schemes.
Avoidance schemes are known in the industry as “defeat devices”. The practice is so rife, it has a name. Vehicle makers have in the past cut these corners, the most infamous being the VW cheating episode. But this piece is about aftermarket interventions, which are harder to corral, in part due to the sheer number of companies involved. This proliferation is due to the ease of devising the defeat devices. While every state has somewhat different methods for emissions testing, a common one is to simply query the onboard computer for clues on functioning of the emissions system. Altering the onboard programs is enough to produce the avoidance. In other cases, the emissions are tested directly while on a chassis dynamometer (think stationary bike). With such testing, the avoidance must be more sophisticated, as it was in the VW case. Then there is the simple substitution of the SCR device with a tube. A mechanic with a reasonable modicum of competence could execute this.
The defeat devices cottage industry would not exist were there not a market for the product. The pickup truck owners simply want the performance* and may well consider the infraction to be minor. Curiously, they would not be too wrong on the DPF avoidance. The regeneration step described above is combustion of the filter cake and does produce carbon particles. However, the general public is unlikely to be aware of the intricacies of operation of the DPF and the impressions are probably grounded in the beliefs of the truck owner community.
Although more than likely aware of the illegality, truck owners probably also know that individual owners are rarely, if ever, targeted by the federal government. In many ways, this is more pain-free than a speeding infraction.
*I can’t get no satisfaction from “(I can’t get no) Satisfaction” by the Rolling Stones (1965), written by M. Jagger and K. Richards
November 19, 2020
Less than a week apart, two leading Covid-19 vaccines were announced as being more than 90% effective (Federal approval requires greater than just 50%). These numbers augur well for broad scale combat of the virus. Does this mean that a slew of other vaccines in development are also likely to emulate this success? Not necessarily. That depends on the approaches the others take. These two use a relatively new approach which took less than a year from when they began to the seeming end of a Phase 3 trial, albeit limited in size; unheard-of speed in the firmament of vaccines.
Conventional vaccines introduce inactivated pathogens which trigger an immune response. Being inactivated by heat or chemical means, they cannot cause the disease. But getting them to a safe while effective formulation takes time. The immune system retains the memory of this invasion (acquired immunity) and when a true disease pathogen is detected, activates the defense mechanism. Acquired immunity is retained for some period. This is also the reason that acquiring the disease can provide immunity for years in many cases. Edward Jenner, the English physician responsible for smallpox vaccination, is credited with coining the term vaccination, derived from the Latin vacca for cow and vaccinia for cowpox. Famously, infection with the relatively benign cowpox conferred immunity from the deadly smallpox disease.
SARS-CoV-2, the virus that causes the Covid-19 disease, was sequenced in early January 2020 by Chinese doctors. This was made available worldwide. The virus is about 120 nanometers in size and has an external lipid layer. Thrusting through the lipid layer are spikes, known as the spike protein (see image).
SARS-CoV-2 transmission electron microscopy image, courtesy NIAID-RML
This protein was an obvious target for the vaccine. In a conventional vaccine it would be only one of many proteins eliciting antibodies. To focus on just the spike protein, investigators turned to a messenger RNA (mRNA) approach. As the name implies, mRNA conveys the genetic message to the cell to direct synthesis of a specific protein. This specificity likely minimizes the possibility of side effects.
Here is how mRNA-based vaccines work. Investigators devised an mRNA strand which could reliably direct the production of the spike protein when injected into a cell. This protein is known as an antigen. The antigen will collect on the periphery of the cell, referred to as the antigen-presenting cell. The antigens on the cell elicit an immune response (primary response) from the body. This activates cells which acquire a memory for this detection and response in what are known as memory cells. When the person is infected with the SARS-CoV-2 virus, and the pathogen presents, the memory cells cause the immune response to kill the virus.
Instability of the mRNA strand is a concern at many stages in the process. It will not survive long in the body, so ensuring cell entry requires that it be encapsulated in some way. The most common means is a lipid (essentially fat) capsule. Entry into the cell is facilitated and the lipid capsule preserves it for the duration prior to that. Once in the cell, the mRNA strand directs the production of the antigen and then degrades and is no longer a factor in the body.
In lab settings, mRNA not tailored to be more stable requires storage at -70 C. Most laboratories employing these methods have such freezers available. When the mRNA needs to be used for experiments, it is often temporarily stored on the bench top in containers with liquid nitrogen or even dry ice (solid CO2). But long-distance transport is still an issue, especially in low- and middle-income countries (LMICs). Thus, distribution to LMICs is more complicated than simply the cost of the vaccine. These countries are best served by vaccines that are stable at closer to ambient temperature. mRNA can be designed to be more thermally stable, especially through improved encapsulation. Currently, we know that the Pfizer/BioNTech vaccine requires the conventional -70 C for storage. The Moderna vaccine is stated to be stable at -20 C, and for some duration at refrigerator temperatures up to +5 C.
A dark horse that I have not seen discussed in the press is CureVac, a German firm (as is BioNTech) in Tübingen. It is expected to commence Phase 3 trials at the end of this year, whereas the other two are already seeking emergency approval. What makes CureVac distinctive is that their emphasis has always been on thermal stability. This characteristic is what triggered a (pre Covid-19) USD 52 million investment in 2015 from the Bill and Melinda Gates Foundation, which is known to be particularly interested in health solutions for LMICs. In fact, the investment agreement requires low cost availability of vaccines in LMICs. CureVac claims 3-month stability at +5 C and 24 hours at room temperature. The tortoise could overtake the hares, at least in LMICs.
Break on through from “Break on Through (to the other side)”, written and performed by The Doors (1967).
October 22, 2020
The future of oil has been debated for as long as I can remember. When I was an undergraduate in engineering in the early sixties, we were taught that the world would run out of oil in 30 years. Such predictions continued, with the concept of Peak Oil oft discussed. But, with the recognition of immense heavy oil reserves, and more recently with the emergence of shale oil, the discussion has shifted to the demand side.
For nearly a century all crystal ball gazing centered on sufficiency of a strategic commodity. Over the last decade or so, oil is well on its way to turning into salt*. Lest you conjure alchemical imagery, I hasten to explain that oil is merely going the way of salt. Salt used to be a strategic commodity. Canning, and later refrigeration, turned it into a useful commodity, no longer strategic. This was about the time that the era of oil began, with the discovery of Spindletop and the resultant decimation of the price of oil. The era was abetted by the demand created by mass production of economical cars by Ford, which incidentally killed the auto industry of the time: electric cars. More on the revenge later.
But the demise of oil will be preceded by a protracted hospice stay. Folks will predict X% electric cars by year Y. But that will be for new vehicles. Legacy vehicles will go a long time, especially in countries like India, a major developing market for automobiles. The electric starter was first installed in a Cadillac in 1911. I was still hand cranking our venerable Morris 8 sedan in India (with difficulty; I was 6) in 1950. On the other side of the coin, India is more amenable to conversions to electric drive, in part due to low labor cost and in part due to a way of life that wrings out every drop of value in a capital asset.
The future of oil is now being discussed relative to demand, not so much supply. Peak oil discussions are replaced by peak consumption ones. Shale oil put paid to the supply issue. Even before Covid-19 destroyed demand, a groundswell of movement was present towards oil alternatives for transportation fuel. This was driven by climate change concerns, but also to a degree by emissions such as NOx and particulate matter. But the projections on future demand depend on the tint of the glasses worn. The Organization of Petroleum Exporting Countries (OPEC) is predicting return to pre-Covid levels of consumption by late next year. Somewhat surprisingly, the US Energy Information Administration is also singing that tune as are some oil majors such as ExxonMobil.
Most surprisingly, however, British Petroleum (BP) is very bearish. Their projections, while being scenario based, are causing them to plan a 40% reduction in their oil output by 2030. This is to be combined with a big uptick in renewable electricity production. Shares rose on the announcement. But BP has been contrarian before, along the same lines. Over a dozen years ago they announced a pronounced shift away from oil, renaming BP to stand for Beyond Petroleum. That did not go well. Particularly unhelpful to their reputation for operating in difficult environments was the oil spill associated with the massive Macondo blowout.
The future of oil is not the future of natural gas. Together they share the term petroleum, although it is imprecisely used in the parlance to stand simply for oil. They were both formed in the same way, with natural gas being the most thermally mature state of the original organisms. But in usage they are different. Oil is mostly about transport fuel and natural gas is mostly about fuel for electricity generation and the manufacture of petrochemicals, especially plastics.
The pandemic decimated transportation fuel but had much smaller effects on electricity and less again on plastics. In the post pandemic world, natural gas will endure for a long time, while oil will be displaced steadily by liquids from natural gas and biogas, and ultimately by electricity. This, of course, excludes aircraft, which will need jet fuel for the foreseeable future. Biomass derived jet fuel will be a consideration, but not likely a big factor.
Electric vehicle batteries costing USD 100 per kWh will be the tipping point, and we are close. At that level, the overall electric vehicle with modest range will cost about the same as a conventional one. The battery and electric motors’ cost will be offset by the removal of the IC engine, gear box, transmission, exhaust systems and the like. For a compact car, each 100 miles in range will add about USD 2500 to 3000 to the capital cost. Maintenance costs will plummet and the fuel cost per mile will be significantly less than with gasoline or diesel. To top it off, the linear torque profile typical of electric motors enables high acceleration from a stop. A progressive shift is inevitable. The revenge of the electric car.
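The capital-cost parity arithmetic above can be sketched briefly. The consumption figure of roughly 27 kWh per 100 miles for a compact car is an assumption of mine, chosen because it reproduces the cited cost range; it is not stated in the text.

```python
# Sketch of the battery-cost tipping-point arithmetic.
battery_cost_per_kwh = 100.0   # USD/kWh, the tipping point cited above
kwh_per_100_miles = 27.0       # assumed consumption for a compact EV (not from the text)

# Incremental battery cost for each 100 miles of added range
cost_per_100_miles_range = battery_cost_per_kwh * kwh_per_100_miles
print(cost_per_100_miles_range)  # 2700.0 USD, within the USD 2500-3000 range cited
```

At USD 150/kWh, still common a few years earlier, the same 100 miles would cost over USD 4000, which is why USD 100/kWh is the commonly cited tipping point.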
The only debatable issue is the rate of change. And this is where the opacity appears in the future of oil. The main sticky bits are perceptions of range required (and the willingness to pay for more) and charging infrastructure. The latter could be influenced by business model innovation, such as battery swapping rather than owning. But oil is here to stay for decades. Therefore, improvements in efficiency, to reduce emissions per mile, are paramount. The industry appears to understand that. When the US administration announced a drastic relaxation of the 2025 mileage standards, four major companies voluntarily agreed to a standard close to the old one. I suspect this was in part because they already had worked out the techno-economics to get there, and certainly the consumer would like the better mileage. Could be also that they had projections of electric vehicle sales that allowed fleet averages to be met. A compact electric vehicle has a gasoline equivalence mileage of about 120. Quite an offset with even a modest fleet fraction.
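The size of that offset can be sketched. Fleet fuel-economy averages are computed as harmonic means (a fleet's gallons per mile average, inverted), so a small fraction of 120 mpg-equivalent vehicles moves the average more than a simple mean would suggest. The fleet fractions below are hypothetical illustrations, not figures from the text.

```python
def fleet_mpg(shares_and_mpg):
    """Harmonic-mean fleet average mileage, as used for fleet standards.

    shares_and_mpg: list of (fleet fraction, mpg) pairs; fractions sum to 1.
    """
    return 1.0 / sum(share / mpg for share, mpg in shares_and_mpg)

# Hypothetical fleets: conventional compacts at the 38 mpg target,
# electric compacts at ~120 mpg gasoline-equivalent (both figures from the text).
print(round(fleet_mpg([(0.9, 38.0), (0.1, 120.0)]), 1))  # 40.8 mpg with a 10% EV share
print(round(fleet_mpg([(0.8, 38.0), (0.2, 120.0)]), 1))  # 44.0 mpg with a 20% EV share
```

Even a 10% electric share lifts a 38 mpg fleet to nearly 41 mpg, which is roughly the offset a maker would need to beat the rolled-back standard without touching its conventional engines.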
The oil barrel has sprung a leak. But it is likely a slow one.
*Turning Oil into Salt, Anne Korin and Gal Luft, 2009, Booksurge Publishing
October 8, 2020
Several readers of my wildfire blog suggested that I had given lightning strikes short shrift as causes of forest fires. In fact, lightning as a cause is not even mentioned because it is not in the top four causes, at least until 2015, as reported in the paper I used, which is cited at the end of this piece. They plot the proportion of number of fires and area burned by cause of ignition in (a) the Santa Monica Mountains and (b) San Diego County. I am not reproducing the plot here for copyright reasons but interested folks can go to the link in the citation (Syphard and Keeley 2015) below.
Most such studies concentrate more on the area burned, than numbers of ignition, presumably because that is the net effect on the public and on the environment. Lightning as a cause is present in…
September 21, 2020
California is ablaze. So are Oregon and Washington. The tally to date is 5 million acres burned, about halfway through the fire season, and well on its way to record territory. Putting that in perspective, the east coast of Australia, devastated similarly earlier this year in the Southern Hemisphere summer, closed the season with 46 million acres burned.
The statistic of greatest concern is that the intensity and scale of the fires is getting worse. Over the last thirty years, the number of fires annually has no discernible trend; it certainly has not gone up. But the acreage burned has, decisively. Both patterns are evident in the figure below. Five of the ten largest fires ever in California are currently active. The largest of these, the August Complex, is already at 839,000 acres and still going. The next largest, ever, was 459,000 acres, the Mendocino Complex in 2018. Labeling any of this chance, or poor forestry management, evokes imagery of the proverbial ostrich, and the placement of its head.
The average area burned, in hectares (a hectare is roughly 2.47 acres), has nearly doubled over this three-decade period. Nine of the ten largest fires have occurred since the year 2000. Note that this does not include the ongoing five, which certainly would be in that group, making it 14 of the 15 since 2000. Although a regression line would have uncertainty due to big annual swings, an eyeball estimate indicates a strong upward slope. If this is a predictor of the future, that future is indeed bleak and warrants a study of causes.
The recent EPA report, from which the figure was reproduced, ascribes the pattern of increased fire acreage to higher temperatures, drought, early snow melts and historically high fuel loading (which is the fire prone vegetation, including underbrush). We will examine these separately, although they may not be disconnected. But first, a comment on the pattern of numbers of fires being essentially flat. Ignition events determine numbers of fires. In California, the principal ones are arson, campfires, power lines and equipment. The equipment category comprises items such as power saws, mowers, and other operated machinery. Human behavior, absent intervention, can be expected to be constant. So, the flat profile on numbers of fires is to be expected. Interestingly, the incidences are seasonal, even, counter-intuitively, arson.
Climate change is implicated in many of the causes of increasing severity over the years. While the term has many interpretations, one generally accepted aspect is temperature rise in the atmosphere and in the oceans. The debate is not whether this happens, but how fast it does. Also generally accepted (to the extent any climate change causality is generally accepted) is that oceanic temperature rise causes increased severity in the El Niño phenomenon in the Pacific Ocean, which is responsible for catastrophic droughts. These are accompanied by drenching rains in other parts of the world in the same year. Both disturbances are extreme deviations from the norm, with resultant impact on vegetation and the way of life.
Atmospheric temperature rise can also be expected to change the proportion of rain and snow in precipitation. Lighter snowfall can be a result, as also early snow melts. Both are in the EPA list noted above.
California, being generally arid, gets most of its water supply from melting snow. While less snow in a given year is certainly a drought indicator, the phenomenon most studied is the timing of the snow melt. Data from four decades commencing in 1973 conclusively demonstrated that burn acreage was strongly correlated with the earliness of the snow melt (Westerling 2016). Decadal comparisons show that the fire seasons in 2003-2012 averaged 40 more days than the seasons in 1973-1982. Fires in the later decade were more severe. Large fires, defined as covering greater than 400 hectares, burned for 6 days on average in the early decade and for more than 50 days in 2003-2012.
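The correlation such studies report is, at bottom, a Pearson calculation over paired seasonal observations. A sketch with invented numbers (days of early snow melt versus hectares burned; these are not Westerling's data) just to show the computation:

```python
import math

# Hypothetical paired observations: how many days early the snow melted,
# and the area burned that season (thousand hectares). Illustrative only.
melt_earliness = [5, 12, 3, 20, 8, 25, 15, 30]
area_burned = [100, 220, 80, 400, 150, 520, 300, 610]

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(melt_earliness, area_burned)
print(f"r = {r:.2f}")
```

For this synthetic series the coefficient comes out strongly positive; the published finding is that the real data behave similarly.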
Power lines deserve special mention. Falling power lines were blamed for several fires in 2017 and 2018; the utility has accepted blame and is in bankruptcy. Trees falling on power lines snapped the poles. The tree roots, finding uncertain purchase in drought conditions, were no match for the Santa Ana winds or other storm-sourced shoves. Those same drought conditions left the underbrush dry. Power lines are usually not insulated. Sparking wires fell on dry underbrush, and the rest is incendiary history. A poster child for distributed power.
The wildfire future is indeed bleak. Climate change retardation is necessary. But it may not be sufficient in the shorter term. We need a reincarnation of Smokey to change human behavior and minimize ignition events.
Westerling, A. L. (2016) ‘Increasing western US forest wildfire activity: sensitivity to changes in the timing of spring’, Philosophical Transactions of the Royal Society B: Biological Sciences, 371: 20150178. http://dx.doi.org/10.1098/rstb.2015.0178
Vikram Rao September 21, 2020
August 27, 2020
A recent story in the NY Times describes four auto makers making climate action policy. OK, so that is a bit of a stretch, but not by much. BMW, Ford, Honda, and Volkswagen cemented a binding agreement with California to limit tailpipe emissions. Volvo has also agreed to join the other four. This deal was struck despite the Trump administration's rollback of the Obama-era regulation, which had called for fleet averages of 54 miles per gallon (mpg) by 2025. The current target is 38 mpg, and the rollback regulation calls for a mere 2 mpg increase by 2025. The new California deal is for 51 mpg by 2026. The auto makers could have settled for the low target and beaten it handily. But, instead, they chose to take the high road.
This is not the behavior one ordinarily expects from industry. But these are not ordinary times. Recently, BP announced its intention to reduce oil production by 40% by 2030, planning to produce 50 GW of renewable electricity by the same date. Not much detail was given, and yet the share price rose in response. Green energy resonates with investors, it seems.
Much the same happened when the Trump administration rolled back yet another Obama-era environmental protection scheme, the regulations curbing fugitive methane emissions. The rollback was opposed by Shell, ExxonMobil, and BP. Methane leakage occurs at various stages of the natural gas production and distribution system. A multi-year study led by Prof. Allen at the University of Texas at Austin, sponsored in part by EDF, has identified the sources of leakage, and industry has taken steps to ameliorate them. This has been aided by the fact that most of the actions are cost neutral. Natural gas that leaks is natural gas that cannot be sold, so the motivation is not just environmental. Much progress has already been made, especially by the larger oil companies. A reversal of the regulation has no benefit to them. In fact, their opposition to the rollback is in part because they want to represent natural gas as a clean fuel, and allowing leakage undercuts that message.
In inking a deal with California, those five auto companies did more than merely push back. They inserted themselves into a federal versus states’ rights dispute. And this was an “in your face” move in response to the rollback, which was intended to “produce far less expensive cars for the consumer, while at the same time making the cars substantially SAFER.” The administration’s move appears to have been designed to be a win for both the consumer and the producer; the only loser was the environment. However, the auto companies must believe that the net cost of ownership will be lower with the new targets. The modest capital cost increase would be offset by the lower cost of operating a fuel-efficient vehicle. Direct fuel injection, combined with temperature measurement and multiple injection capability, can dramatically increase the compression ratio. Increased mileage combined with more muscle: a winning combination that will sell.
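The net-cost-of-ownership argument is straightforward arithmetic. A back-of-envelope sketch, with assumed values for miles driven, fuel price, and ownership period (none of these inputs come from the article):

```python
# Fuel savings of the 51 mpg California target vs. the 38 mpg status quo.
# All inputs are illustrative assumptions, not figures from the deal.
miles_per_year = 12_000
fuel_price = 3.00          # USD per gallon, assumed
years_owned = 10

def fuel_cost(mpg):
    """Total fuel spend over the ownership period at a given mileage."""
    return miles_per_year / mpg * fuel_price * years_owned

saving = fuel_cost(38) - fuel_cost(51)
print(f"ten-year fuel saving: about USD {saving:,.0f}")
```

Under these assumptions the saving runs to a couple of thousand dollars, which is the kind of headroom a modest capital cost increase would have to fit within.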
I suspect, however, that the bet they are placing is not only on the type of improvement noted above. This is a vote of confidence in electric vehicles, as was BP’s move away from oil production to generating renewable electricity. If a substantial portion of the new fleet comprises electric vehicles, it takes some of the pressure off improving the current fleet. An electric family sedan can be expected to have an effective “miles per gallon” between 110 and 120. But 2026 is not that far away, and possibly too soon for a major shift to all-electric vehicles. Then again, maybe not. I found the shift of the target date from 2025 to 2026 interesting, indicating a level of granularity that presages a firm plan. Hybrid vehicles already deliver the 51 mpg target, and direct injection combined with elements such as turbocharging is a proven mileage enhancer.
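The 110 to 120 effective "miles per gallon" figure follows from the EPA's energy-equivalence convention of 33.7 kWh per gallon of gasoline. A sketch, assuming a sedan consumption of about 0.29 kWh per mile (the consumption figure is my assumption, not from the text):

```python
# MPGe: miles per gallon-equivalent, using EPA's 33.7 kWh/gallon convention.
KWH_PER_GALLON = 33.7    # EPA gasoline energy equivalence
kwh_per_mile = 0.29      # assumed consumption of an electric family sedan

mpge = KWH_PER_GALLON / kwh_per_mile
print(f"effective mileage: {mpge:.0f} MPGe")
```

At that consumption the result lands squarely in the quoted range; a thriftier car at 0.25 kWh per mile would exceed it.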
One of the drivers for industrial responses to the Trump administration’s rollbacks of Obama-era environmentally themed rules has been a desire for stability in regulation. Change potentially every four years is disruptive. Logic dictates that this desire applies primarily to rules affecting capital investment with long cycles. Certainly, automobile design changes and oil exploitation of the Arctic National Wildlife Refuge fit that bill. Methane emissions somewhat less so, but they still involve changes in equipment and procedures. Warning labels on machinery probably lie at the other end of the spectrum.
California represents a solid 30% of the automotive sales market. Thirteen other states are committed to California regulations and add to that percentage. Unless President Trump succeeds in taking away states’ rights in this regard, that pretty much means the entire country will be using these vehicles, no matter the regulations in the individual states. This has happened before. Devices such as television sets consume power while on standby. A scant 15 years ago, that could have been as much as 20 watts. Research conducted at the Lawrence Berkeley National Laboratory concluded that full functionality could be achieved with as little as 1 watt, at virtually no extra cost; you just had to design more smartly. California mandated that in 2006 (I think it was 3 watts at the start) and the rest of the country followed. At that time, a full 10% of all national electricity consumption was this sort of wasteful passive use. Since then, we have taken other actions to reduce passive consumption.
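The standby wattages translate into sizable savings per device. A quick sketch of the arithmetic (the wattages are from the text; the year-round plug-in assumption is mine):

```python
# Annual energy wasted by a device idling on standby, before and after
# a California-style limit. Assumes the device is plugged in year-round.
hours_per_year = 24 * 365
old_standby_w = 20     # watts, the pre-regulation worst case cited
new_standby_w = 1      # watts, the eventual "1 watt rule"

saved_kwh = (old_standby_w - new_standby_w) * hours_per_year / 1000
print(f"saving per device: {saved_kwh:.0f} kWh per year")
```

Multiply roughly 166 kWh per year by the hundreds of millions of such devices in US homes, and the 10% figure for passive consumption becomes plausible.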
It began with the 1 watt rule, unilaterally laid down by California. Perhaps that bell will get rung again in tailpipe emissions.
August 28, 2020
August 20, 2020
Oil drilling leases will soon be available in the Arctic, according to a story in the New York Times. The Arctic National Wildlife Refuge (ANWR), a land-based portion of the Arctic, is cited. But the Arctic is cold, both figuratively and literally. When he took office in 2017, President Trump announced a rollback of a “permanent” ban on Arctic drilling that President Obama instituted as he was leaving the White House. I opined then that the rollback would have no net effect because interest from oil companies would be minimal. I also wrote at the time that President Obama’s action was largely symbolic, and not material.
The principal reason for these conclusions is that the price of oil has been low since 2015, when US shale oil became the determinant of the world oil price and the ability of the Organization of Petroleum Exporting Countries (OPEC) to prop up prices was deeply undercut. USD 120 per barrel highs became USD 70 highs. The Covid-19 pandemic has decimated shale oil company ranks, but it has also caused demand, and price, to plummet to historic lows. Accordingly, the crystal ball of future oil prices is murky. Murky crystal balls equate to uncertainty, which, added to the environmental risks, further equates to higher discount rates. Making matters worse on the investment side, any Alaska play has a long-term payout. First oil is likely a decade after the lease purchase. This involves forecasting the price of oil into the second half of the century.
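The effect of a murky crystal ball on a decade-delayed payout is easy to see with a discounted-cash-flow sketch. The cash flow and both discount rates below are invented purely to illustrate the sensitivity:

```python
# Present value of a hypothetical USD 1 billion of revenue arriving at
# first oil, 10 years after lease purchase, under a "normal" discount
# rate and a risk-inflated one. All numbers are assumptions.
revenue = 1_000_000_000   # USD, hypothetical payout at first oil
years_to_first_oil = 10

def present_value(rate):
    """Discount a single future cash flow back to today."""
    return revenue / (1 + rate) ** years_to_first_oil

for rate in (0.08, 0.15):
    print(f"at {rate:.0%}: USD {present_value(rate) / 1e6:,.0f} million today")
```

Bumping the discount rate from 8% to 15% roughly halves the present value of the same barrel of oil, which is why uncertainty alone can kill a long-cycle Arctic play.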
All the indications are that oil demand will drop significantly by 2040, largely through electric vehicle adoption. Certainly, the super-major oil company BP’s beliefs in this regard have translated into plans for a major replacement of oil revenue with revenue from renewable electricity. They recently announced that by 2030 their oil production will be reduced by 40%, concurrent with major investment in renewables resulting in 50 GW of electricity production. That production is up there with good-size electric utilities. This decision also comes at a time when the dividend has been halved and properties divested to raise cash. It is also coincident with the divestiture of their pioneering Alaska North Slope holdings to privately held Hilcorp, a transaction in which they sweetened the pot with a loan to ensure closure of the deal. This does not sound like a company that will invest in a US Arctic lease. I do not see any oil company headquartered in Europe doing it either.
Hydrogen is an important industrial commodity even without counting its possible use as electric vehicle fuel. US refineries purchase 2 billion cubic feet per day of hydrogen (in addition to using another 0.5 billion cubic feet produced internally). Virtually all of it is produced from natural gas. As we discussed in these pages earlier, hydrogen produced using surplus electricity during low demand periods is one of the most promising solutions to the problem of intermittency of renewable electricity. Oil companies like BP, doubling down on renewables, are unlikely to miss this point. Also, if conversion to ammonia is more appropriate for storage and transport, who is better positioned than an integrated major oil company? In its announcement, BP makes a vague reference to hydrogen. No mention is made of geothermal electricity, but it is highly unlikely they are not watching that space.
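The scale of storing surplus electricity as hydrogen can be roughed out from electrolyzer energy needs, typically around 50 kWh per kilogram of H2 (a round number of my own, not from the text):

```python
# Hydrogen produced from a block of otherwise-curtailed renewable power.
# Both the surplus-energy figure and the electrolyzer energy need are
# assumptions chosen for illustration.
surplus_mwh = 1_000         # e.g., one night's curtailed wind output
kwh_per_kg_h2 = 50          # typical electrolyzer energy requirement

kg_h2 = surplus_mwh * 1000 / kwh_per_kg_h2
print(f"{kg_h2 / 1000:.0f} tonnes of hydrogen from {surplus_mwh} MWh")
```

Scaled up, a renewables fleet of the size BP is contemplating would generate surplus blocks far larger than this, which is presumably why the hydrogen reference appears in their announcement at all.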
Returning to the issue of the success of a lease sale in the ANWR, one of the primary challenges is the paucity of high-quality seismic data. These are subsurface images acquired by individual oil companies in proprietary shoots, or by seismic operators shooting speculatively and then selling subscriptions to the data in “libraries”. The acquisition and interpretation of these data is the edge oil companies employ in obtaining winning bids without overpaying. Less data means more uncertainty. My take on the situation is that there will be few bids, due to competing directions for capital spending, the uncertainty in the price of oil, the environmental risks, and the delays likely from litigation (case in point: the litigation-based delays in the Keystone XL oil pipeline construction). Whatever bids do materialize are likely to be low-balled. In that case, the revenue from the sale will be underwhelming. This assumes, of course, that the administration goes ahead with plans to auction the tracts. More than likely this is just another tempest in the Alaskan teapot.
August 20, 2020
August 9, 2020
This discussion is about fixing, as in solving, but it is also, and mostly, about fixing as in rendering immobile. The impact of the greenhouse gas CO2 can be mitigated either by producing less or by capture and storage. A recent paper in the journal Nature triggered this piece. It discusses the feasibility of fixing CO2 in the form of a stable carbonate or bicarbonate by reacting atmospheric CO2 with minerals in the volcanic rock basalt, one of the most ubiquitous rocks on earth. Crushed basalt is to be distributed on farmland. The bicarbonate fraction is water soluble, and runoff carries it to the ocean, where the alkalinity mitigates ocean acidification. The reaction products are also a desirable soil amendment. The paper is mostly not about the technology; it studies scalability and the associated economics. The authors estimate the process can be accomplished at a cost ranging from USD 80 to 180 per tonne of CO2. Putting that in perspective, the current US 45Q Federal Tax Credit pays USD 50 per tonne sequestered in this fashion, and it lasts for another 12 years. While no business ought to be built on the promise of subsidies, that length of time allows cost reduction to occur. At USD 80, the lower end of the range noted by the authors, the cost begins to look acceptable.
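The gap between the authors' cost range and the 45Q credit frames the cost-reduction challenge over those 12 years. A trivial sketch of the arithmetic:

```python
# Per-tonne shortfall between the estimated fixing cost and the 45Q credit.
cost_low, cost_high = 80, 180   # USD per tonne CO2, from the cited paper
credit_45q = 50                 # USD per tonne, current US tax credit

gap_low = cost_low - credit_45q
gap_high = cost_high - credit_45q
print(f"shortfall: USD {gap_low} to {gap_high} per tonne CO2")
```

The low end of the range needs only a USD 30 per tonne cost reduction (or revenue from co-products) to break even against the credit; the high end needs far more.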
The use of basalt to fix CO2 is part of the genre referred to as mineralization of CO2. Divalent species, principally Ca and Mg, are present in rocks. In low pH conditions they react with CO2 to produce a carbonate (or bicarbonate). Olivine, another common mineral, often found in association with basalt, is a solid solution of forsterite (Mg2SiO4) and fayalite (Fe2SiO4). For the Mg end-member, the reaction products are MgCO3 and SiO2. For CO2 sequestration purposes this may be accomplished in situ or ex situ. The term sequestration most properly includes both capture and storage, but it is often used for just the second step, and that is how we will use it here.
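The stoichiometry sets a floor on how much rock is needed. Taking the magnesium end-member of olivine, forsterite, via Mg2SiO4 + 2 CO2 → 2 MgCO3 + SiO2, a sketch of tonnes of rock per tonne of CO2 fixed (idealized: pure mineral, complete reaction):

```python
# Mass of forsterite needed per tonne of CO2, from the reaction
#   Mg2SiO4 + 2 CO2 -> 2 MgCO3 + SiO2
# Assumes pure mineral and complete reaction, so this is a lower bound.
M_MG, M_SI, M_O, M_C = 24.305, 28.086, 15.999, 12.011   # g/mol

m_forsterite = 2 * M_MG + M_SI + 4 * M_O    # ~140.7 g/mol
m_co2 = M_C + 2 * M_O                       # ~44.0 g/mol

tonnes_rock_per_tonne_co2 = m_forsterite / (2 * m_co2)
print(f"about {tonnes_rock_per_tonne_co2:.2f} t of olivine per t of CO2")
```

Roughly 1.6 tonnes of ideal rock per tonne of CO2, before accounting for impure mineralogy and incomplete reaction, gives a feel for the mining and crushing tonnages the scalability analysis must contend with.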
A promising approach for in situ storage of CO2 is injection into oceanic basalt deposits. Basalt is formed when magma from a volcanic eruption cools rapidly. When it cools slowly, it produces rocks such as granite, with large crystals and high hardness, more suitable for structural applications. Basalt, on the other hand, is fine grained and weathers easily, which is good for reactivity. In oceanic deposits this is even more the case when rapid cooling in water results in “pillows”, which partially disintegrate to become permeable. They are often overlaid with later placements of magma sheets. These impermeable layers act as barriers to escape of the injected CO2, affording time for mineralization. The mineralization is further accelerated if the injected CO2 is in the supercritical state (achieved above 31 °C and 1070 psi). All fluids in this state have properties of both gas and liquid: here the supercritical CO2 permeates the rock as if it were a gas and reacts with the mineral as if it were a liquid.
Ex situ fixing of CO2 follows the same chemistry as in situ, at least in that the product is a carbonate. The raw material can be tailored to the need if cost permits. The CO2 capture cost is the same in either case. However, an ex situ process has many advantages over an in situ one. Reaction rates can be increased using standard process engineering methods such as fluidized beds, and catalysis could also be employed. The products could also be expected to have value, such as in substitution of concrete ingredients. But, as in the case of fly ash from coal combustion, also a simple additive to concrete, the realization of that value can be elusive. Niche uses can be found, but monetization on the massive scales required to make a dent in climate change will require concerted effort.
The cost of production will still dominate the economics and the largest component of that is the acquisition of CO2 from the industrial combustion process or air. Air capture is a relatively recent endeavor and targets production cost of USD 100 per tonne CO2, at which point it becomes extremely interesting. The principal allure of this method is that it can be practiced anywhere. If located near a “sink”, the utilization spot, transport costs and logistics are eliminated. This underlines a key aspect of ex situ sequestration, the availability and cost of CO2 in the form needed.
The original premise for this discussion, mineralization of CO2 from the air, skips the CO2 acquisition constraint. But the focus shifts to procuring massive quantities of rock and crushing it into small particles. Two pieces of good news. One is that basalt is possibly the most abundant rock on earth, although much of it is at ocean bottoms. The other is that basalt crushes relatively easily, especially if weathered (in contrast to its country cousin granite). But the elephant in that room is that procurement still involves open pit mining, anathema to environmental groups. In recognition of this, the authors of the cited Nature paper encourage a study of the availability of mine tailings as basalt substitutes for oxides of divalent ions. They opine there are vast hoards of such tailings from mining operations over the years. They also suggest the use of Ca-rich slags from iron making. These are oxides of Ca and Si in the main, with some oxides of Al. Lest this idea be extrapolated to slags from other smelting operations, a caution: the slags from some processes could contain heavy metals and other undesirables such as sulfur. On the plus side of that ledger, the processing of certain nickel ores entails a beneficiation step that results in a fine-grained discard rich in Mg silicates, which ought to be very reactive with atmospheric CO2.
While the use of industrial waste for sequestering CO2 is technically accurate, acquisition and use of alkaline earth rich oxides will have hurdles of location, ownership, and acceptability to farmers, to name just a few. I am also reminded of the fact that when “waste” products with no or negative value create value for someone else, the price will often be revised, upwards. But the method in the cited paper certainly is a useful addition to the arsenal of measures to mitigate global warming, provided field operations verify the predictions on rates of reaction. This battle will only be won with many different arrows in the quiver.
August 9, 2020