September 10, 2021 § 1 Comment

Low-cost energy lifts all boats of economic prosperity. At the other end of the spectrum, high-cost energy threatens to sink them, especially if prices rise suddenly. Nowhere is the positive scenario more evident than in Iceland. In an otherwise resource-poor country, cheap energy has elevated the gross domestic product per capita to among the highest in the world. Unlike in Norway, a country at a similar latitude, almost all produce is domestically derived. Greenhouses heated by natural hot water operate for much of the year.

Iceland has the good fortune to sit on the Mid-Atlantic Ridge, at the boundary between the North American and Eurasian plates. Surrounded by volcanoes, the earth stresses are such that most eruptions occur through fissures, rather than through the typical conical volcanoes whose eruptions can be violent. Furthermore, with abundant subsurface water and high thermal gradients (subsurface temperatures that rise faster than normal with depth), hot water rises in the faults and emerges at the surface as geysers or simply as hot-water lakes. This hot water supplies heat for 90% of the homes. It is also used to produce electricity, although in that case the water comes from wells drilled a couple of kilometers deep. I estimate their cost to produce to be well under 2 US cents per kWh. They charge industry 5 cents, and domestic users pay 13 cents. Clearly, tariffs are involved. But this compares to the Netherlands and Germany at nearly 30 cents for domestic users.

A recent New York Times story reports a different scenario for the rest of Europe (yes, Iceland is in Europe): rising natural gas prices are slowing the post-pandemic economic recovery. Natural gas prices are reported to have risen to USD 18 per MMBTU. Pre-Covid-19, the figures roughly fell out as follows: the US at USD 3, Europe at USD 9, and Japan at USD 17. The US is still low at USD 5, but that is the highest in nearly a decade. Abundant shale gas has kept the price down, but that industry has been battered by the pandemic, so it will probably be slow to respond to the surge. Remember also that low energy costs driven by shale gas were the single biggest factor in the US recovery from the recession of 2009.

What we can expect

On natural gas price, in one word: volatility*. The USD 18 per MMBTU reported today as the spot price for Europe is probably aberrant. The price was a third of that a few months ago. Also, most utilities operate on long-term contract pricing. The high spot price is almost certainly driven by liquefied natural gas (LNG) import prices. Drought conditions in countries such as China have reduced hydroelectric output and required augmentation with LNG-powered electricity. Parenthetically, that is yet more evidence of the impact of climate change. Usually, the spot price is determined by the price of the last cubic foot of gas imported. For Europe, that is LNG. The winners here are the Russians with pipeline-supplied gas, if the contracts allow escalators.

Again, most LNG contracts are long term and pegged to the price of oil. In the US, most (all?) are based on the Henry Hub spot natural gas price: it is multiplied by 1.2, and a liquefaction cost of USD 2.50 is added. At today's USD 5 price, LNG would be at USD 8.50, a far cry from the spot price in Europe of USD 18. And so it goes in the commodity trading market.
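The contract arithmetic above can be sketched in a couple of lines. The 1.2 multiplier and USD 2.50 liquefaction adder are the rule-of-thumb figures quoted in this post, not a universal formula, and the function name is my own:

```python
def us_lng_price(henry_hub: float) -> float:
    """US LNG price in USD per MMBTU, per the rule of thumb above:
    Henry Hub spot price times 1.2, plus USD 2.50 liquefaction cost."""
    return henry_hub * 1.2 + 2.50

# At a Henry Hub price of USD 5 per MMBTU:
print(us_lng_price(5.0))  # 8.5, versus the European spot price of USD 18
```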

I would expect US shale gas drilling to pick up in response to the price. The smaller players, fewer yet after last year's bankruptcies, will respond. Now that so many properties are in the hands of the major oil companies, expect their response to be more measured. In any case, I expect US prices to stabilize, and there is no serious risk of prices rising to the point where coal has a resurgence. Unlike oil, all gas pricing is regional. LNG is the only means of transport between regions, and it adds somewhere between USD 3 and 4 to the produced gas price.

The experts are predicting a colder than normal winter.  If that transpires in Europe, the proverbial Katie will have to bar the door on natural gas prices.

Vikram Rao

September 9, 2021

*Everybody look what’s goin’ down from “For What It’s Worth” by Buffalo Springfield (1967), written by Steven Stills.

Wildfires Are Not Just a West Coast Problem

July 27, 2021 § Leave a comment

The Bootleg Fire (New York Times, July 23, A11) is a public health concern for much of the country. While the devastation from burning property is local, the emissions can travel thousands of miles. These emissions primarily comprise particles of solid carbon (soot) coated with organic molecules. Particles under 2.5 microns in diameter (a micron is one millionth of a meter and a human hair is about 70 microns in diameter) are designated PM2.5 and are more toxic than the larger PM10. This is because they can penetrate deep into the lungs. Regulations worldwide focus on PM2.5.

Smaller particles travel further because they are lighter. The total concentration of particles distant from the fire will be lower than near the source. But these particles will largely be PM2.5, also known as fine particles. Even smaller particles, known as ultrafine or nanoparticles, are a subset of these, and are referred to by scientists as PM0.1, indicating particles smaller than 0.1 microns. Strong evidence points to these particles being even more toxic than the rest of the PM2.5. There are two reasons for that. One is that the size allows them to penetrate individual cells in the organs, carrying their toxic organic payload. The other is that these tiny particles are much more likely to grab toxins from the atmosphere than the larger ones. That makes them more virulent. And finally, these ultrafine particles travel even further from the site of the fires than the fine particles.

For any given size of particle from a wildfire, toxicity of the particle is related to the type of wood and the conditions under which the burn occurs. A recent study concludes that combustion of eucalyptus, ubiquitous in Australia and common in northern California, results in particulate matter that is more toxic than from pine, peat, and red oak. The measures of toxicity in the study were markers for inflammation in the lungs of mice (1).

Fierce, hot burns produce particles with lower toxicity than smoldering burns. The natural reaction from a property protection standpoint is to douse the flames. However, if left to smolder, the detriment is to public health. The reason is that a smolder is relatively oxygen starved and produces more unburnt organic molecules (mostly from the outermost layers of the limb rich in aromatic compounds). These volatile organics then attach to the carbon particles emitted during combustion as soot, making them more toxic. The same lung toxicity study cited above noted a striking increase in toxicity in the smoldering condition (1).

The Bootleg Fire is the third largest fire in Oregon history. The burn area is close to 400,000 acres. To give that context, large fires are classified by statisticians as greater than 1000 acres of burn. Scientists use the metric of area burned rather than numbers of fires. Privation is certainly proportionate to the area of devastation.

Courtesy US EPA 2019 (Wildland Fire Research Framework 2019-2022)

EPA data show that over the three decades preceding 2018, the total number of fires remained essentially constant. Not surprising, since most are caused by human behavior, which, absent intervention, tends to remain constant. Strikingly, however, over the same period, the area burned has nearly doubled (3). This statistic does not include the devastation of the 2020 fire season. Five of the ten largest fires ever in California occurred that year. The largest was the August Complex fire, which burned just over a million acres.

The Bootleg Fire is believed to have been triggered by lightning strikes on dry underbrush. The mammoth August Complex fire in 2020 was believed to have been caused by 38 lightning strikes. Lightning as a cause of wildfires has not been in the top four causes, judged by the area burned (2). This is because lightning is ordinarily associated with a thunderstorm, and even 0.1 inches of rain is sufficient to materially reduce the ignition of underbrush. However, with high ambient temperatures increasingly being experienced, “dry lightning” can be produced. The associated rain evaporates prior to hitting the ground.

Until now the dogma has been that the causes of most wildfires are anthropogenic. In California, the four principal causes of ignition, when the metric is area burned, were arson, campfires, equipment (such as chain saws), and falling power lines (2). All being related to human endeavor, amelioration was possible. But if the world is moving towards being hotter, the incidence of dry lightning could go up. Death Valley, CA recorded a high temperature of 130 F on August 16, 2020, among the highest temperatures ever reliably recorded on the planet. August 16 was also the day of the first dry lightning strike associated with the August Complex fire. I am not suggesting causality, but the facts urgently suggest study*.

The literature does not offer promise for direct intervention in dry lightning caused large area devastation, other than controlled burns to clear out the under-canopy vegetation, which is already practiced to a degree. This stark reality applies not just to the fire prone western US, but also to the rest of the country.

Vikram Rao

* ” Lightning’s striking again” in Lightnin’ Strikes, by Lou Christie (1965), written by Lou Christie and Twyla Herbert


1. Kim YH, Warren SH, Krantz QT, et al. Mutagenicity and lung toxicity of smoldering vs. flaming emissions from various biomass fuels: implications for health effects from wildland fires. Environ Health Perspect. 2018;126(1):2018.

2. Syphard, A. D., & Keeley, J. E. (2015). Location, timing and extent of wildfire vary by cause of ignition. International Journal of Wildland Fire, 24(1), 37.

3. US EPA 2019 (Wildland Fire Research Framework 2019-2022)


May 31, 2021 § 2 Comments

This is the story of a minority investor attempting to influence the direction of ExxonMobil to be more climate conscious while being even more profitable. Engine No. 1 (evoking childhood images of the Little Engine that Could) pulled off the improbable.  With less than 1% shareholding, they persuaded major players such as BlackRock and the California State Teachers Retirement System (a major pension fund) to go along.  Two of their slate of four nominees have been elected and another is possibly on the brink.  Management resistance had been acute; much money was spent in opposition.  This was seen as a defeat for the Chairman and CEO.

This is New York Times front page news.  ExxonMobil’s deteriorating earnings performance certainly helped the insurrection.  In a fireside chat with my economist son @justinrao, I was asked whether this would work.  Another NY Times piece opined it would not, and that 2 in 12 directors would simply not have the votes to accomplish anything.  This is simplistic thinking.  A board is not the US Senate (remember the days when senators were collegial and actually listened to opposing viewpoints; well, those days are gone).  Each member of any public board has a fiduciary duty to the shareholders.  They are on the same team.  Usually, important direction setting is given to a subcommittee to research and report on the matter.  The report is debated, and a board consensus is achieved.  If the new members command the respect of the others, and they are not too radical in their approach, change is possible.  The new members are “independent directors”.  For those not in the know, independent directors are defined as those not having a material association with the company.  Certainly not officers.  But also, not employees of the investment groups.  There are shades of gray, but independent directors are seen as not influenceable, hence the term.

All change will have to meet the conditions of duality of earnings growth and some other softer objective, in this case carbon mitigation. These are early days in the battle for climate preservation. Relatively low-hanging fruit ought to be available. Were I one of those new directors, I would request that scenarios be produced. Scenarios are not predictions; they are more in the form of what-if exercises, and particularly useful when strategizing in an uncertain environment. These would be guides to at least a provisional strategic direction which yields good returns while meeting climate change objectives. The latter would be seen as likely societal outcomes directing company behavior, not ideological.

The future will almost certainly entail reduction in oil usage, with the only debate centering on the rate of change. Certainly, both Shell and BP are betting on that outcome. While this is a carbon mitigation direction, from a board perspective it is a demand signal needing response. In this example, the response would be to plan on oil production reduction while investing in electricity, which will be the "fuel" displacing oil. ExxonMobil has stated that entering the renewables arena is contraindicated because they have no edge in that space. I agree with this view when it comes to production of solar energy. Not so much on wind, where they could have an edge in the emerging application of deeper-water production, for which floating platforms will be needed. This last is where they have deep expertise. California has a sea floor that drops off precipitously. Floating production is very likely.

Rather than participating directly in the production of renewables, they could innovate in the space of filling a critical gap in renewables: the handling of diurnality and of peaks and valleys. Germany derives 40% of its electricity from renewables. This is an average figure; on a given day that number could be 15% or 75%. A recent solar bid accepted in Los Angeles had a direct solar output price of about 2.3 cents per kWh, but the battery backup added nearly 2 cents to that. Enabling renewables requires a storage solution. As evidenced by the LA figure, basic solar is becoming the low-cost standard. In my view it is headed to commodity status. The profit will lie in solving storage. In that area, companies such as ExxonMobil are well stocked with science and engineering talent. Production of electrolytic hydrogen during periods of excess is one of the candidates. So is ammonia. Both are staples of ExxonMobil downstream operations. They could do this more profitably than most.

My favorite renewable for oil companies to consider is geothermal energy.  It is fast reaching feasibility at scale.  It is also the only renewable of which I am aware, which is both base load scale and load following.  Load following essentially means tunable to demand.  No storage required.  Most importantly, for oil companies, the core competencies are the same as for oil production.  Furthermore, the personnel laid off due to reduction in oil production could simply be switched to geothermal.

There is profit in renewables; you simply must pick your spots. The new directors could educate the rest on these points. As for the CEO, sometimes a win follows a loss. Sometimes you get thrown into the briar patch*.

Vikram Rao

*How Br’er Rabbit snatched victory from the jaws of defeat, literally the jaws of Br’er Fox. Br’er Rabbit and the Tar Baby, a Georgia folk tale.


April 16, 2021 § 5 Comments

Transportation has bad climate-change-related PR. All sectors combined (including aviation) account for about 13% of global CO2 production, whereas just steel and concrete add up to 15%. Estimates vary, but the conclusion is inescapable: we have not given steel and concrete the attention that we have heaped on transportation in mitigating CO2 production. To exacerbate matters, the world is on an infrastructure expansion spree, including more recently the Biden administration in the US. More infrastructure equates to more concrete and steel, and that means more CO2 emissions, unless we do something about it, as we have with electric and hybrid vehicles.

Mitigating CO2 emissions from concrete and steel is more straightforward than from vehicles because they are what we refer to as point sources. Vehicle tailpipes are distributed, making capture and disposition of the CO2 prohibitively difficult. It is technically doable with pressure swing adsorption methods, but logistically tricky in the release step to regenerate the adsorbent and in the subsequent handling of the CO2. A decent analogy is NOx capture with urea, requiring canister replacement, a nuisance to many consumers. This difficulty led to alternative non-intrusive means such as the Lean NOx Trap, with the attendant VW deception.

First a bit of a primer on iron and steel making.  Iron ore is largely iron oxide and must be reduced to iron.  This is accomplished primarily in blast furnaces, which are shaft furnaces where the reactants are fed at the top and the metal is taken out of the bottom.  The iron oxides are reduced by gases produced from coke, which is a derivative of coal.  The reaction products include iron and CO2.  The iron is then converted to steel by reducing the carbon content and by addition of other alloying elements for properties such as strength and corrosion resistance.  Each metric ton (tonne) of steel produces a staggering 1.8 tonnes of CO2.

The Direct Reduction Iron (DRI) process is a means of reducing the carbon footprint. The process temperatures are low, and the iron is never in a molten state. The reducing agent is syngas, a mixture of CO and H2. The combination reduces the emissions to 0.6 tonnes CO2 per tonne of steel. In a variant, hydrogen alone is the reducing agent, and in a further green variant, the hydrogen is from renewable sources such as electrolysis of water using renewable electricity. However, unlike in the blast furnace process, there is no mechanism for removal of impurities in the ore. Consequently, only high-grade iron ore is tolerated, and this limits DRI to about 7% of the total market because such ore is in relatively short supply and much more costly.
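The emissions arithmetic is easy to check; a minimal sketch using only the figures quoted in this post:

```python
# CO2 intensity of the two steelmaking routes, in tonnes of CO2 per
# tonne of steel, using the figures quoted in this post.
blast_furnace = 1.8   # conventional blast furnace route
dri_syngas = 0.6      # Direct Reduction Iron with syngas

reduction = 1 - dri_syngas / blast_furnace
print(f"DRI cuts emissions by {reduction:.0%}")  # DRI cuts emissions by 67%
```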

The most promising route to the greening of steel is through CO2 capture at the blast furnace.  Unlike flue gases from a power plant, blast furnace flue gas is concentrated, typically 30% CO2.  As a result, removal processes are more effective.  Today we are on the brink of capture costs below USD 40 per tonne CO2.  Carbon credits may be purchased in Europe for about USD 55 per tonne.  A recent New York Times story suggests that this will keep rising, with one analyst predicting prices above USD 150.  If a major CO2 producer such as steel or cement is forced to buy credits, the price is certain to go up.  When the capture cost is below the price for credits, the industry has an incentive to simply collect the gas.  However, merely capturing accomplishes little if the gas is not permanently sequestered in what are known as sinks. 
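At the figures quoted above, the incentive is simply the spread between the credit price and the capture cost. An illustrative calculation, not a market model:

```python
# USD per tonne of CO2, figures as quoted in this post.
capture_cost = 40.0   # capture costs approaching levels below this
credit_price = 55.0   # approximate European carbon credit price

# A producer saves this much per tonne by capturing instead of buying credits.
print(credit_price - capture_cost)  # 15.0
```

If credit prices rise toward the USD 150 some analysts predict, that spread, and the incentive to capture, widens further.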

One such sink is subsurface storage in oil and gas reservoirs depleted of the original fluid, or in saline aquifers. While feasible, often with costs lowered by using abandoned wells, debate centers on permanence of the storage and the risk of induced seismicity (earthquakes).  A variant with an important distinction is injection into reactive minerals such as basalt, with the formation of a non-water-soluble carbonate, which certainly is permanent.  However, these wells are more costly because existing abandoned wells are unlikely to be in locations with suitable mineralogy.  The exception to that would be abandoned geothermal wells, which could be proximal to igneous rock from the basalt family.  However, there are not too many of those, and they are geographically constrained.

Mineralization as a genre is being pursued vigorously, with systems already commercial, although the tonnage being sequestered is still low.  Done on the surface in reactors, the resulting carbonate of Na, Ca or Mg can have uses.  Monetization even at small profit still renders the capture cost effective.  Since, in my opinion, capture costs are heading in the right direction, and already at acceptable numbers, the focus ought to shift to sinks with scalability.  Scalability is usefully defined as an aspirational goal of 0.5 gigatonnes CO2 per year by 2040.  But goals short of that are fine if several approaches are proven viable. 

Endeavors to achieve these goals could be materially assisted by appropriate policy action by the various federal governments.  All forms of renewable energy have received subsidies or loan guarantees at some stage in their development.  This has resulted in wind and solar being an established part of the electricity portfolio.  Similarly, electric vehicles have received subsidy support.  The greening of steel and cement ought to receive the same attention.  For example, the Biden administration’s infrastructure bill ought to include provisions for preferential purchase of green steel and cement, at premium pricing. 

Technology is approaching a tipping point for serious inroads into making steel and concrete green*. Public policy must keep pace.

*For the times they are a changin’ from “The Times They Are a-Changin’” performed and written by Bob Dylan, 1964

Vikram Rao

April 16, 2021


February 26, 2021 § 4 Comments

The title is adapted without a modicum of shame or contrition from a Paul Krugman New York Times opinion piece.  While you will see my engineer’s spin on all this market economics stuff in this piece, his linked opinion is a must read.  At least take in the brilliant title: Et Tu, Ted? Why Deregulation Failed.  The Julius Caesar/Brutus reference aside (inspired, I imagine, by the Ted statement undercutting a principal tenet of Texas energy policy), it is a clear exposition of when free markets fail.  From a Nobel Laureate in Economics, no less.  The Ted mentioned is of course Senator Ted Cruz, whom Krugman in droll fashion describes as R-Cancun.  The press being woefully bereft of scandalous behavior by elected officials (only so much ink you can give to that Congresswoman) is giving the full treatment to Cruz, his wife Heidi (managing director at Goldman Sachs), St. Johns High School (elite private school in Houston) and group chat.

As you know from my previous take on the Great Texas Freeze, Texas has an independent power grid. It is run by ERCOT, a non-profit agency regulated by the state. By not crossing state lines, it avoids being regulated by the Federal Energy Regulatory Commission. This independence from the feds has been fiercely defended by many in current and former elected office. The regulations, such as they are, were reputedly fashioned largely on concepts laid out by Harvard professor William Hogan. They allow wholesale prices to go up to a maximum of USD 9 per kilowatt hour (kWh). The average consumer price in Texas is normally around 12 cents per kWh, and closer to 9 cents in winter. In other words, regulations allow householders to be charged up to 100 times more than normal (wholesale and consumer prices are not directly comparable, but the multiplier is still huge). Avocados (Krugman's favorite for the argument I am about to make) doubled in price during a shortage in 2019. Consternation in Mexican restaurants (and the holy guacamole jokes) aside, consumers could simply eschew the green fruit (yes, it is a fruit, as is the tomato, despite an 1893 Supreme Court decision decreeing it a vegetable!). Lowered demand and some remedy on the supply side restore prices. That is the way it is supposed to work.

But electricity access is not like avocados.  When used for heat in winter it is a necessity, not a choice (even a natural gas fueled heating system needs electricity to run the blowers).  A period without guacamole in the diet would scarcely register on the privation scale whereas sustained frigid temperatures could be life threatening.

But the strongest argument for more regulation ensuring supply is that the supply chain is interdependent. A turbine operator cold-hardening the blades in the expectation of profiting from the high prices during a freeze-caused shortage may have its fuel supply interrupted, vitiating the business strategy. In the case of natural gas, the primary fuel for electricity in Texas, the cold hardening would simultaneously have to be done by at least the gas producer, the midstream operator (of the pipeline), and the electricity generator. Any one of them doing it alone may not realize the profit uptick while still incurring the cost.

Texas politicians are in no-win positions. They favor free-market methods but must deal with the fallout from crushingly high electricity bills faced by some consumers. The tweeted statement by Senator Cruz that led to the Et tu Ted opinion by Krugman was: "This is WRONG. No power company should get a windfall because of a natural disaster." The problem with that statement is that it flies in the face of the basic de-regulated model: private companies are incentivized to spend the money to harden for these situations in hopes of high profits during those interludes. On the assumption that Senator Cruz understands the model, his statement appears to be a repudiation of a tenet of Texas energy policy. Hence the Et tu (meaning "you too") wording in the Krugman piece (Julius Caesar utters Et tu Brute in anguish when he sees that his friend and protégé Brutus is one of the assassinating senators).

Many have taken the position that one cannot spend the money to prepare for infrequent occurrences. Hard to argue with this in principle. However, I will give you the example of El Paso, Texas. After a freeze in 2011 of a similar scale to this one, the state studied the matter. The city of El Paso simply acted. They had previously divorced themselves from ERCOT and joined the Western Interconnection. Under no more compulsion to act than the rest of Texas, they spent nearly USD 4 million to harden their grid to -10 F. They built a dual-fired power station, using oil or gas. This time around, in a city with 680,000 residents, fewer than 1000 customers had electricity blackouts lasting greater than 5 minutes. Since rate payers always pick up the tab, what then was the impact on rates? The average residential rate in El Paso is 11.11 cents per kWh, compared to the average for Texas of 10.98 cents. A straight-up comparison is difficult, but those figures are not very far apart: about 1.2%.
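A quick check of that rate comparison, using the figures above (note the gap works out to roughly 1.2 percent of the state average):

```python
# Average residential electricity rates in US cents per kWh.
el_paso = 11.11
texas_avg = 10.98

diff = (el_paso - texas_avg) / texas_avg
print(f"{diff:.1%}")  # 1.2%
```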

Was the privation suffered in the great Texas freeze an indictment against free-market models of electricity access?  Probably not.  But economists advising the state must conjure up a framework with built in elements of social responsibility and a greater recognition of the inter-dependencies in the supply chain.  The folks least able to withstand extended loss of power and/or costly energy for survival, need a safety net.  Another such devastating disaster, to borrow another Roman analogy, will have state leaders facing charges of fiddling while Texas froze.

Vikram Rao

February 26, 2021


February 18, 2021 § 14 Comments

Texas prides itself on being the energy capital. The capital (as opposed to the Capitol of the infamous January 6 insurrection) is under siege.  Nature is asserting its might. Unpreparedness sure helps. 

Few know that Texas has its own grid. The country is divided into three grids: the Eastern Interconnection, the Western Interconnection, and, drum roll here, Texas. Conspiracy theorists may connect this to secessionist tendencies. Certainly, recent utterances attributed to the former governor Rick Perry don't help. He is quoted as saying, "Texans would be without electricity for longer than three days to keep the federal government out of their business." He is referring to the fact that because the Texas grid does not conduct interstate commerce, it is not governed by the rules of the Federal Energy Regulatory Commission. This from a guy who until recently held federal office as the US Secretary of Energy.

In a Fox channel interview, Governor Abbott of Texas blamed solar and wind for the problem. Small problem: solar is just 1-3% of the total and wind is around 20%. Then his own folks at ERCOT, which stands for Electric Reliability Council of Texas (the reliability in the name is ironic), said it was primarily due to a drop in natural gas supply. This makes more sense because gas generators account for 47% of the electricity produced. Abbott later walked back the claims and said he meant that renewables could not be the dominant source. Tell that to Germany, which gets 40% from renewables. Then Congresswoman AOC trolled Abbott by tweeting that Texas electricity was 80-90% from fossil fuel. That is not accurate either (coal plus gas come in at about 65%, according to ERCOT). Just when you think the election silly season is over, you have politicians using their favorite political points-scoring issue whenever there is a remote opening for it.

By all accounts, every source of electricity was hampered by the extreme cold, even the nuclear plants.  But, according to the ERCOT leadership, the biggest culprit was natural gas.  Delivered natural gas nearly halved at the most severe stages due to frozen lines.  We know that methane (the dominant component of natural gas) does not freeze till a frigid -182 C.  So, why are natural gas pipelines (these are the main supply lines, not the little ones going to your house) freezing?

I was not able to find any explanation, so I am going to hazard a hypothesis based on other oilfield knowledge. Almost all supplies of natural gas will be water-wet to some degree. If films of water form, then at pipeline pressures of 800 psi or so, temperatures approaching water freezing can cause hydrate crystals to nucleate on the walls. With the right conditions, these could grow to plug the line. This routinely happens in undersea gas pipelines. Those pipelines have a device known as a "pig" which can be made to traverse the line and mechanically clear out the growing crystals. The other means is to drizzle in methanol, which lowers the freezing point; it is basically an antifreeze, like the ethylene glycol in your car radiator (which can also be used in this application).

Gas hydrates are large crystals of ice with methane in the interstices.  The overall crystal structure looks like a soccer ball.  Richard Smalley, who co-discovered this structure in carbon (a sixty-atom molecule version), got the Nobel Prize for it, in part because finding a brand-new crystal structure of a common element is rare, and in part because carbon in this form has proven to have compelling value in the form of nano materials.  Gas hydrates in the subsurface were once believed to be the next big thing in natural gas sourcing because they are ubiquitous and, according to the US Geological Survey, the total resource exceeds all other carbonaceous fuels combined.  Some probably still are believers.  In my opinion plentiful shale gas has shot down those dreams.  Gas hydrates are also a neat party trick. Take a block of it in a shallow bowl and the seemingly innocuous ice will light up with a match. 

We can conclude from all that we have seen in Texas that industry, especially a loosely regulated one, operates on probabilities. ERCOT modeling probably predicted such freezes to be infrequent and more geographically scattered, allowing them to be managed with a minimum of disruption. Not the way it turned out. Last year a high proportion of the devastating wildfires in California were known to have been triggered by downed power lines. A cost-effective solution is yet to be identified. The Lone Star is not alone after all.

Vikram Rao

February 18, 2021


February 10, 2021 § Leave a comment

A hydrogen economy replacing the current hydrocarbon economy is fanciful at best.  But displacing it in chunks, that has legs.  First, the basics on terminology.  Energy for our way of life is dominantly produced from fossil fuel, with less than 15% from other sources.  That is the basis for defining the current state of affairs as the hydrocarbon economy.  Hydrocarbon use results in CO2 and other emissions.  Any reasonable expectation of avoiding serious global warming involves one or both of two alternatives: sequester emitted CO2 at the source and/or from the air, or reduce the use of hydrocarbons.  Hydrogen is seen as the route to the second means, but the hydrogen cavalry is at best only just gearing up.

When the cause of hydrogen was advanced in the past, such as the use of hydrogen as motive power for electric car motors via fuel cells, I had been bearish.  Then, as now, 95% of hydrogen was produced from natural gas.  The process is known as Steam Methane Reforming (SMR).  The first step reacts CH4 with steam over a catalyst to produce CO plus H2; the CO is then further reacted with water in the water gas shift reaction to yield CO2 and H2.  The net result is the production of hydrogen and CO2: 9.3 kg CO2 are emitted per kg H2 produced.
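The 9.3 kg figure includes CO2 from fuel burned for process heat; the reaction stoichiometry alone sets a floor that is easy to check:

```python
# Overall SMR plus water gas shift: CH4 + 2 H2O -> CO2 + 4 H2
M_CO2, M_H2 = 44.01, 2.016                 # molar masses, g/mol
stoich_kg_co2_per_kg_h2 = M_CO2 / (4 * M_H2)
print(round(stoich_kg_co2_per_kg_h2, 2))   # ~5.46 kg CO2 per kg H2; process
                                           # heat raises the practical figure
                                           # toward the 9.3 cited above
```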

Each kg H2 contains 34 kWh of energy.  A gallon of gasoline also contains about 34 kWh of energy.  What then is the allure of hydrogen to power vehicles?  The answer lies in the inherent efficiency advantage of electric motors over internal combustion engines powered by gasoline.  The result is that the same 34 kWh in hydrogen will permit a driving distance of 70 miles for a vehicle that would go 25–30 miles on the energy-equivalent gallon of gasoline.  The other factor is that a hydrogen-powered EV will have zero tailpipe emissions, whereas combustion of a gallon of gasoline releases about 9.1 kg CO2, all of it at the tailpipe.  The CO2 emissions from SMR, by contrast, occur at a single location, allowing capture and disposition.
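Using only the figures quoted above (with the 25–30 mile range taken at its midpoint), the comparison works out as a sketch:

```python
# All inputs are the figures quoted in the text, not measured data
kwh_per_kg_h2 = 34.0          # also roughly the energy in a gallon of gasoline
fcev_miles = 70.0             # miles per kg H2
ice_miles = 27.5              # midpoint of the 25-30 mile range
co2_smr = 9.3                 # kg CO2 per kg H2 (emitted at the SMR plant)
co2_gasoline = 9.1            # kg CO2 per gallon (emitted at the tailpipe)

print(round(fcev_miles / ice_miles, 2))     # efficiency multiple, ~2.55
print(round(co2_smr / fcev_miles, 3))       # ~0.133 kg CO2/mile, capturable
print(round(co2_gasoline / ice_miles, 3))   # ~0.331 kg CO2/mile, dispersed
```

Per mile, the hydrogen route emits well under half the CO2, and emits it where capture is at least possible.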

Tailpipe capture of CO2 is technically feasible.  RTI International invented a process using adsorbents, with the CO2 periodically desorbed for controlled release into a canister.  Part of the difficulty is in the management of the canister handling and the logistics of CO2 disposition.  This nuisance factor is not unlike the issue of using urea canisters in diesel vehicles as a NOx capture means.  The nuisance and expense of that option led to the infamous VW avoidance scheme. In any case, the technology has not been adopted to date.

Traction in the use of hydrogen to displace fossil fuels has been found in its use as a storage medium in support of renewable electricity from wind and the sun.  Neither form matches usage load profiles.  Germany, which has a renewable component of 40%, has days on which it is as high as 75% and as low as 15%.  This variability dictates the need for storage mechanisms.  Batteries are an expensive solution.  Hydrogen is increasingly seen as the medium of choice.  It may be produced by electrolysis of water using essentially waste electricity during low-load periods.  The hydrogen could then be stored and combusted for power during high-load episodes.  It could also be converted to ammonia, utilizing nitrogen from the air.  Ammonia is more energetically dense, and this is a good option if transport is required.  At the receiving end it could be used for a purpose such as fertilizer manufacture, burned for power, or cracked back to hydrogen (and nitrogen, which is released to the atmosphere) for an industrial use such as hydrogenation of vegetable oil.
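The "more energetically dense" point can be illustrated with approximate textbook values; the heating values and densities below are my assumptions, not figures from the text:

```python
# Volumetric energy density comparison, lower heating value (approximate)
lhv_mj_per_kg = {"H2": 120.0, "NH3": 18.6}
rho_kg_per_m3 = {"H2 at 700 bar": 42.0, "NH3 liquid": 682.0}

e_h2 = lhv_mj_per_kg["H2"] * rho_kg_per_m3["H2 at 700 bar"] / 1000   # GJ/m3
e_nh3 = lhv_mj_per_kg["NH3"] * rho_kg_per_m3["NH3 liquid"] / 1000    # GJ/m3
print(round(e_h2, 1), round(e_nh3, 1))   # ~5.0 vs ~12.7 GJ per cubic meter
```

Liquid ammonia carries roughly two and a half times the energy of even highly compressed hydrogen per unit volume, which is why it is the carrier of choice when shipping is involved.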

Europe is experimenting with putting hydrogen into natural gas pipelines.  Hydrogen is a very small molecule and difficult to keep from diffusing away.  It can also embrittle steel pipelines.  But at 20% dilution of natural gas, the embrittlement is tolerable, and the resulting gas mixture is suitable for all purposes designed for plain natural gas.  The displacement of 20% of the natural gas constitutes a nibble at the hydrocarbon economy.  The energy giant Engie has piloted this in a town in France.
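Why even 20% is only a nibble: hydrogen's volumetric heating value is roughly a third of methane's, so a 20% blend by volume displaces far less than 20% of the delivered energy.  A quick check with approximate heating values (my assumptions, not from the text):

```python
lhv_h2, lhv_ch4 = 10.8, 35.8       # MJ per normal m3, approximate LHVs
x = 0.20                            # hydrogen volume fraction in the blend
energy_frac = x * lhv_h2 / (x * lhv_h2 + (1 - x) * lhv_ch4)
print(round(100 * energy_frac, 1))  # ~7% of the delivered energy is hydrogen
```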

For a colorless, odorless gas, hydrogen certainly has a lot of color in its classification.  Normal hydrogen from fossil fuel sources is considered gray.  Electrolytic hydrogen using renewable electricity would be considered green.  Hydrogen from natural gas fed SMR would be considered blue, if the CO2 associated were to be sequestered.  Hydrogen may be produced from biogas, much as it is from natural gas.  If the resulting CO2 is firmly sequestered, the hydrogen is considered green. 

Hydrogen will continue to chip away at fossil fuel use.  But a wholesale shift to a hydrogen economy remains aspirational*.  An interesting wild card is the possibility of copious geologically sourced natural hydrogen.  But even this will likely not produce a winning hand.


Vikram Rao

February 9, 2021

*Do you believe in magic by The Lovin’ Spoonful (1965), written by John Sebastian


January 12, 2021

January 6 will go down in infamy for the obvious, an extraordinary and unprecedented assault on the will of the people.  But it was also a failed gambit.  The failure of two other gambits was revealed on that day as well.  Neither was in the same league, and with luck none ever will be.  But they are certainly worthy of note because they are windows into human nature in the election season.  In any other year, one of them would have been headline news.  Until the storming of the Capitol, I used to refer to all that preceded every January certification day as the silly season.  Levity does not wear well with the main event of that day.  But the undercard (minor apologies for the pugilistic analogy) events can tolerate a lighter touch.

The New York Times headline on the front page of the January 7 issue assigns culpability for fomenting the mob.  Here we do not do that, sticking with a long-standing practice of this blog site to attempt to steer clear of positions in political matters.  We merely discuss the gambit, no matter who instigated it, which appears to have been storming the Capitol with intent to somehow overturn the results of the presidential election.  Whether this was to be with the simple imposition of will or the actual destruction of the ballot boxes containing the electoral college votes, we may never discern.  This had all the hallmarks of rabble rousing, as opposed to a carefully planned and executed campaign.  Having said that, history is pocked with instances of rabble rousing that took on lives of their own, with more lasting effects.

The other gambit was played in Georgia in early November and the result was revealed on January 6.  This was a decision to hang onto President Trump’s coattails despite his having lost the election in that state.  Both Senate incumbents decided to do just that, although Kelly Loeffler was more strident, pointing out her perfect record of voting with the President.  Many others across the country had ridden that strategy to victory.  In fact, in something of an anomaly, the coattail grabbers performed better than the coat wearer.  But neither Loeffler nor Perdue reckoned with the President litigating (both literally and figuratively) his election loss, basically asserting that the outcome could be reversed.  This took away the best argument for voting for Loeffler and Perdue, which was to preserve the Republican majority in a Senate with Biden as president, an essential means for tempering the new president’s agenda.  As the weeks wore on, he attacked the governor and secretary of state of Georgia, both Republican, for somehow being responsible for his loss.  The final straw was when he demanded support from both incumbent senators for a challenge of the electoral college votes in the January 6 certification in the joint houses of Congress.  With the die already largely cast, they went along with the political version of a Hail Mary (touchdown) pass.  Even that characterization is charitable given any reasonable interpretation of the Constitution.  When the dust settled and the Georgia votes were counted on January 6, the same fateful day of the mob attack on the Capitol, both incumbents had been defeated in what most considered an upset verdict.  The margins of victory were even wider than the 0.5% that could have enabled a recount.

The third gambit pales in comparison and deserves discussion principally because of the temporal coincidence of its playing out on January 6.  But the gambit was conceived and announced in early 2017, shortly after Trump was elected president.  It was also shortly after President Obama, literally on his way out the door, issued an order to “permanently” ban offshore Arctic oil and gas lease sales.  My opinion of that play is in a 2017 blog post.  For President Trump this was an attempt at showing support for the oil and gas industry.  The only problem was that the smart money has known, at least since the plummet in oil prices in 2015, that the Arctic was too risky on pure economics, leaving aside the environmental impact.  I opined a few months ago that interest in the leases would be very low.  But the gambit had to be played out.  Possibly nobody told the President that this could prove embarrassing; is it still a party if (almost) nobody comes?

And it was embarrassing indeed when the sealed bids were opened on that otherwise fateful January 6.  Only 11 of the 22 tracts offered drew bids.  None of the bids came from supermajors, majors, or large independents, and all were at or near the minimum allowed in the auction.  Nine of the 11 were won by a division of the state of Alaska, not an oil drilling and production entity.  The other two went to very small independent oil companies.  All this in the name of supporting the oil and gas industry.  If only the industry had been consulted.

Vikram Rao

January 12, 2021


December 6, 2020

Outgoing Presidents are making a habit of throwing long passes with the Arctic football.  And like all long passes, the probability of being caught is low.  But they differ in character.  When President Obama issued an executive order about this time in 2016, it was to “permanently” ban future lease sales in US Arctic waters.  At the time I wrote that this was purely symbolic, with no real impact, because even if leases were offered for sale, this would be a party to which hardly anybody would come.  When President Trump assumed office, one of his first declarations was the intent to sell leases in the Arctic National Wildlife Refuge (ANWR).  While this was not a challenge to the Obama executive order per se, ANWR being on a coastal plain and not offshore, it too was symbolic, as demonstrated by the fact that nothing happened for the rest of his term.  Until now.

The Bureau of Land Management (BLM) is rushing through the process of posting tracts for leasing*.  The timeline it is following is much shorter than normal, reportedly to receive the bids prior to the presidential handover.  A rushed process will not properly match up tracts with buyer priorities.  This alone does not augur well for a heavily subscribed sale.  But the primary reason for tepid interest is the market.  Oil prices are low and uncertain.  Gas prices are low and certain.  This must be about oil alone.  Anybody buying ANWR leases in gas-prone areas would have a whole lot of explaining to do.

The 2017 Bureau of Ocean Energy Management lease sale in the Gulf of Mexico was strong.  Oil prices were low then and nearly as uncertain as now, not counting the Covid-19 induced depression.  An important measure of oil company stock value is reserves replacement.  Loosely, it is the ratio of new reserves booked to oil produced.  So, companies are always looking to find new reserves.  But the Gulf of Mexico is well understood, with considerable seismic data and production experience.  In comparison, the Arctic has a paucity of seismic data.  And the results to date have been daunting.  Shell walked away from the Arctic after spending a reported USD 7 billion, with not much to show for it.  Admittedly, that was offshore in the Chukchi Sea, and ANWR would be different.  But the hurdles will remain numerous.  Environmental sensitivity and wells made expensive in part by the limited window of time for operations are just two.  Financing will be hard to come by and legal challenges are almost certain to at least delay operations.  This leaves just the very large oil companies with strong balance sheets and equally strong stomachs.  The European-headquartered ones, especially BP and Royal Dutch Shell, are very unlikely to plan an Arctic venture.
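The reserves replacement measure described loosely above can be written down directly; the figures in the example are hypothetical, for illustration only:

```python
def reserves_replacement_ratio(new_reserves_booked_bbl, oil_produced_bbl):
    # A ratio above 1.0 means the company booked more reserves than it
    # produced that year; sustained values below 1.0 drive the hunt for
    # new acreage described in the text.
    return new_reserves_booked_bbl / oil_produced_bbl

print(reserves_replacement_ratio(1.2e9, 1.0e9))   # 1.2 -> reserves growing
```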

The need to increase reserves could be addressed in an unconventional manner: increased focus on tertiary recovery of oil.  Simply put, conventional means usually extract only about a third of the oil in place in the reservoir.  Tertiary recovery utilizes CO2 to wring more out.  The gas is introduced into the formation in a supercritical state.  Supercritical CO2 enters pores effectively as if in a gas state but reacts with the oil as if in a liquid state.  Being twice as soluble in oil as in water, it preferentially combines with the oil.  This mixture is pushed to the producing well by the high pressure at the injection well.  The CO2 is separated and re-injected.  This process is repeated, and each time more of the CO2 remains in the ground.  Eventually, about 95% of the CO2 is retained in the formation, subject to the same trap mechanisms that kept the original hydrocarbons in place.
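CO2's critical point is about 31.1 °C and 73.8 bar; a simple depth check, with assumed geothermal and hydrostatic gradients, shows why injected CO2 reaches the supercritical state in typical oil reservoirs:

```python
T_CRIT_C, P_CRIT_BAR = 31.1, 73.8    # CO2 critical point

def co2_supercritical_at(depth_m, surface_T_C=15.0,
                         geotherm_C_per_km=25.0, hydrostatic_bar_per_m=0.1):
    """True if both temperature and pressure exceed the CO2 critical point.
    The gradients are assumed typical values, not field data."""
    T = surface_T_C + geotherm_C_per_km * depth_m / 1000.0
    P = hydrostatic_bar_per_m * depth_m
    return T > T_CRIT_C and P > P_CRIT_BAR

print(co2_supercritical_at(1500))   # True: typical reservoir depths qualify
print(co2_supercritical_at(500))    # False: too cool and too low a pressure
```

Under these assumed gradients the crossover comes at roughly 800 m, shallower than most producing oil reservoirs.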

Some estimates hold that by this method 50 billion barrels could be recoverable in the US at a USD 70 price per barrel of oil.  By comparison, ANWR is estimated to hold 15 billion barrels, and I doubt those are economical at a USD 70 oil price.  If the CO2 were priced at USD 50 per tonne, the amount needed would cost roughly USD 11 per barrel of oil recovered.  Since the finding, development, and much of the lifting cost has already been incurred, this ought to be affordable at USD 70 oil.  Part of the current 45Q Federal Tax Credit, which is USD 35 per tonne of CO2 for such sequestration, would also be available.  Since this is paid to the CO2 producer, that producer would have USD 85 per tonne available for capture: the credit plus the USD 50 paid by the oil company.  This is right in the range of feasibility of the latest capture technology.
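The arithmetic in the paragraph above checks out directly; all inputs are the figures stated in the text:

```python
co2_price = 50.0          # USD per tonne, paid by the oil producer
cost_per_bbl = 11.0       # USD per barrel of oil recovered, as stated
tonnes_co2_per_bbl = cost_per_bbl / co2_price
print(round(tonnes_co2_per_bbl, 2))    # ~0.22 tonne CO2 implied per barrel

tax_credit_45q = 35.0     # USD per tonne, paid to the CO2 capturer
print(co2_price + tax_credit_45q)      # 85.0 USD per tonne toward capture
```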

A recent life cycle study concludes that a barrel of oil from tertiary recovery represents a 37% CO2 emissions reduction compared to oil from conventional production (see reference below), provided the CO2 is sourced industrially.  Arctic operations will, of course, have an additional layer of environmental risk associated with their location. 

In some senses, here then is the choice for oil company strategists.  On the one hand ANWR, costly and environmentally risky due to the location.  Data paucity adds a layer of exploration and development risk.  Uncertainty in the price of oil renders profitability in doubt.  On the other hand, tertiary recovery using industrially sourced CO2.  Negligible operational risk and you already own the lease.  At scale it makes a serious dent in industrial CO2 emissions, the marginal costs are low per barrel of oil produced, and the surface environmental footprint is much smaller compared to new areas explored and produced. 

Reason dictates that the Arctic ought to remain icy cold.

*Dream on in “Dream On” by Aerosmith (1973) written by Steven Tyler

Vikram Rao

December 6, 2020

Núñez-López V and Moskal E (2019) Potential of CO2-EOR for Near-Term Decarbonization. Front. Clim. 1:5. doi: 10.3389/fclim.2019.00005  


November 30, 2020

A recent New York Times story discusses an EPA report of diesel pickup truck owners disabling emission controls on their vehicles.  The scale of the deception is reported as being the emissions equivalent of putting nine million extra trucks on the road.  There appears to be a cottage industry of small companies devising schemes to manipulate the emission control systems.  The term cottage industry likely does not do it justice when one realizes that, according to the EPA, since 2009 more than half a million trucks have been involved.  Yet each of the providers is usually a small player.  In 2019 the EPA acted against 48 companies, bringing the decadal total to 248.

The motivation for these actions is not to despoil the environment.  It is to improve fuel efficiency and/or engine performance such as (higher) torque.  Diesel vehicles are singularly (dis)credited with high production of NOx and particulate matter (compared to gasoline vehicles).  In the case of particulate matter (PM), this comparison is strictly valid only when the measure is micrograms per cubic meter and not particle count.  But that is a hair to be split another time.  Mass based measurement is the only regulatory approach at present.

PM from diesel engines is captured in Diesel Particulate Filters (DPF).  After a couple of hundred kilometers of operation, the filter starts clogging.  This causes back pressure on the engine and reduces engine performance.  Engine manufacturers set back pressure limits at which the filter must be regenerated.  This is accomplished by running the engine hot, which produces more NO2, which in turn catalytically oxidizes the carbon deposits.  But if the engine duty cycle simply does not achieve these temperatures, the filter will remain clogged.  Eventually it will cease to function and will need to be replaced.  This is a clear case where circumventing the DPF benefits the consumer in performance and in avoiding the cost of premature replacement.  Enter the small player, who replaces the DPF with a straight pipe for a fee.

NOx is short for some combination of NO2 and NO.  It is harmful to human health directly, and indirectly through ozone formation.  Ozone is formed when NOx reacts with organic molecules and sunlight in the atmosphere.  Ozone, often erroneously equated to smog, is a component of smog and injurious to health, with children especially vulnerable.  NOx is produced when nitrogen in the combustion air is oxidized.  Higher engine temperatures create more NOx.  The high temperatures needed to regenerate DPFs have this undesirable effect.  This is an example of the balancing act required in emission controls.  One remedy for this particular trade-off, present in some automobiles, is an auxiliary heating element in the DPF, which fires up at appropriate intervals.

NOx control is addressed in one of two ways.  The more foolproof one is Selective Catalytic Reduction (SCR), in which urea is injected into the exhaust.  Urea decomposes into ammonia (in two possible steps), which catalytically reacts with NOx to reduce it to nitrogen.  Engine performance is unaffected.  But the urea canister must be replaced periodically, which consumers see as a nuisance.  The system occupies space, so this tends to be in bigger engines (over 3 L displacement).

The alternative, more targeted to smaller engines, is the Lean NOx Trap.  Here, the NOx is captured by adsorption onto a surface.  Desorption (removal from the surface) is achieved by chemical reduction by exhaust gas components such as uncombusted diesel, carbon monoxide and hydrogen.  In an efficiently running engine, these will, by design, be in short supply.  Accordingly, the control system deliberately causes the engine to run rich (oxygen starved) to produce these reducing gases.  While this achieves the purpose of reactivating the NOx capture, during these intervals, the engine runs with lower power and is fuel inefficient.  The fuel efficiency reduction overall is in the vicinity of 3 – 5 %.  As a frame of reference, this is the reduction gasoline engines see by the addition of 10 – 15% ethanol in the gasoline, and there is no consumer push back, perhaps because it is not accompanied by a performance reduction.  Yet, in diesel pickup trucks, it appears some combination of that and loss of torque is responsible for the public buying avoidance schemes.
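The ethanol comparison above holds up on energy content alone: ethanol carries about two-thirds of gasoline's energy per litre, so blending dilutes the fuel.  A sketch with approximate heating values (my assumptions, not figures from the text):

```python
lhv_gasoline, lhv_ethanol = 32.0, 21.1   # MJ per litre, approximate LHVs
for pct in (10, 15):
    f = pct / 100.0
    lhv_blend = (1 - f) * lhv_gasoline + f * lhv_ethanol
    loss_pct = 100.0 * (1 - lhv_blend / lhv_gasoline)
    print(pct, round(loss_pct, 1))   # ~3.4% loss at E10, ~5.1% at E15
```

The computed 3–5% energy dilution brackets the same range as the Lean NOx Trap fuel penalty, which is the point of the comparison.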

Avoidance schemes are known in the industry as “defeat devices”.  The practice is so rife, it has a name.  Vehicle makers have in the past cut these corners, the most infamous being the VW cheating episode.  But this piece is about aftermarket interventions, which are harder to corral, in part due to the sheer number of companies involved.  This proliferation is due to the ease of devising the defeat devices.  While every state has somewhat different methods for emissions testing, a common one is to simply query the onboard computer for clues on functioning of the emissions system.  Altering the onboard programs is enough to produce the avoidance.  In other cases, the emissions are tested directly while on a chassis dynamometer (think stationary bike).  With such testing, the avoidance must be more sophisticated, as it was in the VW case. Then there is the simple substitution of the SCR device with a tube.  A mechanic with a reasonable modicum of competence could execute this.

The defeat devices cottage industry would not exist were there not a market for the product.  The pickup truck owners simply want the performance* and may well consider the infraction to be minor.  Curiously, they would not be too wrong on the DPF avoidance.  The regeneration step described above is combustion of the filter cake and does produce carbon particles.  However, the general public is unlikely to be aware of the intricacies of operation of the DPF and the impressions are probably grounded in the beliefs of the truck owner community. 

Although more than likely aware of the illegality, truck owners probably also reckon that individuals are never targeted by the federal government.  In many ways, this is more pain-free than a speeding infraction.

*I can’t get no satisfaction from “(I can’t get no) Satisfaction” by the Rolling Stones (1965), written by M. Jagger and K. Richards

Vikram Rao

November 30, 2020

You are currently browsing the Uncategorized category at Research Triangle Energy Consortium.