January 2, 2019

A recent story notes that natural gas drilling in 2018 has dropped by 87.7% from its 2008 peak.  Over the same period, natural gas production has increased by 58%.  Natural gas drilling is down to a whimper, but natural gas production continues to grow, year on year.  Had the gas production been from conventional offshore reservoirs, one could have hypothesized that a few large gas fields dominated production despite fewer wells being drilled.  But most of the drilling for natural gas in this decade has been in shale, where individual wells do not produce high volumes, but each is relatively inexpensive.  Before we launch into the explanation of this seeming anomaly, consider the impact of the result.

Natural gas production, largely from shale, was arguably the single biggest reason for lifting the US out of the last recession.  In the decade prior to the recession, US gas prices had fluctuated wildly from USD 2 per million BTU (MM BTU) to as much as USD 15 per MM BTU.  Nothing dampens the spirit of investors in capital-driven industries more than unpredictability in the price of the key raw material.  Consequently, major industries, methanol producers for one, fled to countries with sustained low gas prices, such as Trinidad.  When shale gas went on the market in high volume, prices dropped, and stayed low, in the vicinity of USD 3 per MM BTU.  With predictions of sustained low prices, predictions which have held up now eight years later, industry returned to the US.  Liquefied natural gas (LNG) imports were no longer necessary, and shortly thereafter, the US became an exporter of LNG.  For every citizen in the US, a lower fraction (sizable for many) of take-home pay went towards transportation and home heating and cooling.  The savings were spent on goods and services.  The recession was in retreat.

Shale oil picked up and became a major force by about 2013.  In 2015, the high production halved the world oil price and OPEC was marginalized.  The low oil price, together with the low natural gas price, contributed to the economic gains and a record stock market.  But gas prices stayed low despite the steep reduction in gas-directed drilling, because gas supply continued to be high.  Curiously, and seemingly paradoxically, the reason is the steeply increasing oil production over the decade.  Over roughly the same period as the decline in gas drilling, oil production increased from 5.0 MM bpd in 2008 to 11.6 MM bpd in 2018.  Now for the explanation as to why that caused gas production to rise.

Crude oil comprises a mixture of molecules, with the bulk of them conforming to the formula CnH2n+2, where n is an integer.  Oil molecules break down over time in the host environment of high pressure and temperature.  The most thermally mature state is methane, with n=1.  Ethane, propane and butane, with n=2, 3 and 4, respectively, are progressively less mature.  Shale oil is very light, as defined by API gravity.  Accordingly, the n’s are low numbers relative to heavier oils.  One could reasonably expect shale oil to be associated with some molecules at higher thermal maturities.  This is known as associated gas, and usually comprises methane in the main, together with the somewhat larger molecules with n=2 to 4 and more.  Canadian heavy oil, on the other hand, could be expected to have little or no associated gas.  More shale oil production automatically means more shale gas production.
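As a quick illustration of the homologous series just described, this short sketch (plain Python, function and variable names my own) generates the formulas of the lightest alkanes from CnH2n+2:

```python
# Minimal sketch of the alkane series C_nH_(2n+2): the lower the n,
# the lighter and more thermally mature the molecule, with methane (n=1)
# being the most mature end state.
def alkane_formula(n):
    """Return the molecular formula of the alkane with n carbon atoms."""
    return f"C{n}H{2 * n + 2}"

names = {1: "methane", 2: "ethane", 3: "propane", 4: "butane"}
for n, name in names.items():
    print(f"n={n}: {name} = {alkane_formula(n)}")
```

Running it lists methane through butane, the molecules that make up the bulk of associated gas.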

Recent data from the Permian, the hottest oil play in the US today, indicates that every MM bpd of oil would have associated with it 2.2 billion cubic feet per day (bcfd) of gas.  If this statistic is taken to apply to all shale oil, as a first approximation, one would expect gas production to be 14.5 bcfd greater in 2018 than in 2008, from this source alone.  That translates into 5.3 tcf per year.  With no let-up in shale oil production in sight, natural gas will continue to be produced.  Expect, therefore, natural gas prices to remain at low to moderate levels, and a boon to the economy.  Shale gas drilling is, metaphorically speaking, dead, or at least a shadow of its former self.  But natural gas remains the reigning monarch in assuring a healthy economy.
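The arithmetic behind that estimate can be checked in a few lines, a back-of-envelope sketch using only the figures quoted above (with the Permian ratio of 2.2 bcfd per MM bpd assumed, as a first approximation, to apply to all shale oil):

```python
# Back-of-envelope check of the associated-gas estimate.
oil_2008 = 5.0          # US oil production, MM bpd, 2008
oil_2018 = 11.6         # US oil production, MM bpd, 2018
gas_per_mm_bpd = 2.2    # bcfd of associated gas per MM bpd of oil (Permian figure)

oil_growth = oil_2018 - oil_2008                      # 6.6 MM bpd of new oil
extra_gas_bcfd = oil_growth * gas_per_mm_bpd          # ~14.5 bcfd of associated gas
extra_gas_tcf_per_year = extra_gas_bcfd * 365 / 1000  # ~5.3 tcf per year

print(f"{extra_gas_bcfd:.1f} bcfd, about {extra_gas_tcf_per_year:.1f} tcf per year")
```

This reproduces the 14.5 bcfd and 5.3 tcf per year figures in the text.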

Vikram Rao


August 21, 2018

The US administration appears to have fired a salvo against carbon mitigation.  We will examine the facts and muse on the likely true impact of the government action.  A recent story discusses the implications of a directive seemingly buried in a memorandum on fuel economy standards from a month ago.  I am unable to find that memorandum, but earlier documents for public comment are clear on a couple of measures, which are quoted in the linked story.  These include freezing fuel economy standards at planned 2020 levels.

Ironically, this comes at a time when unprecedented forest fires rage worldwide.  Since this was more or less predicted by earlier temperature rise models, the increased frequency of fires is believed to have anthropogenic origins, as noted in a recent PNAS paper.  The impact of forest fires is profound.  The economic privation is high, as is the long-term impact on health.


Health impact of emissions from combustion

Even climate change deniers must accept the epidemiologically supported finding that airborne particulate matter (PM) is responsible for 6.5 million premature deaths annually, worldwide.  Nearly two thirds of the mortality figure is attributed to wood burning for cooking and heating.  Forest fires are country cousins of wood burning, in terms of the type of particles emitted.  They are spectacular and frightening, but in the overall scheme, currently account for minor contributions to airborne particulates.  But, to the extent they are driven by temperature rise, this contribution can only increase, even as other anthropogenic PM diminishes due to interventions.


Over 80% of airborne particulates are sourced from either the creation or use of energy.  On the consumption side, the biggest contributor in US and European urban communities is automotive exhaust.  While diesel gets all the press, gasoline is also a contributor, particularly of ultrafine particles, which are especially toxic.  Both also produce NOx and organics, which are responsible for atmospheric reactions with the particles, often involving ozone and mediated by photochemical action, rendering them more toxic.  There should be little dispute that reducing these emissions ought to be a federal objective.  Two measures can accomplish that: engineered mitigation of emissions, such as better diesel particulate filters, or decreased use of fuel.  The objective of using less fuel, while obtaining the same utility, can only be attained with better engine efficiency.

Implications of the new guidelines

One of the directives is the relaxing of fuel economy standards, by holding them constant at 2020 levels.  This runs counter to the point made above regarding means to reduce the impact of airborne particulate matter.  Since no companion guidelines are provided regarding PM capture, on these grounds alone the new guidelines are unfriendly to the goal of reduced particulate emissions.  Admittedly, mortality figures associated with PM in the US are small compared to the world figure noted above.  A recent paper estimates the toll at 138,000 annually in the early 21st century, about 5.1% of total deaths.  That is more than double the mortality attributed to influenza and pneumonia, for which serious intervention measures exist.

In the linked story, the memorandum is quoted as stating that growth of natural gas and other alternatives to petroleum have reduced the need for imported oil, which “in turn affects the need of the nation to conserve energy.”  The gist of the story is that the administration believes that conserving oil is no longer in the national interest.  Without doubt the proliferation of cheap shale gas has allowed many commodities to be made profitably from natural gas, instead of from oil.  Since the US is a net importer of oil, such use of gas does reduce oil import.

However, due to shale production the US is already the leading producer of oil and gas.  Nevertheless, it still imports oil, while also exporting both fluids.  A lot of this trade is with Canada and Mexico.  I predict that within five years North America will be self-sufficient in oil and gas.  Accordingly, from the standpoint of national energy security, conservation is not needed (provided we don’t emasculate NAFTA in the energy sector).  But the plea in this blog is that it is needed to preserve the health of the citizenry.  And it is not conservation, per se, that we seek.  Simply, the more prudent use of energy for no less gratification.  Drive the same miles, stay just as warm (or cool), but do it more efficiently.

Vikram Rao




August 3, 2018

In matters concerning airborne particulates, size certainly matters.  But in this realm smaller is more powerful, not necessarily in a good way.  In fact, decidedly not in a good way when it comes to toxicity.  In every step along the way from inception of the particle nucleus in the combustion process, to growth and transport, and finally to action on human organs, size plays a critical part.  And the smallest particles, less than 0.1 µm in size, are the most effective in each of those reactions.  The somewhat larger particles, up to an order of magnitude larger, are no slouches either, when it comes to toxicity to humans.  But the nanoparticles are the star act in a dark play.

Over 80% of anthropogenic airborne particulate matter (PM) results from either the production or use of energy.  The three principal players are coal combustion, automobiles and wood burning for cooking and heat.  The mechanism of PM production is common to all three.  It begins with the formation of a nucleus, followed by growth and completed by reactions with atmospheric constituents such as ozone, and mediated by photochemical action.

Particulate matter can range in size from a few nanometers to a hundred micrometers.  That covers four orders of magnitude.  The range is another few orders of magnitude greater when one considers surface area of the particles, instead of diameter.  Surface area is the key parameter in the chemical reactions affecting particles.  In chemical processes, for example, the best catalysts have high surface area to volume ratios.  Not surprisingly, size does matter in the impact on human endeavor.  Except, in the PM world, the littlest guys kick serious sand into the eyes of the large ones.
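To put numbers on that scaling, here is a small sketch (variable names my own) comparing the surface-to-volume ratio of an ultrafine particle with that of a coarse one, approximating both as spheres:

```python
import math

# For a sphere of diameter d: surface area = pi * d^2, volume = pi * d^3 / 6,
# so the surface-to-volume ratio is 6/d -- it grows as the particle shrinks.
def sa_to_vol(d_um):
    """Surface-area-to-volume ratio (per micrometer) of a sphere of diameter d_um."""
    area = math.pi * d_um ** 2
    volume = math.pi * d_um ** 3 / 6.0
    return area / volume  # algebraically equal to 6 / d_um

# A 10 nm (0.01 um) ultrafine particle versus a 10 um coarse particle:
ratio = sa_to_vol(0.01) / sa_to_vol(10.0)
print(round(ratio))  # 1000: three orders of magnitude more surface per unit volume
```

Three extra orders of magnitude in relative surface area is why the littlest particles dominate the surface chemistry.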

Figure: Diesel exhaust particle number and mass distributions (source retrieved July 31, 2018)

In considering the impact of ultrafine particles, one first needs to examine their prevalence.  The figure shows a typical distribution of particles in diesel exhaust.  On a mass basis, the majority are in the fine range (0.1 < x < 2.5 µm).  But, on a particle count basis, the majority are in the ultrafine range (x < 0.1 µm).  The mass associated with the bulk of the particles is a small fraction of the total mass emitted.  This is understandable, because it takes thousands of ultrafine particles to equal the mass of a single fine particle.  But, it calls into question regulations using mass as the variable.
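The "thousands of ultrafine particles per fine particle" point follows directly from mass scaling with the cube of diameter, as this illustrative calculation shows (equal particle density assumed):

```python
# At equal density, a particle's mass scales with the cube of its diameter.
# Compare the upper bounds of the fine and ultrafine ranges cited in the text:
d_fine = 2.5       # um, upper bound of the fine range
d_ultrafine = 0.1  # um, upper bound of the ultrafine range

# Number of ultrafine particles whose combined mass equals one fine particle:
count = (d_fine / d_ultrafine) ** 3
print(count)  # 15625.0 -- thousands of ultrafine particles per fine particle
```

So a mass-based yardstick can look clean while the ultrafine particle count remains enormous.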

This might appear to be splitting the proverbial hair.  But recognize that amelioration schemes rely on the regulatory variable.  With mass as the yardstick, a low mass reading could simply be the result of removal of the coarse and fine particles.  This would leave the bulk of the particle count, the ultrafines, still in the air, and if they are in fact the true bad actors, not much will have been achieved with the filtering.  All the foregoing argument applies only to health effects.  Visibility and climate change related parameters, such as radiative forcing, may be affected by the larger size PM.  Certainly, the recently reported reduction in solar panel efficiency by PM will be more impacted by the larger particles, because the ultrafine PM is less likely to settle onto the panels.

Since mass-related regulation may continue to have utility, it may be prudent to consider an additional parameter.  Two candidates are surface area and particle count.  Surface area to volume ratio increases dramatically with reduced size.  In a sense, just a particle count by size will do the job.  But, in health-related outcomes, both size and surface area are separately in play.  Size determines the degree to which the particles enter and impact various organ systems.  Surface area likely matters mostly because high surface area particles are scavengers for toxic substances such as volatile organic compounds.  They are also more likely to have highly toxic reactive oxygen species on their surface.  From a public understanding and acceptance standpoint, the particle count may be simpler to communicate.  Multiple measuring devices are currently available to perform the count.

In summary, ultrafine particles play exceptional roles in the health impact of particulate matter.  Regulatory focus ought to shift to address this fact, to better inform intervention schemes.

Vikram Rao

*with apologies to Oscar Wilde


November 6, 2017

Energy resiliency, especially in relatively isolated communities, can largely be achieved by local production and distribution.  In the mid to low latitudes, solar intensity will favor solar electricity.  And it is getting cheaper by the day.  A recent winning tender for utility-scale solar in India came in at around 3.6 cents per kWh.  That is cheaper than many coal plants, certainly any newly constructed ones.

To really take advantage of solar electricity, note that the electricity is output as DC.  Converting to AC, transmitting, and then reconverting to DC at each device, such as LED lights, computers and cell phones, is wasteful.  Furthermore, useful equipment, such as fans and compressors, uses less electricity for the same output when running on DC.  A DC powered “brushless motor” fan consumes between 40% and 70% less energy than one running on AC.  Compressors are the workhorse of two other common household appliances: refrigerators and air conditioners.  However, these last two are currently not mass manufactured in DC mode; they ought to be.  Curiously, the latest refrigerators do use DC in the critical components, even though the input power is AC.  DC fans are well on their way in India; a trade partnership could have them delivered here.  For rural communities, DC powered well pumps exist, with dozens of manufacturers in India.
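The waste in the double conversion can be illustrated with a short sketch.  The efficiency figures below are assumed, purely illustrative values (real figures vary by device); the point is that losses compound multiplicatively:

```python
# Illustrative only: assumed per-stage conversion efficiencies, not measured values.
inverter_eff = 0.90  # solar DC converted to household AC (assumption)
adapter_eff = 0.88   # AC converted back to DC at each device (assumption)

# Fraction of solar output that reaches a DC device after the AC round trip:
delivered = inverter_eff * adapter_eff
print(f"{delivered:.1%} delivered via the DC-AC-DC round trip")
```

Under these assumptions roughly a fifth of the solar output is lost before it reaches a DC device, a loss a DC microgrid avoids entirely.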

A possible architecture in a community could have the following features:

  • Small solar farms attached to each development, commercial or residential. Since people love trees around their homes (especially in low to mid latitudes), and solar panels prefer the absence thereof, rooftop solar is contraindicated.  Furthermore, on-ground solar is lower cost to install and maintain, and can take advantage of sun tracking.
  • DC microgrids to conduct the power to the users. For long distance transmission, AC is preferred.  That is pretty much why Edison lost out to Westinghouse about a century ago.  But for the short distance of a microgrid, DC works just fine.  Preferably, the homes and establishments ought to be wired for both DC and AC, as are data centers today.  The DC wiring would feed the current DC devices.  Eventually, homes ought to convert to all DC devices.  In the meantime, the AC portion would be fed from one single DC/AC converter at each home junction box, at relatively high efficiency.  All this is compatible with grid power, which should increasingly be deemphasized.  Again, in this instance we are discussing moderately or totally remote communities.  A military base would qualify as well, for additional reasons of energy security.
  • Community waste to biogas is simple to execute (landfills, animal waste or water treatment plants). The biogas can be used as fuel for many purposes, but also for generators with DC output.

In short, solar electricity, combined with a DC microgrid could serve the purpose of resiliency.  At the same time, the proper use of the attributes of DC power could also cause less energy to be used for the same utility.  This checks both the resiliency and energy efficiency boxes.  Resiliency may be viewed as a measure to adapt to climate change.  This approach, to a degree, simultaneously addresses mitigation.


Vikram Rao


September 8, 2017

A recent story discusses the impact of Hurricane Harvey on the availability of some common plastics.  It points out that the hurricane has shut down production on the Gulf Coast sufficiently to impact availability of these materials well into November.  They refer to derivatives of ethylene, in particular, polyethylene and PVC.

Hurricane Hugo Slams Into Puerto Rico

We have previously discussed in this forum, and in my 2015 book, the concentration of ethylene crackers in the Gulf area.  The main point made then was the distance of the crackers from many of the ethane sources associated with shale gas.  This distance has caused ethane pricing to be extremely low relative to its calorific value.  In the book, I note that LyondellBasell grew substantially because they owned two crackers in the Midwest, and profited handsomely from the low local prices.  More recently, ethane from Texas sources has fed expansion of existing plants near Houston.  These are barely on stream.  Then Harvey hit and shut many of these down.  Incidentally, gasoline and diesel production was also impacted.  This is evidenced by 1970s-style lines (as during the Arab embargo) at gas stations in Dallas.

The impact of Harvey on ethylene production underlines the risk associated with large concentrations of oil and gas refining, or any chemical industry for that matter, in storm prone areas.  Distributed production of fuels and chemicals is a good idea for a variety of reasons.  One is exemplified in the Harvey ethylene and gasoline situation.  Another, more germane, is the location of conversion plants close to the raw material source.  In the limit, pipelines are eliminated.  Today, shale oil from the Permian is being hampered by lack of pipeline capacity.  The spread between WTI and Brent is once again rearing its ugly head.  It was squeezed when oil export was allowed.

The knee-jerk reaction would be to build more pipelines, fast.  The more thoughtful action would be to permit and build small refineries proximal to the production.  Shale oil is light, and mostly sweet (low sulfur).  It can be refined in “simple” refineries; essentially distillation columns.  The complications of cracking are not in play.  Once financed, these can be built in two to three years, not very different from the time scale to enable pipelines.  Fewer pipelines are better for local property owners, and for the environment.  Local jobs will be created, and the prosperity will be distributed.

Shale oil, because it is light, always has associated gas.  Expect a ramp up in gas production, possibly without enough pipeline capacity.  Distributed conversion of this gas into chemicals such as methanol would be an alternative to pipelines.  In some cases, new technology will be required, because small scale production of fuels and chemicals is disadvantaged by absence of economies of scale.  A national network of manufacturing institutes (NNMI), a federal initiative, has one in this space, known as RAPID.  The objective is process intensification, a means by which small scale processes can be economic.

The oil price scenario is playing out now.  Shale oil caused the plummet in oil prices, beginning in late 2014.  That 50% drop has substantially remained, almost three years later, with some ups and downs.  The Saudis gambled on the demise of shale oil if the prices stayed low.  Sure enough, according to the Economist, there were a hundred bankruptcies, and default on USD 70 billion in debt.  But the industry is still alive, and fairly well.  Part of the reason is the entrée of big players such as ExxonMobil and Shell into the Permian.  The other reason is innovation to reduce the breakeven cost of production.  Initially, the cost reduction came from service company discounts and operational efficiencies.  Following a thinning out of service companies, those prices will rise.  The key parameter is cost per barrel.  The improvement can come either in reduced cost or increased production.  Expect the latter to be the main player, through innovations increasing the percentage of oil in place recovered.

My crystal ball says that innovation will reduce breakeven costs below USD 40 per barrel and the industry will thrive.  But oil prices will continue to stay low, in the consumer-friendly range of USD 40 to 65 per barrel.  If all of this comes to pass, expect US oil production to go up by 3 million barrels per day by 2020 or so.  That is a good 30% over current production.  Associated gas will flow as well.  Now is the time to challenge the orthodoxy in fuels and chemicals processing.

Vikram Rao



July 31, 2017

The proposed Atlantic Coast Pipeline (ACP) will be a deterrent to hydraulic fracturing in North Carolina, if completed.  A recent report in the News and Observer discusses the merits of the ACP.  Importantly, inexpensive natural gas from the north will create a disincentive to produce it in our state.

Shale gas has been singularly responsible for pulling this country out of the recession during the early part of this decade.  The fact that our gas costs as little as a third of the price elsewhere in the world has caused a manufacturing renaissance.  That means jobs.  More than half of the over $150 billion of newly invested capital in chemical manufacturing has come from foreign companies, who are unable to compete by producing in their home countries.  However, we here in North Carolina have not seen the impact of that investment.  In large part that is because we lack the natural gas.  We could get it from production in our state, or we could have it piped down from up north.  That second option is where the ACP comes in.  It is certainly the better of the two options.

Could we safely produce shale gas in NC?  In my view, yes, provided our state rules and regulations are followed.  But, ought we to do so?  I think not, based on a couple of factors.  One is that NC deposits are not believed to be highly prospective.  It will be hard to get responsible oil and gas operators interested when better pickings are available elsewhere.  The second is that I expect natural gas prices to remain depressed for several years.  This is bad for producers, but great for consumers.

In the NC portion, the pipeline diameter will be 36 inches, with a capacity of 1.5 billion cubic feet (bcf) per day.  That is roughly how much a single liquefied natural gas (LNG) facility would use.  So, clearly, there is no intent to use this as a lever for an LNG plant for export.  This is all to the good, because I don’t consider east facing coastal LNG exports to be a good bet, especially with low oil prices out several years.  The intended purposes are electricity production, consumer use through current distribution schemes, and other uses not yet planned.  This last is enabled by the fact that this is an “open access” pipeline.  The unspoken-for capacity is another criticism leveled by some.  On the other hand, it represents an opportunity.

I believe that the state ought to attract capital to build chemical plants using this gas.  A good candidate would be ammonia plants.  We still import (from other states and abroad) all our ammonia fertilizer.  Together with our world class phosphate mine in Aurora, ammonium phosphate production and export could be feasible.  Chemical plants of this sort principally employ two-year degree personnel.  Our nationally acclaimed community college system could feed into that.  And skilled jobs in the eastern part of the state would be welcomed.

The referenced N&O story mentions some push-back.  One critic claims that Marcellus gas is depleting, implying that in a few years the ACP will run below capacity and not meet our needs.  The Marcellus is one of the largest gas fields in the world, and these are early days in the exploitation.  Furthermore, the Utica field may be larger yet, and the access is straightforward because it is directly underlying the Marcellus.  There is considerable clamor to permit more LNG plants for gas export.  Nobody is suggesting there will not be enough gas.  As noted above, each of these LNG plants uses about the same amount of gas as the ACP.  Which would you rather have, gas exports creating jobs elsewhere, or an Atlantic Coast Pipeline creating jobs in North Carolina?

Finally, for those folks who think hydraulic fracturing is a disease we ought not to contract in North Carolina, the ACP is the perfect inoculation.

Vikram Rao


June 27, 2017

In a recent story, Secretary Perry offered up the national imperative: “pave the path toward U.S. energy dominance”.  President Trump has also used the rhetoric of dominance.  In most settings, the word stands for a position of control.  In energy, it could mean control of price and unfettered availability for domestic consumption.  Perry especially cited oil, gas and coal, so we will restrict our discussion to the first two and consider where matters stand today, to appreciate possible new directions by the administration to achieve these goals.

First some editorial comment.  Dominance is rarely desirable.  The folks dominated hate you and will extract their pound of flesh somewhere.  Energy independence is the next one down the line.  This too is not desirable, because interdependence, especially with friendlies, is of value.  Besides, as discussed below in the case of oil, it is more economically favorable than independence.  Finally, there is energy security.  This one is a solid yes.  It translates into cost effective energy available when needed by the nation.

Shale gas has created a situation for the US that is not far from dominance, but not by design.  The abundance, enabled by the technology of hydraulic fracturing, has caused the price in the US to be under USD 3 per MM BTU for years, and likely to remain under USD 5 several years out.  The US is now an exporter of Liquefied Natural Gas (LNG) rather than an importer.  This has led to a worldwide drop in gas price.  A price controlling cartel led by Russia failed to materialize solely due to the US shale gas phenomenon.  The US price continues to be half to a third of most places in the world.  Domestic industries relying on natural gas are having a renaissance, compared to their competitors abroad.  Effectively, US shale gas is controlling the world price, despite gas being a regional commodity.  This walks and talks like dominance, albeit unintentional.

Prior to 2015, oil price was controlled by the OPEC cartel to be in the vicinity of USD 100 per barrel.  The influx of shale oil caused the price of oil to plummet to about USD 50 within six months, beginning in late 2014.  It has stayed down there since.  The Saudis are credited with playing the high stakes gambit of not cutting production to prop up the price, with the intent of mortally wounding shale oil in the US.  A hundred bankruptcies and default on USD 70 billion in debt notwithstanding, the industry is strong as ever.  The resiliency was in part due to technical and operational innovation, and in part due to the presence of ready buyers.  ExxonMobil, Shell, Chevron and ConocoPhillips have firm new footprints with an avowed intent of major investment, at the expense of costly forays, such as into the Arctic.  A parenthetical point here is that the Trump reversal of the Obama era Arctic freeze (on new lease sales) has no net effect; the investment will not go there.

Shale oil can turn on (or off) a dime.  A new well takes weeks to come on line, as opposed to years on offshore platforms.  This response time allows shale oil to ride the waves of price fluctuation.  Continued innovation will drop the breakeven well below USD 40 per barrel, and is already there for many prospects.  Expect oil price to remain in the range USD 40 to 65 for years, with minor excursions.  I predicted this very range in my 2015 book, and nothing has changed.  We can safely conclude that shale oil is keeping the price of oil down in the range mentioned, despite OPEC desires.  In effect, therefore, US shale oil is now the determinant of the world oil price.  Not exactly dominance, but certainly a high degree of control.

Where does that leave this administration with respect to assuring “dominance”?  Just don’t mess up a good thing.  President Obama removed the oil export ban in December, 2015; this had been long overdue.  This was crucial for the US energy economy because US light sweet shale oil was not best suited for US refineries.  The discounted heavy oil from Canada, Mexico and Venezuela was preferred.  The lift of the ban allowed export of shale oil, thus removing a domestic glut, and an associated oil price discount of WTI versus Brent.  The steady state situation of high price shale oil export and low cost heavy oil import is a net positive for the economy.  Buy low, sell high.

More recently, gas pipelines to Mexico are being augmented.  This is a welcome development for the natural gas industry, still shackled by extremely low prices.  Mexico, in turn, can key on oil, both offshore and onshore, rather than shale gas.  Their conventional oil is increasingly getting heavier, and the US market is important.  In return they are buying light shale oil from the US.  All of this is important for the health of US energy production and US jobs.  This administration ought to avoid doing anything to upset this relationship, such as rhetoric on the inadequacies of NAFTA.  Pipelines are long term capital items.  Capital investment is dissuaded by uncertainty.

With the unwitting control of price on both oil and natural gas, some may see us in a dominant position.  But, more importantly, we can look forward to a North American self-sufficiency on oil and gas by about 2020.  That assures energy security and should be the modified fossil energy goal for this administration, rather than dominance.

Vikram Rao