WHY COVID-19 VACCINES ARE LIKELY TO SUCCEED

November 19, 2020

Less than a week apart, two leading Covid-19 vaccines were announced as being more than 90% effective (federal approval requires only greater than 50%).  These numbers augur well for broad-scale combat of the virus.  Does this mean that a slew of other vaccines in development are also likely to emulate this success?  Not necessarily; it depends on the approaches the others take. These two use a relatively new approach which took less than a year from inception to the seeming conclusion of a Phase 3 trial, albeit one limited in size; unheard-of speed in the firmament of vaccines.

Conventional vaccines introduce inactivated pathogens which trigger an immune response.  Being inactivated by heat or chemical means, they cannot cause the disease. But getting them to a safe yet effective formulation takes time. The immune system retains the memory of this invasion (acquired immunity) and, when a true disease pathogen is detected, activates the defense mechanism.  Acquired immunity is retained for some period. This is also the reason that acquiring the disease can provide immunity for years in many cases.  Edward Jenner, the English physician responsible for smallpox vaccination, is credited with coining the term vaccination, derived from the Latin vacca for cow and vaccinia for cowpox. Famously, infection with the relatively benign cowpox conferred immunity from the deadly smallpox disease.

SARS-CoV-2, the virus that causes the Covid-19 disease, was sequenced in early January 2020 by Chinese doctors.  The sequence was made available worldwide.  The virus is about 120 nanometers in size and has an external lipid layer.  Thrusting through the lipid layer are spikes, known as the spike protein (see image).

SARS-CoV-2 transmission electron microscopy image, courtesy NIAID-RML

This protein was an obvious target for the vaccine.  In a conventional vaccine it would be only one of many proteins eliciting antibodies.  To focus on just the spike protein, investigators turned to a messenger RNA (mRNA) approach.  As the name implies, mRNA conveys the genetic message to the cell to direct synthesis of a specific protein.  This specificity likely minimizes the possibility of side effects.

Here is how mRNA-based vaccines work.   Investigators devised an mRNA strand which could reliably direct the production of the spike protein when introduced into a cell. This protein is known as an antigen.  The antigen will collect on the periphery of the cell, referred to as the antigen-presenting cell.  The antigens on the cell elicit an immune response (primary response) from the body. This activates cells which acquire a memory for this detection and response, in what are known as memory cells.  When the person is later infected with the SARS-CoV-2 virus and the pathogen presents, the memory cells trigger the immune response that kills the virus.

Instability of the mRNA strand is a concern at many stages in the process.  It will not survive long in the body, so ensuring cell entry requires that it be encapsulated in some way.  The most common means is a lipid (essentially fat) capsule, which preserves the mRNA until it reaches a cell and then facilitates entry.  Once in the cell, the mRNA strand directs the production of the antigen and then degrades; it is no longer a factor in the body.

In lab settings, mRNA not tailored to be more stable requires storage at -70 C.  Most laboratories employing these methods have such freezers available.  When the mRNA needs to be used for experiments, it is often temporarily stored on the bench top in containers with liquid nitrogen or even dry ice (solid CO2).  But long-distance transport is still an issue, especially in low- and middle-income countries (LMICs).  Thus, distribution to LMICs is more complicated than simply the cost of the vaccine.  These countries are best served by vaccines that are stable at closer to ambient temperature.  mRNA can be designed to be more thermally stable, especially through improved encapsulation.  Currently, we know that the Pfizer/BioNTech vaccine requires the conventional -70 C for storage.  The Moderna vaccine is stated to be stable at -20 C, and for some duration at refrigerator temperatures up to +5 C.

A dark horse that I have not seen discussed in the press is CureVac, a German firm (as is BioNTech) in Tübingen.  It is expected to commence Phase 3 trials at the end of this year, whereas the other two are already seeking emergency approval.  What makes CureVac distinctive is that their emphasis has always been on thermal stability.  This characteristic is what triggered a (pre-Covid-19) USD 52 million investment in 2015 from the Bill and Melinda Gates Foundation, which is known to be particularly interested in health solutions for LMICs.  In fact, the investment agreement requires low-cost availability of vaccines in LMICs. CureVac claims 3-month stability at +5 C and 24 hours at room temperature.  The tortoise could overtake the hares, at least in LMICs.

Break on through from “Break on Through (to the other side)”, written and performed by The Doors (1967).

Vikram Rao

November 19, 2020

THE FUTURE OF OIL IS OPAQUE

October 22, 2020

The future of oil has been debated for as long as I can remember.  When I was an undergraduate in engineering in the early sixties, we were taught that the world would run out of oil in 30 years.  Such predictions continued, with the concept of Peak Oil oft discussed.  But, with the recognition of immense heavy oil reserves, and more recently with the emergence of shale oil, the discussion has shifted to the demand side.

For nearly a century all crystal ball gazing centered on the sufficiency of a strategic commodity.  Over the last decade or so, oil has been well on its way to turning into salt*.  Lest you conjure alchemical imagery, I hasten to explain that oil is merely going the way of salt. Salt used to be a strategic commodity. Canning, and later refrigeration, turned it into a useful commodity, no longer strategic.  This was about the time that the era of oil began, with the discovery of Spindletop and the resultant decimation of the price of oil. The era was abetted by the demand created by mass production of economical cars by Ford, which incidentally killed the auto industry of the time: electric cars.  More on the revenge later.

But the demise of oil will be preceded by a protracted hospice stay.  Folks will predict X% electric cars by year Y.  But that will be for new vehicles.  Legacy vehicles will stay on the road for a long time, especially in countries like India, a major developing market for automobiles.  The electric starter was first installed in a Cadillac in 1911.  I was still hand cranking our venerable Morris 8 sedan in India (with difficulty; I was 6) in 1950.  On the other side of the coin, India is more amenable to conversions to electric drive, in part due to low labor cost and in part due to a way of life that wrings every drop of value out of a capital asset.

The future of oil is now being discussed relative to demand, not so much supply.  Peak oil discussions are replaced by peak consumption ones. Shale oil put paid to the supply issue. Even before Covid-19 destroyed demand, a groundswell of movement toward oil alternatives for transportation fuel was present.  This was driven by climate change concerns, but also to a degree by emissions such as NOx and particulate matter.  But the projections on future demand depend on the tint of the glasses worn.  The Organization of Petroleum Exporting Countries (OPEC) is predicting a return to pre-Covid levels of consumption by late next year.  Somewhat surprisingly, the US Energy Information Administration is also singing that tune, as are some oil majors such as ExxonMobil.

Most surprisingly, however, British Petroleum (BP) is very bearish.  Their projections, while scenario-based, are causing them to plan a 40% reduction in their oil output by 2030.  This is to be combined with a big uptick in renewable electricity production.  Shares rose on the announcement.  But BP has been contrarian before, along the same lines.  Over a dozen years ago they announced a pronounced shift away from oil, renaming BP to stand for Beyond Petroleum.  That did not go well.  Particularly unhelpful to their reputation for operating in difficult environments was the oil spill associated with the massive Macondo blowout.

The future of oil is not the future of natural gas.  Together they share the term petroleum, although that term is imprecisely used in common parlance to stand simply for oil.  They were both formed in the same way, with natural gas being the most thermally mature state of the original organic matter.  But in usage they are different.  Oil is mostly about transport fuel, and natural gas is mostly about fuel for electricity generation and the manufacture of petrochemicals, especially plastics.

The pandemic decimated transportation fuel but had much smaller effects on electricity and less again on plastics.  In the post-pandemic world, natural gas will endure for a long time, while oil will be displaced steadily by liquids from natural gas and biogas, and ultimately by electricity.  This, of course, excludes aircraft, which will need jet fuel for the foreseeable future.  Biomass-derived jet fuel will be a consideration, but not likely a big factor.

Electric vehicle batteries costing USD 100 per kWh will be the tipping point, and we are close.  At that level, an electric vehicle with modest range will cost about the same overall as a conventional one.  The cost of the battery and electric motors will be offset by the removal of the IC engine, gear box, transmission, exhaust systems and the like.  For a compact car, each 100 miles in range will add about USD 2500 to 3000 to the capital cost. Maintenance costs will plummet and the fuel cost per mile will be significantly less than with gasoline or diesel.  To top it off, the high torque from a standstill typical of electric motors enables brisk acceleration from a stop.  A progressive shift is inevitable. The revenge of the electric car.
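
To see where that USD 2500 to 3000 figure comes from, here is a back-of-envelope sketch; the consumption figure of 0.25 to 0.30 kWh per mile for a compact EV is my illustrative assumption, not from the post.

```python
# Back-of-envelope check of the battery-cost figures above. Assumption (mine,
# not from the post): a compact EV consumes roughly 0.25-0.30 kWh per mile.
BATTERY_COST_PER_KWH = 100.0  # USD, the tipping point cited in the text

for kwh_per_mile in (0.25, 0.30):
    pack_kwh = 100 * kwh_per_mile           # battery capacity per 100 miles of range
    cost = pack_kwh * BATTERY_COST_PER_KWH  # added capital cost for that range
    print(f"{kwh_per_mile} kWh/mile -> {pack_kwh:.0f} kWh pack, ~USD {cost:.0f} per 100 miles")
# Prints USD 2500 and USD 3000, matching the USD 2500-3000 range above.
```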

The only debatable issue is the rate of change.  And this is where the opacity appears in the future of oil.  The main sticky bits are perceptions of the range required (and the willingness to pay for more) and charging infrastructure.  The latter could be influenced by business model innovation, such as battery swapping rather than owning.  But oil is here to stay for decades.  Therefore, improvements in efficiency, to reduce emissions per mile, are paramount.  The industry appears to understand that.  When the US administration announced a drastic relaxation of the 2025 mileage standards, four major companies voluntarily agreed to a standard close to the old one.  I suspect this was in part because they had already worked out the techno-economics to get there, and certainly the consumer would like the better mileage.  It could also be that they had projections of electric vehicle sales that allowed fleet averages to be met.  A compact electric vehicle has a gasoline-equivalent mileage of about 120.  Quite an offset with even a modest fleet fraction.
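
A minimal sketch of that offset, assuming the fleet average is computed as a harmonic mean of mpg (as CAFE-style averages are, for equal miles driven) and using the 38 mpg and 120 mpg-equivalent figures from these posts; the EV fractions are hypothetical.

```python
# Harmonic-mean fleet average: a modest share of high-mpg-equivalent EVs
# lifts the average disproportionately.
def fleet_mpg(shares_and_mpg):
    """Total miles / total gallons, i.e. the harmonic mean weighted by share."""
    return 1.0 / sum(share / mpg for share, mpg in shares_and_mpg)

CONVENTIONAL = 38.0  # mpg, the current target cited above
EV_EQUIV = 120.0     # mpg-equivalent for a compact EV, per the text

for ev_fraction in (0.0, 0.2, 0.4):
    avg = fleet_mpg([(1 - ev_fraction, CONVENTIONAL), (ev_fraction, EV_EQUIV)])
    print(f"{ev_fraction:.0%} EVs -> fleet average {avg:.1f} mpg")
# 0% -> 38.0, 20% -> 44.0, 40% -> 52.3: roughly a 40% EV share lifts an
# otherwise unchanged 38 mpg fleet past the 51 mpg target.
```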

The oil barrel has sprung a leak.  But it is likely a slow one.

Vikram Rao

October 22, 2020

*Turning Oil into Salt, Anne Korin and Gal Luft, 2009, Booksurge Publishing

LIGHTNING STRIKES

October 8, 2020

Particulates Matter

Several readers of my wildfire blog suggested that I had given lightning strikes short shrift as causes of forest fires. In fact, lightning as a cause is not even mentioned because it is not in the top four causes, at least until 2015, as reported in the paper I used, which is cited at the end of this piece. They plot the proportion of number of fires and area burned by cause of ignition in (a) the Santa Monica Mountains and (b) San Diego County. I am not reproducing the plot here for copyright reasons but interested folks can go to the link in the citation (Syphard and Keeley 2015) below.

Most such studies concentrate more on the area burned, than numbers of ignition, presumably because that is the net effect on the public and on the environment. Lightning as a cause is present in…


WILDFIRES: THE WORST IS YET TO COME

September 21, 2020

California is ablaze. So are Oregon and Washington. The tally to date is 5 million acres burned, about halfway through the fire season, and well on its way to record territory. Putting that in perspective, the east coast of Australia, devastated similarly earlier this year in the Southern Hemisphere summer, closed the season with 46 million acres burned.

The statistic of greatest concern is that the intensity and scale of the fires are getting worse. Over the last thirty years, the number of fires annually has shown no discernible trend; it certainly has not gone up. But the acreage burned has, decisively. Both patterns are evident in the figure below. Five of the ten largest fires ever in California are currently active. The largest of these, the August Complex, is already at 839,000 acres and still going. The next largest, ever, was 459,000 acres, the Mendocino Complex in 2018. Labeling any of this chance, or poor forestry management, evokes imagery of the proverbial ostrich, and the placement of its head.

Courtesy US EPA 2019 (Wildland Fire Research Framework 2019-2022)

The average hectares (a hectare is roughly 2.47 acres) burned has nearly doubled over this three-decade period. Nine of the ten largest fires have occurred since the year 2000. Note that this does not include the ongoing five, which certainly would be in that group, making it 14 of the 15 since 2000. Although a regression line would have uncertainty due to big annual swings, an eyeball estimate indicates a strong upward slope. If this is a predictor of the future, that future is indeed bleak and warrants a study of causes.
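
As a rough quantification of that slope, here is a small sketch; the only input is the "nearly doubled over three decades" reading of the EPA figure, interpreted as a factor of 2.0.

```python
# Rough quantification of the trend. Assumption: "nearly doubled over this
# three-decade period" is read as a factor of 2.0 in average annual burn.
annual_growth = 2.0 ** (1 / 30) - 1
print(f"Implied average growth: {100 * annual_growth:.1f}% per year")  # ~2.3%
print(f"Per decade: {(1 + annual_growth) ** 10:.2f}x")                 # ~1.26x
```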

The recent EPA report, from which the figure was reproduced, ascribes the pattern of increased fire acreage to higher temperatures, drought, early snow melts and historically high fuel loading (which is the fire-prone vegetation, including underbrush). We will examine these separately, although they may not be disconnected. But first, a comment on the pattern of numbers of fires being essentially flat. Ignition events determine the numbers of fires. In California, the principal ones are arson, campfires, power lines and equipment. The equipment category comprises items such as power saws, mowers, and other operated machinery. Human behavior, absent intervention, can be expected to be constant. So, the flat profile in numbers of fires is to be expected. Interestingly, the incidences are seasonal; even, counter-intuitively, arson.

Climate change is implicated in many of the causes of increasing severity over the years. While the term has many interpretations, one generally accepted aspect is temperature rise in the atmosphere and in the oceans. The debate is not whether this happens, but how fast it does. Also generally accepted (to the extent any climate change causality is generally accepted) is that oceanic temperature rise causes increased severity in the El Niño phenomenon in the Pacific Ocean, which is responsible for catastrophic droughts. These are accompanied by drenching rains in other parts of the world in the same year. Both disturbances are extreme deviations from the norm, with resultant impact on vegetation and the way of life.

Atmospheric temperature rise can also be expected to change the proportion of rain and snow in precipitation. Lighter snowfall can be a result, as can early snow melts. Both are in the EPA list noted above.

California, being generally arid, gets most of its water supply from melting snow. While less snow in a given year is certainly a drought indicator, the phenomenon most studied is the timing of the snow melt. Data from four decades commencing in 1973 conclusively demonstrated that burn acreage was strongly correlated with the earliness of the snow melt (Westerling 2016). Decadal comparisons show that the fire seasons in 2003-2012 averaged 40 more days than the seasons in 1973-1982. Fires in the later decade were more severe. Large fires, defined as covering greater than 400 hectares, burned for 6 days on average in the early decade and for more than 50 days in 2003-2012.

Power lines deserve special mention.  Falling power lines were blamed for several fires in 2017 and 2018; the utility has accepted blame and is in bankruptcy. Trees falling on power lines snapped the poles. The tree roots, finding uncertain purchase due to drought conditions, were no match for the Santa Ana winds or any other storm-sourced shoves. Those same drought conditions caused the underbrush to be dry. Power lines are usually not insulated.  Sparking wires fell on dry underbrush, and the rest is incendiary history. A poster child for distributed power.

The wildfire future is indeed bleak. Climate change retardation is necessary. But it may not be sufficient in the shorter term. We need a reincarnation of Smokey to change human behavior and minimize the ignition events.

Westerling, A. L. (2016) ‘Increasing western US forest wildfire activity: sensitivity to changes in the timing of spring’, Philosophical Transactions of the Royal Society B: Biological Sciences, 371: 20150178. http://dx.doi.org/10.1098/rstb.2015.0178

Vikram Rao September 21, 2020

AUTO INDUSTRY TAKES LEAD IN CLIMATE ACTION

August 27, 2020

A recent story in the NY Times describes four auto makers making climate action policy. OK, so that is a bit of a stretch, but not by much. BMW, Ford, Honda, and Volkswagen cemented a binding agreement with California to limit tailpipe emissions. Volvo has also agreed to join the other four. This deal was struck despite the Trump administration's roll back of the Obama era regulation, which had called for fleet averages of 54 miles per gallon (mpg) by 2025. The current target is 38 mpg, and the roll back regulation calls for a mere 2 mpg increase by 2025. The new California deal is for 51 mpg by 2026. The auto makers could have settled for the low target and beaten it handily. But, instead, they chose to take the high road.

This is not the behavior one ordinarily expects from industry. But these are not ordinary times. Recently, BP announced its intention to reduce oil production by 40% by 2030, planning to produce 50 GW of renewable electricity by the same date. Not much detail was given, and yet the share price rose in response. Green energy resonates with investors, it seems.

Much the same happened when the Trump administration rolled back yet another Obama era environmental protection scheme, the regulations ensuring the curbing of fugitive methane emissions. The roll back was opposed by Shell, ExxonMobil, and BP. Methane leakage occurs at various stages of the natural gas production and distribution system. A multi-year study led by Prof. Allen at the University of Texas at Austin, sponsored in part by the Environmental Defense Fund (EDF), has identified the sources of leakage, and industry has taken steps to ameliorate them. This has been aided by the fact that most of the actions are cost neutral. Natural gas that leaks is natural gas that cannot be sold, so the motivation is not just environmental. Much progress has already been made, especially by the larger oil companies. A reversal of the regulation has no benefit to them. In fact, their opposition to the roll back is in part because they want to represent natural gas as a clean fuel, and allowing leakage undercuts that message.

In inking a deal with California, those five auto companies did more than merely push back. They inserted themselves into a federal versus states' rights dispute. And this was an "in your face" move in response to the roll back, which was intended to "produce far less expensive cars for the consumer, while at the same time making the cars substantially SAFER." The administration's move appears to have been designed to be a win for both the consumer and the producer. The only loser was the environment. However, the auto companies must believe that the net cost of ownership will be lower with the new targets. The modest capital cost increase would be offset by the lower cost of operating a fuel-efficient vehicle. Direct fuel injection, combined with temperature measurement and multiple injection capability, can dramatically increase the compression ratio. Increased mileage combined with more muscle; a winning combination that will sell.

I suspect, however, that the bet they are placing is not only on the type of improvement I note above. This is a vote of confidence in electric vehicles, as was BP's in the move away from oil production to generating renewable electricity. If a substantial portion of the new fleet comprises electric vehicles, it takes some of the pressure off improving the rest of the fleet. An electric family sedan can be expected to have an effective "miles per gallon" between 110 and 120. But 2026 is not that far away, and possibly too soon for a major shift to all-electric vehicles. Then again, maybe not. I found the shift of the target date from 2025 to 2026 interesting, indicating a level of granularity that presages a firm plan. Hybrid vehicles already deliver the 51 mpg target. Direct injection combined with elements such as turbocharging are proven mileage enhancers.

One of the drivers for industrial responses to the Trump administration’s roll backs on Obama era environmentally themed rules has been a desire for stability in regulation. Change potentially every four years is disruptive. Logic dictates that this desire will apply primarily to those affecting capital investment with long cycles. Certainly, automobile design changes and oil exploitation of the Alaskan National Wildlife Refuge fit that bill. Methane emissions somewhat less so, but still involving changes in equipment and procedures. Warning labels on machinery probably lie on the other end of the spectrum.

California represents a solid 30% of the automotive sales market. Thirteen other states are committed to California regulations and add to that percentage. Unless President Trump succeeds in taking away states' rights in this regard, that pretty much means the entire country will be using these vehicles, no matter the regulations in the individual states. This has happened before. Devices such as television sets consume power while on standby. A scant 15 years ago, that could have been as much as 20 watts. Based on research conducted at the Lawrence Berkeley National Laboratory, it was concluded that full functionality could be achieved for as little as 1 watt, at virtually no extra cost; you just had to design it more smartly. California mandated that in 2006 (I think it was 3 watts at the start) and the rest of the country followed. At that time, a full 10% of all national electricity consumption was this sort of wasteful passive use. Since then, we have taken other actions to reduce passive consumption.
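
The per-device arithmetic behind that point is simple; this quick sketch uses only the wattages from the paragraph above (the device counts are left out).

```python
# Per-device arithmetic on the standby ("vampire") load figures above.
HOURS_PER_YEAR = 8760

for watts in (20, 1):
    kwh_per_year = watts * HOURS_PER_YEAR / 1000  # continuous draw for a year
    print(f"{watts} W standby -> ~{kwh_per_year:.0f} kWh per year per device")
# 20 W -> ~175 kWh/yr versus 1 W -> ~9 kWh/yr. Multiplied across tens of
# millions of devices, the difference is utility-scale generation.
```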

It began with the 1 watt rule, unilaterally laid down by California. Perhaps that bell will get rung again in tailpipe emissions.

Vikram Rao

August 28, 2020

THE ARCTIC IS COLD, RENEWABLE HYDROGEN IS HOT

August 20, 2020

Oil drilling leases will soon be available in the Arctic, according to a story in the New York Times. The Alaska National Wildlife Refuge (ANWR), a land-based portion of the Arctic, is cited. But the Arctic is cold, both figuratively and literally. When he took office in 2017, President Trump announced a roll back of a "permanent" ban on Arctic drilling that President Obama instituted as he was leaving the White House. I opined then that the roll back would have no net effect because interest from oil companies would be minimal. I also wrote at the time that President Obama's action was largely symbolic, and not material.

The principal reason for these conclusions is that the price of oil has been low since 2015, when US shale oil became the determinant of the world oil price and the ability of the Organization of Petroleum Exporting Countries (OPEC) to prop up prices was deeply undercut.  USD 120 per barrel highs became USD 70 highs.  The Covid-19 pandemic has decimated shale oil company ranks, but it has also caused demand, and price, to plummet to historic lows.  Accordingly, the crystal ball of future oil prices is murky.  Murky crystal balls equate to uncertainty, which, added to the environmental risks, further equates to higher discount rates.  Making matters worse on the investment side, any Alaska play has a long-term payout.  First oil is likely a decade after the lease purchase.  This involves forecasting the price of oil into the second half of the century.
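
To see why higher discount rates chill a long-payout play, here is an illustrative discounting sketch; the revenue figure and rates are hypothetical, chosen only to show the effect, with the ten-year wait for first oil taken from the paragraph above.

```python
# Illustrative discounting sketch. Hypothetical inputs: first oil 10 years
# after the lease sale, then a notional USD 100 million per year of net
# revenue from year 10 through year 30.
def present_value(annual_revenue, first_year, last_year, rate):
    return sum(annual_revenue / (1 + rate) ** t
               for t in range(first_year, last_year + 1))

for rate in (0.10, 0.15, 0.20):  # higher rates encode price and environmental risk
    pv = present_value(100.0, first_year=10, last_year=30, rate=rate)
    print(f"discount rate {rate:.0%}: present value ~USD {pv:.0f} million")
# ~367, ~179 and ~95 million: doubling the discount rate cuts the value of a
# long-payout Arctic play nearly four-fold, which chills bidding.
```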

All the indications are that oil demand will decline significantly by 2040, largely through electric vehicle adoption.  Certainly, the super-major oil company BP's beliefs in this regard have translated into plans for a major replacement of oil revenue with revenue from renewable electricity.  They recently announced that by 2030, their oil production will be reduced by 40%, concurrent with major investment in renewables, resulting in 50 GW electricity production.  That production is up there with good-sized electric utilities.  This decision also comes at a time when the dividend has been halved and properties divested to raise cash.  It also is coincident with the divestiture of their pioneering Alaska North Slope holdings to privately held Hilcorp, during which transaction they sweetened the pot with a loan to ensure closure of the deal.  This does not sound like a company that will invest in a US Arctic lease.  I do not see any oil company headquartered in Europe doing it either.

Hydrogen is an important industrial commodity even without counting its possible use as electric vehicle fuel.  US refineries purchase 2 billion cubic feet per day of hydrogen (in addition to using another 0.5 billion cubic feet produced internally).  Virtually all of it is produced from natural gas.  As we discussed in these pages earlier, hydrogen produced using surplus electricity during low demand periods is one of the most promising solutions for the problem of intermittency of renewable electricity.  Oil companies like BP, doubling down on renewables, are unlikely to miss this point.  Also, if conversion to ammonia is more appropriate for storage and transport, who is better positioned than an integrated major oil company?  In its announcement, BP makes a vague reference to hydrogen.  No mention is made of geothermal electricity, but it is highly unlikely they are not watching that space.

Returning to the issue of the success of a lease sale in the ANWR, one of the primary challenges is the paucity of high-quality seismic data.  These are subsurface images acquired by individual oil companies in proprietary shoots, or by seismic operators shooting speculatively and then selling subscriptions to the data in "libraries".  The acquisition and interpretation of the data is the edge employed by oil companies in obtaining winning bids without overpaying.  Less data means more uncertainty.  My take on the situation is that there will be fewer bids due to competing capital spend directions, the uncertainty in the price of oil, the environmental risks, and the delays likely due to litigation (case in point: the litigation-based delays in the Keystone XL oil pipeline construction).  But whatever bids materialize are likely to be low-balled.  In that case, the revenue from the sale will be underwhelming.  This assumes, of course, that the administration goes ahead with plans to auction the tracts.  More than likely this is just another tempest in the Alaskan teapot.

Vikram Rao

August 20, 2020

FIXING CARBON DIOXIDE

August 9, 2020

This discussion is about fixing, as in solving, but it is also, and mostly, about fixing as in rendering immobile.  The impact of the greenhouse gas CO2 can be mitigated either by producing less or by capture and storage.  A recent paper in the journal Nature triggered this piece.  It discusses the feasibility of fixing CO2 in the form of a stable carbonate or bicarbonate by reacting atmospheric CO2 with minerals in the volcanic rock basalt, one of the most ubiquitous rocks on earth.  Crushed basalt is to be distributed on farmland.  The bicarbonate fraction is water soluble, and runoff takes it to the ocean, where the alkalinity mitigates ocean acidification.  The reaction products are also a desirable soil amendment.  The paper is mostly not about the technology; it studies scalability and the associated economics.  The authors estimate the process can be accomplished at a cost ranging from USD 80 to 180 per tonne of CO2.  Putting that in perspective, current US regulation provides a 45Q federal tax credit of USD 50 per tonne sequestered in this fashion, which lasts for another 12 years.  While no business ought to be built on the promise of subsidies, that length of time allows cost reduction to occur.  At USD 80, the lower end of the range noted by the authors, the cost is in an acceptable range.

The use of basalt to fix CO2 is part of the genre referred to as mineralization of CO2.  Divalent species, principally Ca and Mg, are present in rocks.  In low pH conditions they react with CO2 to produce a carbonate (or bicarbonate).  Olivine, another common mineral often found in association with basalt, is a solid solution of magnesium and iron silicates.  The reaction products with CO2 are MgCO3 and SiO2.  For CO2 sequestration purposes this may be accomplished in situ or ex situ.  The term sequestration most properly includes both capture and storage, but is often used just for the second step, and that is how we will use the term here.
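
For a rough sense of scale, here is the stoichiometry for forsterite, the magnesium end-member of olivine; the reaction and molar masses are standard chemistry, not figures from the post.

```python
# Scale check via stoichiometry for forsterite, the Mg end-member of olivine:
#   Mg2SiO4 + 2 CO2 -> 2 MgCO3 + SiO2
# Molar masses (g/mol) are standard values.
MG2SIO4, CO2 = 140.7, 44.0

tonnes_rock_per_tonne_co2 = MG2SIO4 / (2 * CO2)
print(f"~{tonnes_rock_per_tonne_co2:.1f} t of forsterite per t of CO2 fixed")
# ~1.6 t rock per t CO2: gigatonne-scale sequestration implies gigatonne-scale
# mining and crushing, which is exactly the scalability question studied.
```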

A promising approach for in situ storage of CO2 is injection into oceanic basalt deposits.  Basalt is formed when the magma from a volcanic eruption cools rapidly.  When it cools slowly, it produces species such as granite, with large crystals and high hardness, a rock more suitable for structural applications.  Basalt, on the other hand, is fine grained and weathers easily.  This is good for reactivity.  In oceanic deposits this is even more the case when rapid cooling in water results in "pillows", which partially disintegrate to become permeable.  They are often overlaid with later placements of magma sheets.  These impermeable layers act as barriers to injected CO2 escaping, affording time for mineralization.  The mineralization is further accelerated if the injected CO2 is in the supercritical state (achieved at greater than 31 °C and 1070 psi).  All fluids in this state have properties of both gas and liquid.  Here the supercritical CO2 permeates the rock as if it were a gas and reacts with the mineral as if it were a liquid.

Ex situ fixing of CO2 follows the same chemistry as in situ, at least in the aspect that the product is a carbonate.  The raw material can be tailored to the need if cost permits.  The CO2 capture cost is the same in either case.  However, an ex situ process has many advantages over an in situ one.  The reaction kinetics can be improved using standard process engineering methods such as fluidized beds.  Catalysis could also be employed.  The products could also be expected to have value, such as in substituting for concrete ingredients.  But, as in the case of fly ash from coal combustion, also a simple additive to concrete, the realization of that value can be elusive.  Niche uses can be found, but monetization on the massive scales required to make a dent in climate change will require concerted effort.

The cost of production will still dominate the economics, and the largest component of that is the acquisition of CO2 from the industrial combustion process or air.  Air capture is a relatively recent endeavor and targets a production cost of USD 100 per tonne of CO2, at which point it becomes extremely interesting.  The principal allure of this method is that it can be practiced anywhere.  If located near a "sink", the utilization spot, transport costs and logistics are eliminated.  This underlines a key aspect of ex situ sequestration, the availability and cost of CO2 in the form needed.

The original premise for this discussion, mineralization of CO2 from the air, skips the CO2 acquisition constraint.  But the focus shifts to the procurement of massive quantities of rock and crushing it into small particles.  Two pieces of good news.  One is that basalt is possibly the most abundant rock on earth, although a lot of it is at ocean bottoms.  The other is that basalt crushes relatively easily, especially if weathered (contrasted with its country cousin granite).  But the elephant in that room is that procurement still involves open pit mining, anathema to environmental groups.  In recognition of this, the authors of the cited Nature paper encourage a study of the availability of tailings from mining operations as basalt substitutes for oxides of divalent ions.  They opine that there are vast stores of such tailings from mining operations over the years.  They also suggest the use of Ca-rich slags from iron making.  These are oxides of Ca and Si in the main, with some oxides of Al.  Lest this idea be extrapolated to slags from other smelting operations, a caution: the slags from some processes could contain heavy metals and other undesirables such as sulfur.  On the plus side of that ledger, the processing of certain nickel ores entails a beneficiation step that results in a fine-grained discard rich in Mg silicates, which ought to be very reactive with atmospheric CO2.

While the use of industrial waste for sequestering CO2 is technically sound, acquisition and use of alkaline earth rich oxides will face hurdles of location, ownership, and acceptability to farmers, to name just a few.  I am also reminded of the fact that when "waste" products with no or negative value create value for someone else, the price will often be revised, upwards.  But the method in the cited paper certainly is a useful addition to the arsenal of measures to mitigate global warming, provided field operations verify the predictions on rates of reaction.  This battle will only be won with many different arrows in the quiver.

Vikram Rao

August 9, 2020

RENEWABLE ENERGY: THE HYDROGEN SOLUTION

August 2, 2020

The two principal sources of renewable energy share a serious shortcoming. As has been discussed in these pages over the years, wind and solar do not generate electricity when the wind does not blow and the sun does not shine. Germany gets 40% of its power from renewable sources. But on certain days that percentage has jumped to 75%, and on other days it has plummeted to 15%. The (literally) rainy days had electricity augmented from a variety of sources, including batteries. But the days of surplus sometimes required idling of the generation.

 Great advances have been made in lowering the cost per unit in both wind and solar. But the need to level the load has never been more important because those very advances have increased the footprint. Some have rushed to use natural gas generators to fill the intermittency gap. This has caused consternation, with some positing the notion that renewables perpetuate fossil fuels because of this dependency. This concern ignores the fact that storage is being investigated at many levels.

Electrochemical storage is the only reasonable option for devices that are carried or move. In many cases, the options are even more limited to light weight batteries. But stationary applications have other options. One that has been in use, where feasible, is pumped water storage. Excess electricity is used to pump water to a high storage site, such as at a dam. When needed, it flows back down to generate electricity. Danish windmills utilize Norwegian hydroelectric sites for this purpose.

The flavor of the day is hydrogen. Excess electricity is used to electrolyze water, producing hydrogen and benign oxygen. The hydrogen may be stored on location and used to power turbines to produce electricity when needed. In this it serves a similar purpose as does natural gas for the backup generators. As in the case of natural gas, the relatively low duty cycle stretches the payback period of the capital equipment. Efforts are under way to reduce capital and operating costs. In the former area, expensive platinum electrodes are being replaced with base metals with novel coatings. Operating efficiency improvements are also being targeted. By its very nature, the method is conducive to small scale distribution. Electrolysis to produce hydrogen may be here to stay.
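
For a sense of the round-trip economics, here is a rough sketch; the 50 to 55 kWh per kg electrolyzer figure and the 33.3 kWh/kg lower heating value of hydrogen are commonly quoted numbers, not from the post.

```python
# Rough electrolysis arithmetic. Assumptions (commonly quoted figures):
# ~50-55 kWh of electricity per kg of hydrogen for practical electrolyzers;
# hydrogen's lower heating value is ~33.3 kWh/kg.
LHV_H2 = 33.3  # kWh of chemical energy per kg of hydrogen

for kwh_per_kg in (50.0, 55.0):
    efficiency = LHV_H2 / kwh_per_kg
    print(f"{kwh_per_kg:.0f} kWh/kg -> electrolyzer efficiency ~{efficiency:.0%}")
# ~61-67% of the surplus electricity is stored as hydrogen; a turbine converting
# it back to power loses more, hence the focus on capital and operating costs.
```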

Produced hydrogen could find applications other than generating electricity.  An interesting variant has been piloted for over a year in Cappelle-la-Grande, a town in northern France, by the energy firm Engie, where the hydrogen is blended into existing natural gas pipelines.  Hydrogen is a very small molecule, and initially there were concerns regarding leakage. But a 25% blend was found to be retained and did not materially corrode the pipes.  Furthermore, household burners were found to operate efficiently with that mix.  In fact, the mix produced a cleaner burn.  Most European countries permit the blend.  Some are considering repurposing natural gas lines to exclusively distribute hydrogen.
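
One reason burners tolerate the blend is that hydrogen's volumetric energy density is low; a quick sketch, using handbook heating values (assumptions, not figures from the post), shows how little of the blend's energy the hydrogen actually carries.

```python
# Energy share of a 25 vol% hydrogen blend. Assumptions (handbook figures):
# volumetric higher heating values of ~12.7 MJ/m3 for hydrogen and ~39.8 MJ/m3
# for pipeline natural gas at like conditions.
HHV_H2, HHV_NG = 12.7, 39.8  # MJ per cubic meter
h2_vol = 0.25

energy_share = h2_vol * HHV_H2 / (h2_vol * HHV_H2 + (1 - h2_vol) * HHV_NG)
print(f"25 vol% hydrogen carries ~{energy_share:.0%} of the blend's energy")
# ~10%: a quarter of the volume displaces only about a tenth of the energy,
# which helps explain why household burners behave much as before.
```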

Hydrogen is an important reagent used in all refineries.  Hydrogenation of edible oils is another application.  But the workhorse application for this source may well be admixture into natural gas lines for domestic and industrial use.  Because of the low volumetric energy density of hydrogen, storage in the form of ammonia is also being considered.  The liquid is easily stored and transported under conditions similar to those for propane.  The conversion to ammonia, using nitrogen from the air, is straightforward.  Utilization can be direct, as a fuel in an internal combustion engine, or by catalytic dissociation back to hydrogen for use in that form in a fuel cell for an electric vehicle or any other purpose.  Research is under way for improvements in this space, including ammonia production at lower temperatures.

Pipeline transport of hydrogen is feasible but expensive, especially for small volumes. Ammonia, on the other hand, can be transported in pipelines at a cost of about USD 0.20 per kg hydrogen per 1000 miles. This is less than 5% of the expected cost to produce renewable hydrogen at solar and wind installations. The US currently has nearly 3000 miles of ammonia pipelines. Ammonia is a leading candidate for renewable hydrogen storage and distribution.
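
Reading those two figures together implies a floor on the production cost; this small sketch uses only the numbers in the paragraph above.

```python
# What the pipeline numbers imply, using only the figures in the paragraph above.
transport_cost = 0.20  # USD per kg of hydrogen per 1000 miles, moved as ammonia
max_share = 0.05       # "less than 5%" of the production cost

implied_production_cost = transport_cost / max_share
print(f"Implied renewable H2 production cost: above USD {implied_production_cost:.2f}/kg")
# Above USD 4/kg: transport as ammonia adds pennies per 1000 miles against a
# production cost measured in dollars, hence its appeal for distribution.
```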

The main takeaway from this discussion is that renewable energy requires storage, and that storage in fluid form is likely to lead the way. An alternative to using the stored fluid to generate electricity is to use it for a different purpose. This solution for monetizing electricity from periods of excess supply would require the supply troughs to be augmented from another grid source. Hydrogen and ammonia will be important players in the renewable energy world. Alas, silver bullets went out with the Lone Ranger.

Vikram Rao

August 2, 2020

DISRUPTIVE IDEAS

June 22, 2020

In a New York Times story, Taylor Branch, a historian of the civil rights era is quoted as saying: “A movement is different from a demonstration. It’s not automatic – it’s the opposite of automatic that a demonstration in the street is going to lead to a movement that engages enough people, and has a clear enough goal that it has a chance to become institutionalized, like the Voting Rights Act”.

He was discussing a social movement that challenged the orthodoxy. Demonstrations, even ones with a coherent and unified message, were unlikely to persuade a majority. But, once the message took hold, it would be the new orthodoxy.

This has striking similarities with the concept of disruptive technology, a term coined by Clayton Christensen exactly a quarter century ago. Considerable detail is to be found in his very readable book The Innovator's Dilemma, but all you really need to know is in the (free!) 1995 Harvard Business Review paper by Bower and Christensen. A disruptive technology is one that initially is rejected by industry as not a good fit, or as too unreliable and costly. After success in niche applications, the appeal broadens. Eventually, it becomes the norm and usually completely displaces what preceded it. It is now the new orthodoxy in that technical space. Hence the term "disruptive".

I lived through the development of one such technology in the oil and gas space. My company, Sperry Sun, had been a leader in developing the technology of horizontal drilling and the enabling technology of measurement while drilling. Early horizontal wells cost 2.7 times as much as conventional wells of the same length. But they multiplied production in the Austin Chalk by intersecting oil-bearing vertical fractures. That business segment put up with the teething pains for the value created. A US Department of Energy survey showed that within a few years horizontal wells cost just 17% more and delivered 2 to 7 times the production. Eventually, the technology was the key enabler for the development of heavy oil in Canada and for the shale oil and gas boom in the US. A Shell Oil Company executive once confided in me that in the early going one needed permission to plan a horizontal well, but by the late 1990s, one needed permission not to use one. That pretty much defines a disruptive technology: looked at askance at first, becoming the norm afterwards.
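
The per-barrel arithmetic makes the turnaround vivid; this sketch uses only the 17% cost premium and 2x to 7x production figures quoted above.

```python
# Cost per unit of production relative to a conventional well, using the
# figures above: 17% higher well cost, 2x to 7x the production.
COST_RATIO = 1.17

for production_multiple in (2, 7):
    relative_cost = COST_RATIO / production_multiple
    print(f"{production_multiple}x production -> {relative_cost:.2f}x the "
          f"conventional cost per barrel")
# 0.59x down to 0.17x: once costs fell, horizontal wells delivered oil at a
# fraction of the per-barrel cost, the classic disruptive trajectory.
```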

Being a techy, I may be ill qualified to opine on the matters that follow. But I propose to do so just the same! Caveat emptor! Ideas that upset and transform societal norms appear to have underpinnings similar to those of disruptive technologies. Ideas initially appeal to just a minority of the populace. The appeal may be broader, but only a minority may be undaunted by the enormity of the task of gaining wide acceptance. In the civil rights era, it took decades for that to happen. Today, communication technology, and especially its social media variant, has changed the game. This may in part explain the rapid breadth of the movement for reform in policing. Or it could be that the incendiary pile had grown with a succession of events and just needed the spark.

The descriptor Defund the Police is unfortunately worded if broad acceptance is the objective. A better one would possibly be Reform Policing. My take on what it means, or at least what I think it ought to mean, is for policing to emulate medicine in addressing both the symptom and the cause. Eliminating police departments is as absurd as outlawing doctors. But more emphasis ought to be placed on addressing the underlying causes for crime. This certainly already happens to different degrees in many jurisdictions but is clearly not the norm. Funding that ordinarily would simply go to enforcement, ought to be diverted in part to ameliorating the causes of crime.

The other layer, that of codifying behavior by individual police-persons to be more humane, while still protecting their own selves, is certainly needed as well. Leadership is coming from many quarters, including police officers. Houston's police chief Art Acevedo was recently quoted as saying, "It's not about dominating, it's about winning hearts and minds," in a clear reference to one of President Trump's comments on the subject.

Disruptive technology is one of the best-known terms in the lexicon of innovation. Here, disruption does not carry the plain-English pejorative connotation. So also should it not in the term Disruptive Ideas.

Vikram Rao

June 22, 2020

FOSSIL FUEL SAVES THE ENVIRONMENT

June 9, 2020

When was the last time you saw a headline such as this? Probably never. While indulging in a modicum of hyperbole, as you will see, the headline is not too much of a reach. The environment here is the one experienced through household air pollution (HAP), which is the direct cause of an estimated 3.6 million deaths annually. Substantial radiative forcing is also expected from the elemental carbon emissions, with the HAP source estimated to provide 20% of the loading worldwide, and a much higher proportion in Asia. Radiative forcing has a direct impact on climate change.

Three billion persons use biomass as a cooking fuel, almost all in low- and middle-income countries (LMICs). The biomass is largely the wood of convenience but can also be animal waste (dung). In countries such as Burkina Faso in Sub-Saharan Africa, 95% use biomass for cooking. Improving cookstoves has been a pursuit for decades. While improving the efficiency of the stove does reduce the emissions per cooking episode, the overall reduction is not sufficient for a significantly favorable mortality outcome. For a clue regarding the reason for this, consider that the PM2.5 count can be as high as 600 μg/m3, as compared to the WHO guideline of 35 μg/m3 and the US standard of 12 μg/m3. Some of the "improved" stoves reduce the emissions by 60%. That does not cut it. To make matters worse, the emissions/impact curve is supralinear, meaning the impact curve flattens at high exposures, despite being linear at very low numbers. This means that the gains are relatively small for exposure reductions from high to moderate numbers.
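
To make the supralinearity point concrete, here is a toy sketch; the saturating functional form and its parameters are illustrative assumptions of mine, not the fitted exposure-response functions used in the epidemiology, but the shape is the point.

```python
import math

# Toy supralinear exposure-response curve: relative risk saturating with PM2.5.
# The functional form and parameters are illustrative only; real integrated
# exposure-response functions are fit to epidemiological data.
def relative_risk(pm25, a=2.5, b=150.0):
    return 1.0 + a * (1.0 - math.exp(-pm25 / b))

for label, before, after in [("improved stove (60% cut): 600 -> 240", 600, 240),
                             ("LPG: 600 -> 35", 600, 35)]:
    gain = relative_risk(before) - relative_risk(after)
    print(f"{label} ug/m3: risk drops by {gain:.2f}")
# The 60% reduction buys little on the flat part of the curve (~0.46), while
# getting near the WHO guideline captures most of the benefit (~1.93).
```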

This relatively recent realization (a 2014 publication) has led to interventions involving complete substitution of the biomass fuel. The leading candidates are alcohols, liquefied petroleum gas (LPG) and electricity. Electricity is not a good idea. Target villages lack electricity for the basic necessities of lighting, fans and cell phone charging, with the occasional refrigerator. Diverting what little is available to cook stoves, when other alternatives exist, is a bad idea. Ethanol is too expensive and in many countries the production would compete with a food use. LPG has become the favored substitute in many countries.

LPG is a mixture of propane, butane, and some larger molecules. It is derived from oil and gas production and is delivered in pressurized cylinders, typically holding 14.2 kg of fuel. The particulate emissions from an LPG stove have been observed to be close to the WHO guideline of 35 μg/m3, compared to up to 600 μg/m3 with traditional fuels and stoves. Many countries have doubled down on this fuel substitution. India has programs for distribution of stoves and substantial subsidies on the fuel.

But the health benefits from this substitution had not been quantified in randomized control trials until recently. Many such trials are ongoing. The largest of these is the USD 30 million Household Air Pollution Intervention Network (HAPIN) trial in 3200 households in India, Rwanda, Guatemala, and Peru. In a recent advance in the state of the art of PM2.5 monitoring, a small wearable device, RTI's Enhanced Children's MicroPEM™, is utilized. This follows the personal exposure of pregnant women, other adult women, and children under 1 year of age. This cohort is the most affected by HAP caused by cookstoves. Carbon monoxide is also measured in the cooking area. Studies have shown that the total PM exposure captured by the wearable monitor is usually less than would be measured in the ambience of the home. In any case, the monitor comes closest to determining what the person breathes and will likely become the standard of practice in trials. The filters in the monitors are archived, and the collected PM may be used in in vitro studies to assess the toxicity of the particles.

Even if the health benefits of LPG substitution for biomass are established, issues remain. Almost all the affected LMICs are net importers of LPG. The price is pegged to the marginal kg, which is basically the world price. Propane pricing may be used as a proxy for LPG. Natural gas liquids, including propane, generally track the oil price. Short term volatility in the price of oil has become a way of life since about 2014. In the US, propane has been priced as low as USD 3.63 per million BTU in January 2016, and a scant two years prior to that was at USD 15. These fluctuations will be very hard on the poor in villages, even if respective governments act to ameliorate them with subsidies. The practice of "stacking", comprising switching back to wood, at least in part, will vitiate the gains.
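
What that price swing means per cylinder is worth a quick look; the energy content of LPG (~46 MJ/kg) is a handbook assumption of mine, while the cylinder size and the two prices come from the text.

```python
# Price swing per 14.2 kg cylinder. Assumption (handbook figure, not from the
# post): LPG carries ~46 MJ/kg; cylinder size and prices are from the text.
CYLINDER_KG = 14.2
MJ_PER_KG = 46.0
MJ_PER_MMBTU = 1055.0

mmbtu_per_cylinder = CYLINDER_KG * MJ_PER_KG / MJ_PER_MMBTU  # ~0.62 MMBTU
for usd_per_mmbtu in (3.63, 15.0):
    cost = mmbtu_per_cylinder * usd_per_mmbtu
    print(f"At USD {usd_per_mmbtu}/MMBTU: ~USD {cost:.2f} of fuel per cylinder")
# Roughly USD 2.25 to USD 9.30 per cylinder: a four-fold swing that lands hard
# on poor households and invites "stacking".
```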

LPG, undeniably carrying the label of a fossil fuel, may well be the means for improving air quality for the poorest and for addressing the single biggest public health problem in the world today. Labels can be deceiving.

Vikram Rao