February 26, 2021 § Leave a comment
The title is adapted without a modicum of shame or contrition from a Paul Krugman New York Times opinion piece. While you will see my engineer’s spin on all this market economics stuff in this piece, his linked opinion is a must-read. At least take in the brilliant title: Et Tu, Ted? Why Deregulation Failed. The Julius Caesar/Brutus reference aside (inspired, I imagine, by the Ted statement undercutting a principal tenet of Texas energy policy), it is a clear exposition of when free markets fail. From a Nobel Laureate in Economics, no less. The Ted mentioned is of course Senator Ted Cruz, whom Krugman in droll fashion describes as R-Cancun. The press, woefully bereft of scandalous behavior by elected officials (only so much ink you can give to that Congresswoman), is giving the full treatment to Cruz, his wife Heidi (managing director at Goldman Sachs), St. John’s High School (elite private school in Houston) and the group chat.
As you know from my previous take on the Great Texas Freeze, Texas has an independent power grid. It is run by ERCOT, a non-profit agency regulated by the state. By not crossing state lines, it avoids being regulated by the Federal Energy Regulatory Commission. This independence from the feds has been fiercely defended by many in current and former elected office. The regulations, such as they are, were reputedly fashioned largely on concepts laid out by Harvard professor William Hogan. They allow wholesale prices to go up to a maximum of USD 9 per kilowatt hour (kWh). The average consumer price in Texas normally is around 12 cents per kWh, and closer to 9 cents in winter. In other words, regulations allow householders to be charged up to 100 times more than normal (wholesale and consumer prices are not directly comparable, but the multiplier is still huge). Avocados (Krugman’s favorite for the argument I am about to make) doubled in price during a shortage in 2019. Consternation in Mexican restaurants (and the holy guacamole jokes) aside, consumers could simply eschew the green fruit (yes, it is a fruit, as is the tomato, despite an 1893 Supreme Court decision decreeing it a vegetable!). Lowered demand and some remedy on the supply side restore prices. The way it is supposed to work.
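The multiplier arithmetic is worth making explicit. A quick sketch using the figures above (wholesale and retail prices are not directly comparable, so treat the multiplier as indicative only):

```python
# ERCOT wholesale price cap vs. a typical Texas winter retail rate.
CAP_USD_PER_KWH = 9.00            # regulatory maximum wholesale price
WINTER_RETAIL_USD_PER_KWH = 0.09  # ~9 cents per kWh consumer rate in winter

multiplier = CAP_USD_PER_KWH / WINTER_RETAIL_USD_PER_KWH
print(f"Cap is {multiplier:.0f}x the normal winter rate")  # prints "Cap is 100x the normal winter rate"
```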
But electricity access is not like avocados. When used for heat in winter it is a necessity, not a choice (even a natural gas fueled heating system needs electricity to run the blowers). A period without guacamole in the diet would scarcely register on the privation scale whereas sustained frigid temperatures could be life threatening.
But the strongest argument for more regulation ensuring supply is that the supply chain is interdependent. A turbine operator cold-hardening the blades in the expectation of profiting from the high prices during a freeze-caused shortage may have fuel supply interrupted, vitiating the business strategy. In the case of natural gas, the primary fuel for electricity in Texas, the cold hardening would simultaneously have to be done by at least the gas producer, the midstream operator (of the pipeline) and the electricity generator. Any one of them doing it alone may not realize the profit uptick while still incurring the cost.
Texas politicians are in a no-win position. They favor free-market methods but must deal with the fallout from crushingly high electricity bills faced by some consumers. The tweeted statement by Senator Cruz that led to the Et tu Ted opinion by Krugman was: “This is WRONG. No power company should get a windfall because of a natural disaster”. The problem with that statement is that it flies in the face of the basic de-regulated model, in which private companies are incentivized to spend the money to harden for these situations in hopes of high profits during those interludes. On the assumption that Senator Cruz understands the model, his statement appears to be a repudiation of a tenet of Texas energy policy. Hence the Et tu (meaning “you too”) wording in the Krugman piece (Julius Caesar utters Et tu Brute in anguish when he sees that his friend and protégé Brutus is one of the assassinating senators).
Many have taken the position that one cannot spend the money to prepare for the infrequent occurrences. Hard to argue with this in principle. However, I will give you the example of El Paso, Texas. After a freeze in 2011 of similar scale to this one, the state studied the matter. The city of El Paso simply acted. They had previously divorced themselves from ERCOT and joined the Western Interconnection. With no compulsion to act, unlike the rest of Texas, they spent nearly USD 4 million to harden their grid to -10 F. They built a dual-fired power station, using oil or gas. This time around, in a city with 680,000 residents, fewer than 1000 customers had electricity blackouts lasting more than 5 minutes. Since rate payers always pick up the tab, what then was the impact on rates? The average residential rate in El Paso is 11.11 cents per kWh, compared to the average for Texas of 10.98 cents. A straight-up comparison is difficult, but those figures are not very far apart, a difference of about 1.2%.
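The rate comparison can be checked directly from the figures above (a rough comparison, since the two rate structures differ):

```python
# Average residential rates in cents per kWh, as cited above.
el_paso_rate = 11.11
texas_avg_rate = 10.98

diff_pct = (el_paso_rate - texas_avg_rate) / texas_avg_rate * 100
print(f"El Paso premium: {diff_pct:.1f}%")  # prints "El Paso premium: 1.2%"
```

A hardening program ambitious enough to ride out a -10 F freeze, recovered through a rate premium of barely one percent.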
Was the privation suffered in the great Texas freeze an indictment against free-market models of electricity access? Probably not. But economists advising the state must conjure up a framework with built in elements of social responsibility and a greater recognition of the inter-dependencies in the supply chain. The folks least able to withstand extended loss of power and/or costly energy for survival, need a safety net. Another such devastating disaster, to borrow another Roman analogy, will have state leaders facing charges of fiddling while Texas froze.
February 18, 2021 § 14 Comments
Texas prides itself on being the energy capital. The capital (as opposed to the Capitol of the infamous January 6 insurrection) is under siege. Nature is asserting its might. Unpreparedness sure helps.
Few know that Texas has its own grid. The country is divided into three grids: The Eastern Interconnection, the Western Interconnection, and drum roll here, Texas. Conspiracy theorists may connect this to secessionist tendencies. Certainly, recent utterances attributed to the former governor Rick Perry don’t help. He is quoted as saying, “Texans would be without electricity for longer than three days to keep the federal government out of their business.” He is referring to the fact that because the Texas grid does not conduct interstate commerce, it is not governed by the rules of the Federal Energy Regulatory Commission. This from a guy who just a month ago held federal office as the US Secretary of Energy.
In a Fox channel interview Governor Abbott of Texas blamed solar and wind for the problem. Small problem: solar is just 1-3% of the total and wind is around 20%. Then his own folks at ERCOT, which stands for Electric Reliability Council of Texas (the reliability in the name is ironic), said it was primarily due to a drop in natural gas supply. This makes more sense because gas generators produce 47% of the electricity. Abbott later walked back the claims and said he meant that renewables could not be the dominant source. Tell that to Germany, which gets 40% from renewables. Then Congresswoman AOC trolled Abbott by tweeting that Texas electricity was 80-90% from fossil fuel. That is not accurate either (coal plus gas come in at about 65%, according to ERCOT). Just when you think the election silly season is over, you have politicians using their favorite political points-scoring issue whenever there is a remote opening for it.
By all accounts, every source of electricity was hampered by the extreme cold, even the nuclear plants. But, according to the ERCOT leadership, the biggest culprit was natural gas. Delivered natural gas nearly halved at the most severe stages due to frozen lines. We know that methane (the dominant component of natural gas) does not freeze till a frigid -182 C. So, why are natural gas pipelines (these are the main supply lines, not the little ones going to your house) freezing?
I was not able to find any explanation, so I am going to hazard a hypothesis based on other oilfield knowledge. Almost all supplies of natural gas will be water wet to some degree. If films of water form, at pipeline pressures of 800 psi or so, temperatures approaching water freezing can cause hydrate crystals to nucleate on the walls. Again, with the right conditions, these could grow to plug the line. This routinely happens in undersea gas pipelines. Those pipelines have a device known as a “pig” which can be made to traverse the line and mechanically clear out the growing crystals. The other means is to drizzle in methanol, which lowers the freezing point; basically an antifreeze such as ethylene glycol in your car radiator (which too can be used in this application).
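The methanol drizzle works by depressing the hydrate formation temperature. Hammerschmidt's classic oilfield correlation gives a rough feel for the dosage effect; the sketch below is a screening estimate only, not a pipeline design tool:

```python
# Hammerschmidt correlation for hydrate suppression by an inhibitor:
#   dT = K * W / (M * (100 - W))
# where W is the inhibitor weight percent in the aqueous phase, M its
# molecular weight, and K ~ 1297 in metric (deg C) units.
def hydrate_suppression_c(wt_pct: float, mol_wt: float = 32.04, k: float = 1297.0) -> float:
    """Hydrate formation temperature depression in deg C (default: methanol)."""
    return k * wt_pct / (mol_wt * (100.0 - wt_pct))

# 20 wt% methanol in the water phase buys roughly 10 C of margin.
print(f"{hydrate_suppression_c(20.0):.1f} C")  # prints "10.1 C"
```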
Gas hydrates are large crystals of ice with methane trapped in the interstices. The cage structure resembles a soccer ball. Richard Smalley, who co-discovered this structure in carbon (a sixty-atom molecule version), shared the Nobel Prize for it, in part because finding a brand-new crystal structure of a common element is rare, and in part because carbon in this form has proven to have compelling value in the form of nano materials. Gas hydrates in the subsurface were once believed to be the next big thing in natural gas sourcing because they are ubiquitous and, according to the US Geological Survey, the total resource exceeds all other carbonaceous fuels combined. Some probably still are believers. In my opinion plentiful shale gas has shot down those dreams. Gas hydrates are also a neat party trick. Take a block of it in a shallow bowl and the seemingly innocuous ice will light up with a match.
We can conclude from all that we have seen in Texas that industry, especially a loosely regulated one, operates on probabilities. ERCOT modeling probably predicted such freezes to be infrequent and more geographically scattered, allowing them to be managed with a minimum of disruption. Not the way it turned out. Last year a high proportion of the devastating wildfires in California were found to have been triggered by downed power lines. A cost-effective solution is yet to be identified. The Lone Star State is not alone after all.
February 10, 2021 § Leave a comment
A hydrogen economy replacing the current hydrocarbon economy is fanciful at best. But displacing in chunks, that has legs. First the basics on terminology. Energy for our way of life is dominantly produced from fossil fuel, with less than 15% from other sources. That is the basis for defining the current state of affairs as the hydrocarbon economy. Hydrocarbon use results in CO2 and other emissions. Any reasonable expectation of avoiding serious global warming involves one or both of two alternatives: sequester emitted CO2 at source and/or from the air, or reduce the use of hydrocarbons. Hydrogen is seen as the route to the second means, but the hydrogen cavalry is at best only just gearing up.
When the cause of hydrogen was advanced in the past, such as in the use of hydrogen as motive power for electric car motors via fuel cells, I had been bearish. Then, as now, 95% of hydrogen was produced from natural gas. The process is known as Steam Methane Reforming (SMR). The first step reacts CH4 with steam to produce CO plus H2; the CO is then further reacted with water in the water gas shift reaction to yield CO2 and H2. The net result is the production of hydrogen and CO2; about 9.3 kg of CO2 are emitted per kg of H2 produced.
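The stoichiometry sets a floor under that emissions figure. The net SMR reaction is CH4 + 2H2O → CO2 + 4H2, so:

```python
# Stoichiometric CO2 intensity of SMR hydrogen.
M_CO2 = 44.01  # g/mol
M_H2 = 2.016   # g/mol

# Net reaction: CH4 + 2 H2O -> CO2 + 4 H2, i.e. one CO2 per four H2.
ratio = M_CO2 / (4 * M_H2)
print(f"{ratio:.1f} kg CO2 per kg H2 (stoichiometric minimum)")  # prints "5.5 kg CO2 per kg H2 (stoichiometric minimum)"
```

Real plants burn additional natural gas to supply the heat of reaction, which is how the practical figure climbs from this ~5.5 floor to around 9.3.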
Each kg of H2 contains 34 kWh of energy. A gallon of gasoline also contains about 34 kWh. Combustion of a gallon of gasoline emits about 9.1 kg CO2. What then is the allure of hydrogen to power vehicles? The answer lies in the inherent efficiency advantage of electric motors over internal combustion engines powered by gasoline. The result is that the same 34 kWh in hydrogen will permit a driving distance of 70 miles for a vehicle that would go 25-30 miles on the energy-equivalent gallon of gasoline. The other factor is that a hydrogen-powered EV will have zero tailpipe emissions, whereas all 9.1 kg of CO2 per gallon of gasoline is released at the tailpipe. The CO2 emissions from SMR are at a single location, allowing capture and disposition.
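Those numbers can be put side by side (all figures from the paragraph above; the 27.5 mpg is the midpoint of the stated range):

```python
KWH_PER_KG_H2 = 34.0         # energy in 1 kg of hydrogen
KWH_PER_GAL_GASOLINE = 34.0  # energy in 1 gallon of gasoline
MILES_PER_KG_H2 = 70.0       # fuel cell EV driving distance per kg
MILES_PER_GAL_ICE = 27.5     # midpoint of the 25-30 mile range

# Same energy in, very different miles out.
advantage = MILES_PER_KG_H2 / MILES_PER_GAL_ICE
print(f"Electric drivetrain goes {advantage:.1f}x as far per unit energy")  # prints "... 2.5x as far per unit energy"
```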
Tailpipe capture of CO2 is technically feasible. RTI International invented a process using adsorbents, with the CO2 periodically desorbed for controlled release into a canister. Part of the difficulty is in the management of the canister handling and the logistics of CO2 disposition. This nuisance factor is not unlike the issue of using urea canisters in diesel vehicles as a NOx capture means. The nuisance and expense of that option led to the infamous VW avoidance scheme. In any case, the technology has not been adopted to date.
Traction in the use of hydrogen to displace fossil fuels has been found in the use of hydrogen as a storage means in support of renewable electricity from wind and the sun. Neither of these forms matches usage load profiles. Germany, which has a renewable component of 40%, has days on which it is as high as 75% and as low as 15%. This variability dictates the need for storage mechanisms. Batteries are an expensive solution. Hydrogen is increasingly seen as the medium of choice. It may be produced by electrolysis of water using essentially waste electricity during the low-load periods. The hydrogen could then be stored for combustion for power during the high-load episodes. It could also be converted to ammonia, utilizing nitrogen from the air. Ammonia is more energetically dense, and this is a good option if transport is required. At the receiving end it could be used for a purpose such as fertilizer manufacture, burned for power or cracked to hydrogen (and nitrogen, which is released to the atmosphere) for an industrial use such as hydrogenation of vegetable oil.
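The price of using hydrogen as the storage medium is the round-trip loss. A simple accounting, with conversion efficiencies that are my illustrative assumptions rather than figures from any particular project:

```python
# Illustrative (assumed) conversion efficiencies, not project data.
ELECTROLYSIS_EFF = 0.70  # surplus electricity -> hydrogen
H2_TO_POWER_EFF = 0.55   # hydrogen -> electricity at the high-load end

surplus_kwh = 100.0  # renewable output that would otherwise be curtailed
stored_kwh = surplus_kwh * ELECTROLYSIS_EFF
recovered_kwh = stored_kwh * H2_TO_POWER_EFF
print(f"{recovered_kwh:.1f} kWh recovered from {surplus_kwh:.0f} kWh of surplus")
```

A round trip well under 50% would doom most storage schemes, but here the input is essentially waste electricity, which changes the economics.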
Europe is experimenting with putting hydrogen into natural gas pipelines. Hydrogen is a very small molecule and difficult to contain from diffusing away. It also can embrittle steel pipelines. But at 20% dilution of natural gas, the embrittlement is tolerable, and the resulting gas mixture is suitable for all purposes designed for plain natural gas. The displacement of 20% of the natural gas constitutes a nibble at the hydrocarbon economy. The energy giant Engie has piloted this in a town in France.
For a colorless, odorless gas, hydrogen certainly has a lot of color in its classification. Normal hydrogen from fossil fuel sources is considered gray. Electrolytic hydrogen using renewable electricity would be considered green. Hydrogen from natural gas fed SMR would be considered blue, if the CO2 associated were to be sequestered. Hydrogen may be produced from biogas, much as it is from natural gas. If the resulting CO2 is firmly sequestered, the hydrogen is considered green.
Hydrogen will continue to chip away at fossil fuel use. But a wholesale shift to a hydrogen economy remains aspirational*. An interesting wild card is the possibility of copious geologically sourced natural hydrogen. But even this will likely not produce a winning hand.
For more on the topic: https://www.youtube.com/watch?v=5qm3UbUrYWk
*Do you believe in magic by The Lovin’ Spoonful (1965), written by John Sebastian
January 12, 2021 § Leave a comment
January 6 will go down in infamy for the obvious, an extraordinary and unprecedented assault on the will of the people. But it was also a failed gambit. The failure of two other gambits was revealed on that day as well. Neither was in the same league, and with luck none ever will be. But they are certainly worthy of note because they are windows to human nature in the election season. In any other year, one of them would have been headline news. Until the storming of the Capitol, I used to refer to all that preceded every January certification day as the silly season. Levity does not wear well with the main event of that day. But the undercard (minor apologies for the pugilistic analogy) events can tolerate a lighter touch.
The New York Times headline on the front page of the January 7 issue assigns culpability for fomenting the mob. Here we do not do that, sticking with a long-standing practice of this blog site to attempt to steer clear of positions in political matters. We merely discuss the gambit, no matter who instigated it, which appears to have been storming the Capitol with intent to somehow overturn the results of the presidential election. Whether this was to be with the simple imposition of will or the actual destruction of the ballot boxes containing the electoral college votes, we may never discern. This had all the hallmarks of rabble rousing, as opposed to a carefully planned and executed campaign. Having said that, history is pocked with instances of rabble rousing that took on lives of their own, with more lasting effects.
The other gambit was played in Georgia in early November and the result was revealed on January 6. This was a decision to hang onto President Trump’s coattails despite his having lost the election in that state. Both Senate incumbents decided to do just that, although Kelly Loeffler was more strident, pointing out her perfect record of voting with the President. Many others across the country had ridden that strategy to victory. In fact, in something of an anomaly, the coattails grabbers performed better than the coat wearer. But both Loeffler and Perdue did not reckon with the President litigating (both literally and figuratively) his election loss, by basically asserting that the outcome could be reversed. This took away the best argument for voting for Loeffler and Perdue, which was to preserve the Republican majority in a senate with Biden as president, an essential means for tempering the new president’s agenda. As the weeks wore on, he attacked the governor and secretary of state of Georgia, both Republican, for somehow being responsible for his loss. The final straw was when he demanded support from both incumbent senators for a challenge of the electoral college votes in the January 6 certification in the joint houses of Congress. With the die already largely cast, they went along with the political version of a Hail Mary (touchdown) pass. Even that characterization is charitable given any reasonable interpretation of the Constitution. When the dust settled and the Georgia votes were counted on January 6, the same fateful day of the mob attack on the Capitol, both incumbents had been defeated in what most considered to be an upset verdict. The margins of victory were even wider than the 0.5% which could have enabled a recount.
The third gambit pales in comparison and deserves discussion principally because of the temporal coincidence of it playing out on January 6. But the gambit was conceived and announced in early 2017, shortly after Trump was elected president. It was also shortly after President Obama, literally on his way out of the door, issued an order to “permanently” ban offshore Arctic oil and gas lease sales. My opinion of that play is in a 2017 blog. For President Trump this was an attempt at showing support for the oil and gas industry. The only problem was that the smart money has known, at least since the plummet in oil prices in 2015, that the Arctic was too risky. On pure economics, let alone considering the environmental impact. I opined a few months ago that the interest in the leases would be very low. But the gambit had to be played out. Possibly nobody told the President that this could prove embarrassing; is it still a party if (almost) nobody comes?
And it was embarrassing indeed when the sealed bids were opened on that otherwise fateful January 6. Only 11 of the 22 tracts offered had bids. None of these were from super majors, majors, or large independents. All were at or near the minimum allowed in the auction. Nine of the 11 were won by a division of the state of Alaska, not an oil drilling and production entity. The other two went to very small independent oil companies. All this in the name of supporting the oil and gas industry. If only they had been consulted.
December 6, 2020 § 1 Comment
Outgoing Presidents are making a habit of throwing long passes with the Arctic football. And like all long passes, the probability of being caught is low. But they differ in character. When President Obama issued an executive order about this time in 2016, it was to “permanently” ban future lease sales in US Arctic waters. At the time I wrote that this was purely symbolic, with no real impact, because even if leases were offered for sale, this would be a party to which hardly anybody would come. When President Trump assumed office, one of his first declarations was the intent to sell leases in the Arctic National Wildlife Refuge (ANWR). While this was not a challenge to the President Obama executive order per se, ANWR being on a coastal plain and not offshore, it too was symbolic, as demonstrated by the fact that nothing happened for the rest of his term. Until now.
The Bureau of Land Management (BLM) is rushing through the process of posting tracts for leasing*. The timeline they are following is much shorter than normal, reportedly to receive the bids prior to the presidential handover. A rushed process will not properly match up tracts with buyer priorities. This alone does not augur well for a heavily subscribed sale. But the primary reason for tepid interest is the market. Oil prices are low and uncertain. Gas prices are low and certain. This must be about oil alone. Anybody buying ANWR leases in gas prone areas would have a whole lot of explaining to do.
The 2017 Bureau of Ocean Energy Management lease sale in the Gulf of Mexico was strong. Oil prices were low then and nearly as uncertain as now, not counting the Covid-19 induced depression. An important measure of oil company stock value is reserves replacement. Loosely, it is the ratio of new reserves booked to oil produced. So, they are always looking to find new reserves. But the Gulf of Mexico is well understood, with considerable seismic data and production experience. In comparison, the Arctic has a paucity of seismic data. And the results to date have been daunting. Shell walked away from the Arctic after spending a reported USD 7 billion, with not much to show for it. Admittedly, that was offshore in the Chukchi Sea, and ANWR would be different. But the hurdles will remain numerous. Environmental sensitivity and wells made expensive in part by the limited window of time for operations are just two. Financing will be hard to come by and legal challenges are almost certain to at least delay operations. This leaves just the very large oil companies with strong balance sheets and equally strong stomachs. The European headquartered ones, especially BP and Royal Dutch Shell, are very unlikely to plan an Arctic venture.
The need to increase reserves could be addressed in an unconventional manner: increased focus on tertiary recovery of oil. Simply put, conventional means usually can extract only about a third of the oil in place in the reservoir. Tertiary recovery utilizes CO2 to wring more out. The gas is introduced into the formation in a supercritical state. Supercritical CO2 enters pores effectively as if in a gas state but reacts with the oil as if in a liquid state. Being twice as soluble in oil as in water causes preferential combination with oil. This mixture is pushed to the producing well due to high pressure in the injection well. The CO2 is separated and re-injected. This process is repeated and each time more of the CO2 remains in the ground. Eventually, about 95% of the CO2 is retained in the formation and subject to the same trap mechanisms that kept the original hydrocarbons in place.
Some estimates hold that 50 billion barrels could be recovered by this method in the US at a USD 70 per barrel oil price. By comparison, ANWR is estimated to hold 15 billion barrels, and I doubt those are economical at a USD 70 oil price. If the CO2 were to be priced at USD 50 per tonne, the amount needed would cost roughly USD 11 per barrel of oil recovered. Since the finding, development and much of the lifting cost has already been incurred, this ought to be affordable at USD 70 oil. Part of the current 45Q Federal Tax Credit, which is USD 35 per tonne of CO2 for such sequestration, would also be available. Since the credit is paid to the CO2 producer, the producer, together with the USD 50 paid by the oil company, would have USD 85 per tonne available for capture. This is right in the range of feasibility of the latest capture technology.
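The arithmetic behind those figures is straightforward:

```python
# CO2-EOR cost sketch, using the figures cited above.
CO2_PRICE_USD_PER_TONNE = 50.0
CO2_COST_USD_PER_BBL = 11.0

# Implied net CO2 injected (and ultimately retained) per barrel.
tonnes_per_bbl = CO2_COST_USD_PER_BBL / CO2_PRICE_USD_PER_TONNE
print(f"{tonnes_per_bbl:.2f} tonnes CO2 per barrel")  # prints "0.22 tonnes CO2 per barrel"

# The 45Q credit goes to the CO2 producer and stacks with the sale price.
CREDIT_45Q_USD_PER_TONNE = 35.0
capture_revenue = CO2_PRICE_USD_PER_TONNE + CREDIT_45Q_USD_PER_TONNE
print(f"USD {capture_revenue:.0f} per tonne available to fund capture")  # prints "USD 85 per tonne available to fund capture"
```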
A recent life cycle study concludes that a barrel of oil from tertiary recovery represents a 37% CO2 emissions reduction compared to oil from conventional production (see reference below), provided the CO2 is sourced industrially. Arctic operations will, of course, have an additional layer of environmental risk associated with their location.
In some senses, here then is the choice for oil company strategists. On the one hand ANWR, costly and environmentally risky due to the location. Data paucity adds a layer of exploration and development risk. Uncertainty in the price of oil renders profitability in doubt. On the other hand, tertiary recovery using industrially sourced CO2. Negligible operational risk and you already own the lease. At scale it makes a serious dent in industrial CO2 emissions, the marginal costs are low per barrel of oil produced, and the surface environmental footprint is much smaller compared to new areas explored and produced.
Reason dictates that the Arctic ought to remain icy cold.
*Dream on in “Dream On” by Aerosmith (1973) written by Steven Tyler
Núñez-López V and Moskal E (2019) Potential of CO2-EOR for Near-Term Decarbonization. Front. Clim. 1:5. doi: 10.3389/fclim.2019.00005
November 30, 2020 § 2 Comments
A recent New York Times story discusses an EPA report of diesel pickup truck owners disabling emission controls on their vehicles. The scale of the deception is reported as being the emissions equivalent of putting nine million extra trucks on the road. There appears to be a cottage industry of small companies devising schemes to manipulate the emission control systems. The term cottage industry likely does not do it justice when one realizes that, according to the EPA, since 2009 more than half a million trucks have been involved. Yet, each of the providers is usually a small player. In 2019 the EPA acted against 48 companies, bringing their decadal total to 248.
The motivation for these actions is not to despoil the environment. It is to improve fuel efficiency and/or engine performance such as (higher) torque. Diesel vehicles are singularly (dis)credited with high production of NOx and particulate matter (compared to gasoline vehicles). In the case of particulate matter (PM), this comparison is strictly valid only when the measure is micrograms per cubic meter and not particle count. But that is a hair to be split another time. Mass based measurement is the only regulatory approach at present.
PM from diesel engines is captured in Diesel Particulate Filters (DPF). After a couple of hundred kilometers of operation, the filter starts clogging. This causes back pressure on the engine and reduces engine performance. Engine manufacturers set back pressure limits at which the filter must be regenerated. This is accomplished by running the engine hot, which produces more NO2, which catalytically oxidizes the carbon deposits. But if the engine duty cycle simply does not achieve these temperatures, the filter will remain clogged. Sometime thereafter, it will cease to function and will need to be replaced. This is a clear example where circumventing the DPF benefits the consumer in performance and in avoidance of the cost of premature replacement of the DPF. Enter the small player, who replaces the DPF with a straight pipe for a fee.
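The back-pressure logic can be sketched as a toy decision rule. The thresholds below are purely illustrative assumptions, not manufacturer calibration values:

```python
# Illustrative (assumed) thresholds for a DPF regeneration decision.
BACKPRESSURE_LIMIT_KPA = 20.0  # assumed clogging threshold
REGEN_TEMP_C = 600.0           # assumed exhaust temperature for soot burn-off

def dpf_action(backpressure_kpa: float, exhaust_temp_c: float) -> str:
    """Decide what the engine controller does about the filter."""
    if backpressure_kpa < BACKPRESSURE_LIMIT_KPA:
        return "normal operation"
    if exhaust_temp_c >= REGEN_TEMP_C:
        return "regenerate"  # NO2 catalytically oxidizes the carbon cake
    # Duty cycle never gets hot enough: the filter stays clogged,
    # which is the situation that tempts owners toward a straight pipe.
    return "regeneration blocked"

print(dpf_action(25.0, 250.0))  # prints "regeneration blocked"
```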
NOx is short for some combination of NO2 and NO. It is harmful to human health directly, and indirectly through ozone formation. Ozone is formed when NOx reacts with organic molecules and sunlight in the atmosphere. Ozone, often erroneously equated to smog, is a component of smog and injurious to health, with children especially vulnerable. NOx is produced when nitrogen in the combustion air is oxidized. Higher engine temperatures create more NOx. The high temperatures needed to regenerate DPFs have this undesirable effect. This is an example of the balancing act required in emission controls. One remedy for this particular trade-off, present in some automobiles, is an auxiliary heating element in the DPF, which fires up at appropriate intervals.
NOx control is addressed in one of two ways. The more foolproof one is Selective Catalytic Reduction (SCR), in which urea is injected into the exhaust. Urea decomposes into ammonia (in two possible steps), which catalytically reacts with NOx to reduce it to nitrogen. Engine performance is unaffected. But the urea canister must be replaced periodically, which consumers see as a nuisance. The system occupies space, so this tends to be in bigger engines (over 3 L displacement).
The alternative, more targeted to smaller engines, is the Lean NOx Trap. Here, the NOx is captured by adsorption onto a surface. Desorption (removal from the surface) is achieved by chemical reduction by exhaust gas components such as uncombusted diesel, carbon monoxide and hydrogen. In an efficiently running engine, these will, by design, be in short supply. Accordingly, the control system deliberately causes the engine to run rich (oxygen starved) to produce these reducing gases. While this achieves the purpose of reactivating the NOx capture, during these intervals the engine runs with lower power and is fuel inefficient. The overall fuel efficiency reduction is in the vicinity of 3-5%. As a frame of reference, this is the reduction gasoline engines see from the addition of 10-15% ethanol to the gasoline, and there is no consumer pushback there, perhaps because it is not accompanied by a performance reduction. Yet, in diesel pickup trucks, it appears some combination of that and loss of torque is responsible for the public buying avoidance schemes.
Avoidance schemes are known in the industry as “defeat devices”. The practice is so rife, it has a name. Vehicle makers have in the past cut these corners, the most infamous being the VW cheating episode. But this piece is about aftermarket interventions, which are harder to corral, in part due to the sheer number of companies involved. This proliferation is due to the ease of devising the defeat devices. While every state has somewhat different methods for emissions testing, a common one is to simply query the onboard computer for clues on functioning of the emissions system. Altering the onboard programs is enough to produce the avoidance. In other cases, the emissions are tested directly while on a chassis dynamometer (think stationary bike). With such testing, the avoidance must be more sophisticated, as it was in the VW case. Then there is the simple substitution of the SCR device with a tube. A mechanic with a reasonable modicum of competence could execute this.
The defeat devices cottage industry would not exist were there not a market for the product. The pickup truck owners simply want the performance* and may well consider the infraction to be minor. Curiously, they would not be too wrong on the DPF avoidance. The regeneration step described above is combustion of the filter cake and does produce carbon particles. However, the general public is unlikely to be aware of the intricacies of operation of the DPF and the impressions are probably grounded in the beliefs of the truck owner community.
Although more than likely aware of the illegality, truck owners probably also know that individuals are never targeted by the federal government. In many ways, this is more pain-free than a speeding infraction.
*I can’t get no satisfaction from “(I can’t get no) Satisfaction” by the Rolling Stones (1965), written by M. Jagger and K. Richards
November 30, 2020
November 19, 2020 § 2 Comments
Less than a week apart, two leading Covid-19 vaccines were announced as being more than 90% effective (federal approval requires only greater than 50%). These numbers augur well for broad-scale combat of the virus. Does this mean that a slew of other vaccines in development are also likely to emulate this success? Not necessarily; it depends on the approaches the others take. These two use a relatively new approach that took less than a year from inception to the seeming end of a Phase 3 trial, albeit one limited in size: unheard-of speed in the firmament of vaccines.
Conventional vaccines introduce inactivated pathogens which trigger an immune response. Being inactivated by heat or chemical means, they cannot cause the disease. But getting them to a safe yet effective formulation takes time. The immune system retains the memory of this invasion (acquired immunity) and, when a true disease pathogen is detected, activates the defense mechanism. Acquired immunity is retained for some period. This is also the reason that acquiring the disease can provide immunity for years in many cases. Edward Jenner, the English physician responsible for smallpox vaccination, is credited with coining the term vaccination, derived from the Latin vacca for cow and vaccinia for cowpox. Famously, infection with the relatively benign cowpox conferred immunity from the deadly smallpox disease.
SARS-CoV-2, the virus that causes the Covid-19 disease, was sequenced in early January 2020 by Chinese doctors, and the sequence was made available worldwide. The virus is about 120 nanometers in size and has an external lipid layer. Thrusting through the lipid layer are spikes, known as the spike protein (see image).
SARS-CoV-2 transmission electron microscopy image, courtesy NIAID-RML
This protein was an obvious target for the vaccine. In a conventional vaccine it would be only one of many proteins eliciting antibodies. To focus on just the spike protein, investigators turned to a messenger RNA (mRNA) approach. As the name implies, mRNA conveys the genetic message to the cell to direct synthesis of a specific protein. This specificity likely minimizes the possibility of side effects.
Here is how mRNA-based vaccines work. Investigators devised an mRNA strand which could reliably direct the production of the spike protein when injected into a cell. This protein is known as an antigen. The antigen collects on the periphery of the cell, referred to as the antigen-presenting cell. The antigens on the cell elicit an immune response (primary response) from the body. This activates cells which acquire a memory for this detection and response, known as memory cells. When the person is later infected with the SARS-CoV-2 virus and the pathogen presents, the memory cells trigger the immune response that kills the virus.
Instability of the mRNA strand is a concern at many stages in the process. It will not survive long in the body, so ensuring cell entry requires that it be encapsulated in some way. The most common means is a lipid (essentially fat) capsule, which preserves the mRNA until it reaches a cell and facilitates entry into it. Once in the cell, the mRNA strand directs the production of the antigen and then degrades, no longer a factor in the body.
In lab settings, mRNA not tailored to be more stable requires storage at -70 C. Most laboratories employing these methods have such freezers available. When the mRNA needs to be used for experiments, it is often temporarily stored on the bench top in containers with liquid nitrogen or even dry ice (solid CO2). But long-distance transport is still an issue, especially in low- and middle-income countries (LMICs). Thus, distribution to LMICs is more complicated than simply the cost of the vaccine. These countries are best served by vaccines that are stable at closer to ambient temperature. mRNA can be designed to be more thermally stable, especially through improved encapsulation. Currently, we know that the Pfizer/BioNTech vaccine requires the conventional -70 C for storage. The Moderna vaccine is stated to be stable at -20 C, and for some duration at refrigerator temperatures up to +5 C.
A dark horse that I have not seen discussed in the press is CureVac, a German firm (as is BioNTech) in Tübingen. It is expected to commence Phase 3 trials at the end of this year, whereas the other two are already seeking emergency approval. What makes CureVac distinctive is that their emphasis has always been on thermal stability. This characteristic is what triggered a (pre-Covid-19) USD 52 million investment in 2015 from the Bill and Melinda Gates Foundation, which is known to be particularly interested in health solutions for LMICs. In fact, the investment agreement requires low-cost availability of vaccines in LMICs. CureVac claims 3-month stability at +5 C and 24 hours at room temperature. The tortoise could overtake the hares, at least in LMICs.
Break on through from “Break on Through (to the other side)”, written and performed by The Doors (1967).
November 19, 2020
October 22, 2020 § 8 Comments
The future of oil has been debated for as long as I can remember. When I was an undergraduate in engineering in the early sixties, we were taught that the world would run out of oil in 30 years. Such predictions continued, with the concept of Peak Oil oft discussed. But, with the recognition of immense heavy oil reserves, and more recently with the emergence of shale oil, the discussion has shifted to the demand side.
For nearly a century all crystal ball gazing centered on sufficiency of a strategic commodity. Over the last decade or so, oil is well on its way to turning into salt*. Lest you conjure alchemical imagery, I hasten to explain that oil is merely going the way of salt. Salt used to be a strategic commodity. Canning, and later refrigeration, turned it into a useful commodity, no longer strategic. This was about the time that the era of oil began, with the discovery of Spindletop and the resultant decimation of the price of oil. The era was abetted by the demand created by mass production of economical cars by Ford, which incidentally killed the auto industry of the time: electric cars. More on the revenge later.
But the demise of oil will be preceded by a protracted hospice stay. Folks will predict X% electric cars by year Y. But that will be for new vehicles. Legacy vehicles will go a long time, especially in countries like India, a major developing market for automobiles. The electric starter was first installed in a Cadillac in 1911. I was still hand cranking our venerable Morris 8 sedan in India (with difficulty; I was 6) in 1950. On the other side of the coin, India is more amenable to conversions to electric drive, in part due to low labor cost and in part due to a way of life that wrings out every drop of value in a capital asset.
The future of oil is now being discussed relative to demand, not so much supply. Peak oil discussions are replaced by peak consumption ones. Shale oil put paid to the supply issue. Even before Covid-19 destroyed demand, a groundswell of movement toward oil alternatives for transportation fuel was present. This was driven by climate change concerns, but also to a degree by emissions such as NOx and particulate matter. But the projections on future demand depend on the tint of the glasses worn. The Organization of Petroleum Exporting Countries (OPEC) is predicting a return to pre-Covid levels of consumption by late next year. Somewhat surprisingly, the US Energy Information Administration is also singing that tune, as are some oil majors such as ExxonMobil.
Most surprisingly, however, British Petroleum (BP) is very bearish. Their projections, while scenario based, are causing them to plan a 40% reduction in their oil output by 2030. This is to be combined with a big uptick in renewable electricity production. Shares rose on the announcement. But BP has been contrarian before, along the same lines. Over a dozen years ago they announced a pronounced shift away from oil, renaming BP to stand for Beyond Petroleum. That did not go well. Particularly unhelpful to their reputation for operating in difficult environments was the oil spill associated with the massive Macondo blowout.
The future of oil is not the future of natural gas. Together they share the term petroleum, although it is imprecisely used in the parlance to stand simply for oil. They were both formed in the same way, with natural gas being the most thermally mature state of the original organisms. But in usage they are different. Oil is mostly about transport fuel and natural gas is mostly about fuel for electricity generation and the manufacture of petrochemicals, especially plastics.
The pandemic decimated transportation fuel but had much smaller effects on electricity and less again on plastics. In the post-pandemic world, natural gas will endure for a long time, while oil will be displaced steadily by liquids from natural gas and biogas, and ultimately by electricity. This, of course, excludes aircraft, which will need jet fuel for the foreseeable future. Biomass-derived jet fuel will be a consideration, but not likely a big factor.
Electric vehicle batteries costing USD 100 per kWh will be the tipping point, and we are close. At that level, the overall electric vehicle with modest range will cost about the same as a conventional one. The battery and electric motors’ cost will be offset by the removal of the IC engine, gear box, transmission, exhaust systems and the like. For a compact car, each 100 miles in range will add about USD 2500 to 3000 to the capital cost. Maintenance costs will plummet and the fuel cost per mile will be significantly less than with gasoline or diesel. To top it off, the linear torque profile typical of electric motors enables high acceleration from a stop. A progressive shift is inevitable. The revenge of the electric car.
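The arithmetic behind that tipping point can be sketched in a few lines. The USD 100 per kWh pack cost and the USD 2,500–3,000 per 100 miles of range are from the discussion above; the energy consumption, electricity rate, gasoline price and mileage figures below are illustrative assumptions, not data from this piece:

```python
# Back-of-envelope comparison of a compact EV and a gasoline car.
# Only the $100/kWh pack cost comes from the text; other inputs are
# assumed round numbers for illustration.

PACK_COST_PER_KWH = 100.0   # USD/kWh, the tipping point discussed above
EV_KWH_PER_MILE = 0.28      # assumed compact-EV consumption
ELECTRICITY_PRICE = 0.13    # USD/kWh, assumed residential rate
GASOLINE_PRICE = 2.60       # USD/gallon, assumed
GASOLINE_MPG = 35.0         # assumed compact-car mileage

# Capital cost of the battery needed for each 100 miles of range
battery_cost_per_100mi = PACK_COST_PER_KWH * EV_KWH_PER_MILE * 100

# Running (fuel) cost per mile, electric vs. gasoline
ev_cost_per_mile = EV_KWH_PER_MILE * ELECTRICITY_PRICE
gas_cost_per_mile = GASOLINE_PRICE / GASOLINE_MPG

print(f"Battery cost per 100 miles of range: ${battery_cost_per_100mi:,.0f}")
print(f"EV fuel cost:  {ev_cost_per_mile * 100:.1f} cents/mile")
print(f"Gas fuel cost: {gas_cost_per_mile * 100:.1f} cents/mile")
```

With these assumptions, 100 miles of range costs about $2,800 in battery, squarely within the range quoted above, and electricity comes in at roughly half the per-mile fuel cost of gasoline.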
The only debatable issue is the rate of change. And this is where the opacity appears in the future of oil. The main sticky bits are perceptions of the range required (and the willingness to pay for more) and charging infrastructure. The latter could be influenced by business model innovation, such as battery swapping rather than owning. But oil is here to stay for decades. Therefore, improvement in efficiency, to reduce emissions per mile, is paramount. The industry appears to understand that. When the US administration announced a drastic relaxation of the 2025 mileage standards, four major companies voluntarily agreed to a standard close to the old one. I suspect this was in part because they had already worked out the techno-economics to get there, and certainly the consumer would like the better mileage. It could also be that they had projections of electric vehicle sales that allowed fleet averages to be met. A compact electric vehicle has a gasoline-equivalent mileage of about 120. Quite an offset with even a modest fleet fraction.
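The fleet-average offset from even a modest electric fraction can be illustrated with a harmonic average, which weights vehicles by fuel consumed per mile rather than by mileage (the convention used for fleet fuel economy averages). The 120 MPGe compact-EV figure is from the text; the 38 mpg gasoline fleet and the 10% EV share are assumptions for illustration:

```python
def fleet_average_mpg(shares_and_mpg):
    """Harmonic mean of mpg figures weighted by fleet share:
    average the gallons-per-mile, then invert."""
    return 1.0 / sum(share / mpg for share, mpg in shares_and_mpg)

# All-gasoline fleet at an assumed 38 mpg
baseline = fleet_average_mpg([(1.0, 38.0)])

# Same fleet with 10% EVs at 120 MPGe (figure cited in the text)
mixed = fleet_average_mpg([(0.9, 38.0), (0.1, 120.0)])

print(f"Baseline fleet: {baseline:.1f} mpg")
print(f"With 10% EVs:   {mixed:.1f} mpg")
```

Under these assumptions, a 10% EV share lifts the fleet average by nearly three mpg without any change to the gasoline vehicles, which is the offset alluded to above.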
The oil barrel has sprung a leak. But it is likely a slow one.
October 22, 2020
*Turning Oil into Salt, Anne Korin and Gal Luft, 2009, Booksurge Publishing
October 8, 2020 § Leave a comment
Several readers of my wildfire blog suggested that I had given lightning strikes short shrift as causes of forest fires. In fact, lightning as a cause is not even mentioned because it is not in the top four causes, at least through 2015, as reported in the paper I used, which is cited at the end of this piece. The authors plot the proportion of the number of fires and of the area burned by cause of ignition in (a) the Santa Monica Mountains and (b) San Diego County. I am not reproducing the plot here for copyright reasons, but interested folks can go to the link in the citation (Syphard and Keeley 2015) below.
Most such studies concentrate more on the area burned than on the number of ignitions, presumably because that is the net effect on the public and on the environment. Lightning as a cause is present in…
September 21, 2020 § 2 Comments
California is ablaze. So are Oregon and Washington. The tally to date is 5 million acres burned, about halfway through the fire season, and well on its way to record territory. Putting that in perspective, the east coast of Australia, devastated similarly earlier this year in the Southern Hemisphere summer, closed the season with 46 million acres burned.
The statistic of greatest concern is that the intensity and scale of the fires are getting worse. Over the last thirty years, the number of fires annually has shown no discernible trend; it certainly has not gone up. But the acreage burned has, decisively. Both patterns are evident in the figure below. Five of the ten largest fires ever in California are currently active. The largest of these, the August Complex, is already at 839,000 acres and still going. The next largest, ever, was 459,000 acres, the Mendocino Complex in 2018. Labeling any of this chance, or poor forestry management, evokes imagery of the proverbial ostrich, and the placement of its head.
The average area burned (measured in hectares; a hectare is roughly 2.47 acres) has nearly doubled over this three-decade period. Nine of the ten largest fires have occurred since the year 2000. Note that this does not include the ongoing five, which certainly would be in that group, making it 14 of the 15 since 2000. Although a regression line would have uncertainty due to big annual swings, an eyeball estimate indicates a strong upward slope. If this is a predictor of the future, that future is indeed bleak and warrants a study of causes.
The recent EPA report, from which the figure was reproduced, ascribes the pattern of increased fire acreage to higher temperatures, drought, early snow melts and historically high fuel loading (which is the fire prone vegetation, including underbrush). We will examine these separately, although they may not be disconnected. But first, a comment on the pattern of numbers of fires being essentially flat. Ignition events determine numbers of fires. In California, the principal ones are arson, campfires, power lines and equipment. The equipment category comprises items such as power saws, mowers, and other operated machinery. Human behavior, absent intervention, can be expected to be constant. So, the flat profile on numbers of fires is to be expected. Interestingly, the incidences are seasonal, even, counter-intuitively, arson.
Climate change is implicated in many of the causes of increasing severity over the years. While the term has many interpretations, one generally accepted aspect is temperature rise in the atmosphere and in the oceans. The debate is not whether this happens, but how fast it does. Also generally accepted (to the extent any climate change causality is generally accepted) is that oceanic temperature rise causes increased severity in the El Niño phenomenon in the Pacific Ocean, which is responsible for catastrophic droughts. These are accompanied by drenching rains in other parts of the world in the same year. Both disturbances are extreme deviations from the norm, with resultant impact on vegetation and the way of life.
Atmospheric temperature rise can also be expected to change the proportion of rain and snow in precipitation. Lighter snowfall can be a result, as also early snow melts. Both are in the EPA list noted above.
California, being generally arid, gets most of its water supply from melting snow. While less snow in a given year is certainly a drought indicator, the phenomenon most studied is that of the timing of the snow melt. Data from four decades commencing in 1973 conclusively demonstrated that burn acreage was strongly correlated with the earliness of the snow melt (Westerling 2016). Decadal comparisons show that the fire seasons in 2003-2012 averaged 40 more days than the seasons in 1973-1982. Fires in the later decade were more severe. Large fires, defined as covering greater than 400 hectares, burned for 6 days on average in the early decade and for more than 50 days in 2003-2012.
Power lines deserve special mention. Falling power lines were blamed for several fires in 2017 and 2018. The utility has accepted blame and is in bankruptcy. Trees falling on power lines snapped the poles. The tree roots, finding uncertain purchase due to drought conditions, were no match for the Santa Ana winds or any other storm sourced shoves. Those same drought conditions caused the underbrush to be dry. Power lines are usually not insulated. Sparking wires on dry underbrush and the rest is incendiary history. A poster child for distributed power.
The wildfire future is indeed bleak. Climate change retardation is necessary. But it may not be sufficient in the shorter term. We need a reincarnation of Smokey to change human behavior to minimize the ignition events.
Westerling, A. L. (2016) ‘Increasing western US forest wildfire activity: sensitivity to changes in the timing of spring’, Philosophical Transactions of the Royal Society B: Biological Sciences, 371: 20150178. http://dx.doi.org/10.1098/rstb.2015.0178
Vikram Rao September 21, 2020