Advanced SMRs: No Fuss, (Almost) No Muss
December 27, 2024
The potentially catastrophic condition that a nuclear reactor can encounter is overheating leading to meltdown of the core. Conventional reactors rely on active intervention, human or automatic, to prevent this. These interventions can go wrong, as they did in the Three Mile Island accident. Small modular (nuclear) reactors (SMRs) are designed to cool down passively (automatically, without intervention) in the event of an upset condition. SMR designs achieve this in different ways, but all fall in the class of intrinsically safe, to use terminology from another discipline. This is the no fuss part.
The muss, which is harder to deal with, entails the acquisition and use of fissile nuclei (nuclei which can sustain a fission reaction), and then the disposition of the spent fuel. Civilian reactors use natural uranium enriched in fissile U-235 to as much as 20%. At concentrations greater than that, a bomb could theoretically be constructed. The most common variant, the pressurized water reactor (PWR), uses 3 to 6% enrichment. Sourcing enriched uranium is another issue: Russia currently supplies over 35% of this commodity to the world. The US invented the technology but imports most of its requirement.
In all PWRs and most other reactors, nearly 90% of the energy is still left unused in the spent fuel (fuel in which the active element is reduced to impractical concentrations) in the form of radioactive reaction products. Recycling could recover the values, but France is the only country doing that. The US prohibited that until a few decades ago, for fear that the plutonium produced could fall into the wrong hands. Geological storage is considered the preferred method but runs into local opposition at the proposed sites, although an underground site in Finland is ready and open for business.
One class of reactors that defers the disposal problem, potentially for decades, is the breeder reactor. The concept is to convert a fertile nucleus such as U-238 (the bulk of natural uranium) or relatively abundant thorium (Th-232) to fissile Pu-239 or U-233, respectively. The principal allure, beyond the low frequency of disposal, is that essentially all the mineral is utilized without expensive enrichment. In both cases, the fuel being transported is more benign, in not being fissile. One variant uses spent fuel as the raw material for fission; the reactor is the recycling means.
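For reference, the two breeding chains involved are textbook nuclear physics (not specific to any vendor); each neutron capture is followed by two beta decays with half-lives on the order of minutes to days:

```latex
% Uranium-plutonium breeding chain
{}^{238}\mathrm{U} + n \;\to\; {}^{239}\mathrm{U} \;\xrightarrow{\beta^-}\; {}^{239}\mathrm{Np} \;\xrightarrow{\beta^-}\; {}^{239}\mathrm{Pu}

% Thorium-uranium breeding chain
{}^{232}\mathrm{Th} + n \;\to\; {}^{233}\mathrm{Th} \;\xrightarrow{\beta^-}\; {}^{233}\mathrm{Pa} \;\xrightarrow{\beta^-}\; {}^{233}\mathrm{U}
```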
At a recent CERAWeek event, Bill Gates drew attention to TerraPower, an SMR company that he founded. For the Natrium (Latin for sodium) offering, which combines the original TerraPower traveling wave reactor (TWR) technology with that of GE Hitachi, the coolant is liquid sodium (they are working on another concept which will not be discussed here). Using molten metal as a coolant may appear strange, but the technical advantage is excellent heat transfer at essentially ambient pressure. The efficacy of this means was proven as long ago as 1986, when, in the sodium-cooled Experimental Breeder Reactor-II at Idaho National Laboratory, all cooling pumps were shut down, as was the power. The reactor shut itself down within minutes, with natural convection in the molten metal carrying away the residual heat. That reactor operated for 30 years, so that aspect of the technology is well proven. TerraPower’s 345 MWe Natrium reactor, which broke ground in Wyoming earlier in 2024, is not technically a breeder reactor, although it utilizes fast neutrons; this is helped by the coolant being Na, which slows neutrons down less than does the water coolant in PWRs. Natrium uses uranium enriched to up to 19% as fuel.
Natrium has two additional distinguishing features. The thermal storage medium is a nitrate molten salt, another proven technology in applications such as solar thermal power, where it provides power when the sun is not shining. For an SMR, the utility would be in pairing with intermittent renewables to fill the gaps. TerraPower’s business model appears to be to deliver firm power at the rated capacity of 345 MWe and use the storage feature to deliver as much as 500 MWe for over 5 hours. In general, the unit could be load following, meaning that it delivers power in sync with demand at any given time.
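A back-of-the-envelope sketch of what those figures imply, using only the numbers quoted above (the actual thermal storage capacity, round-trip efficiency, and exact boost duration are not stated here, so the 5.5-hour figure is an assumption):

```python
# Rough arithmetic on the Natrium storage claim, using only the figures
# quoted above: 345 MWe rated output, up to 500 MWe for over 5 hours.

rated_mwe = 345     # continuous rated electrical output
boost_mwe = 500     # peak electrical output while discharging storage
boost_hours = 5.5   # assumed stand-in for "over 5 hours"

# Electrical energy delivered above the rated level during one boost period
extra_mwh = (boost_mwe - rated_mwe) * boost_hours
print(f"Extra electrical energy per boost cycle: {extra_mwh:.0f} MWh")
# Roughly 850 MWh of dispatchable electrical energy implied by the quoted figures
```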
The most distinctive feature of the Natrium design is that the nuclear portion and all else, including power generation, are physically separated on different “islands”. This is feasible in part because the heat from the molten sodium is transferred by non-contact means to the molten salt, which is therefore radiation free when pumped to the power generation island. The separation of nuclear and non-nuclear construction ought to result in reduced erection (and demobilization) time and cost. Of course, sodium-cooled reactors are inherently less costly to build because they operate at near-ambient pressure, so the reactor walls can be thinner than they would be for an equivalent PWR.
The separation of the power production from the reactor ought also to lend itself to the reactor being placed underground and less susceptible to mischief. This is especially feasible because fuel replacement ought not be required for decades. This last is the (almost) no muss feature. Disclaimer: to my knowledge, TerraPower has not indicated they will use the underground installation feature.
The “almost” qualifier in the “no muss” is in part because, while the fuel is benign for transport, the neutrons for reacting the U-238 are most easily created using some U-235. Think pilot light for a burner. Natrium uses uranium enriched to 16-19% U-235. However, as expected for a fast reactor, more of the charge is burnt; Natrium reportedly produces 72% less waste. These details underline the fact that, their other attributes notwithstanding, SMRs do produce spent fuel for disposal, although less frequently in some concepts, especially breeders. This is the other reason for the “almost” qualifier.
As in all breeders, no matter what the starting fuel is, additional fuel could in principle be depleted uranium. This is the uranium left over after removal of the U-235, and it is very weakly radioactive. Nearly a ton of it was used in each of the old Boeing 747s for counterweights in the back-up stabilization systems. It was also used (probably still is) in anti-tank munitions because the pyrophoricity of uranium causes a friction-induced fire inside the tank cabin after penetration. Apologies for the ghastly imagery, but war is hell.
Advanced SMRs could play an important role in decarbonization of the grid. My personal favorites are those that use thorium as fuel, such as the ThorCon design being launched in Indonesia. Thorium is safe to transport, relatively abundant in countries such as India, and the fission products do not contain plutonium, thus avoiding the risk of nuclear weapon proliferation.
As with most targets of value, we must follow the principle of “all of the above”*.
Vikram Rao, December 26, 2024
*All together now, from All Together Now, by The Beatles (1969), written by Lennon-McCartney
AI Will Delay the Greening of Industry
October 31, 2024
Artificial Intelligence (AI), and its most recent avatar, generative AI, holds promise for industrial efficiency. Few will doubt that premise. How much, how soon, may well be debated. But not whether. In the midst of the euphoria, especially the exploding market cap of Nvidia, the computational lynchpin, lurks an uncomfortable truth. Well, maybe not truth, but certainly a firmly supportable view: this development will delay the decarbonization of industries, especially clean energy alternatives such as hydrogen, and the so-called hard-to-abate commodities, steel and cement.
The basic argument is simple. Generative AI (Gen AI) is a power hog. A query handled by Gen AI uses nearly 10 times the energy of the same query on a conventional search. This is premised upon an estimate made on an early ChatGPT model, wherein the energy used was 2.9 watt-hours for a query that uses 0.3 watt-hours on a Google search. The usage gets worse when images and video are involved. However, these numbers will improve, for both categories. Evidence for that is that just over a decade ago, data centers were the concern in energy usage. Dire predictions were made regarding swamping of the electricity grid. The power consumption in data centers was 194 terawatt-hours (TWh) in 2010. In 2018 it was 205 TWh, a mere 6% increase, despite compute instances increasing by 550% (Masanet et al., 2020)¹. The improvement was both in computing efficiency and power management.
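A quick sketch of the arithmetic behind those figures (all numbers are the estimates cited above, not measurements):

```python
# Arithmetic behind the estimates cited above.

google_wh_per_query = 0.3    # conventional Google search, watt-hours per query
chatgpt_wh_per_query = 2.9   # early ChatGPT model, watt-hours per query
ratio = chatgpt_wh_per_query / google_wh_per_query
print(f"Gen AI / conventional search energy ratio: {ratio:.1f}x")   # ~9.7x

# Data center totals (Masanet et al., 2020)
energy_2010_twh, energy_2018_twh = 194, 205
energy_growth = energy_2018_twh / energy_2010_twh   # ~1.06, i.e. a ~6% increase
compute_growth = 6.5   # "increasing by 550%", read here as roughly a factor of 6.5
print(f"Energy growth 2010-2018: {100 * (energy_growth - 1):.0f}%")
print(f"Implied energy per compute instance: ~{compute_growth / energy_growth:.0f}x lower in 2018")
```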
More of that will certainly occur. Nvidia, the foremost chipmaker for these applications, claims dramatic reductions in energy use in forthcoming products. The US Department of Energy is encouraging the use of low-grade heat recovered from cooling the data centers. A point of clarification on terminology: the cloud has similar functionality to a data center. The difference is that data centers are often physically linked to an enterprise, whereas the cloud is in a remote location serving all comers. We use the terms interchangeably here. Low-grade heat is loosely defined as heat at temperatures too low for conventional power generation. However, it may be suitable for processes such as desalination with membrane distillation and the regeneration of solvents used in direct air capture of carbon dioxide.
Impact on De-carbonizing Industry
The obvious positive impact will be on balancing the grid. The principal carbon-free sources of new electricity are solar and wind. Each of these is highly variable in output, with capacity factors (the ratio of the energy actually delivered to what would be delivered running continuously at rated capacity) of less than 30% and 40%, respectively. The gaps need filling, each gap filler with its own vagaries. AI will undoubtedly be highly influential in optimizing the usage from all sources.
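As a generic illustration of what those capacity factors mean for delivered energy (not specific to any project):

```python
# Generic illustration of capacity factor: energy delivered in a year by
# 1 MW of nameplate capacity at the capacity factors quoted above.

HOURS_PER_YEAR = 8760

def annual_mwh(nameplate_mw: float, capacity_factor: float) -> float:
    """Energy (MWh) delivered in a year at a given average capacity factor."""
    return nameplate_mw * capacity_factor * HOURS_PER_YEAR

for label, cf in [("solar (<30%)", 0.30), ("wind (~40%)", 0.40), ("firm 24/7 source", 1.00)]:
    print(f"{label:>18}: {annual_mwh(1, cf):>6.0f} MWh per MW of nameplate")
# A megawatt of solar or wind delivers well under half the annual energy of a
# megawatt of firm capacity, and not necessarily when it is needed.
```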
The obverse side of that coin is the increasing demand for electricity by the data centers supporting AI of all flavors. The Virginia-based consulting company ICF predicts usage increasing by 9% annually from the present to 2028. Many data center owners have announced plans for all energy used to be carbon-free by 2030. Carbon-free electricity capacity additions are primarily in solar and wind, and each of these requires temporal gap filling. Longer-duration gaps (over 10 hours) are dominantly filled by natural gas generators. A major effort is needed to enable the scaling of carbon-free gap fillers, the most viable of which are innovative storage systems (including hydrogen), advanced geothermal systems, and small modular reactors (SMRs).
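Compounding that projection gives a sense of the scale (the baseline year is taken as 2024, an assumption):

```python
# Compounding the ICF projection quoted above: 9% annual growth in data
# center electricity usage, assuming 2024 as the baseline year.

growth_rate = 0.09
years = 2028 - 2024
cumulative = (1 + growth_rate) ** years
print(f"Cumulative growth over {years} years: {100 * (cumulative - 1):.0f}%")
# Roughly 40% more data center electricity demand by 2028 on this projection
```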
The big players in cloud computing have recognized this. Google is enabling scaling by purchasing power from the leading geothermal player Fervo Energy and is also doing the same with Hermes SMRs made by Kairos Power. An interesting twist on the latter is that the Hermes SMR is an advanced reactor of the class known as pebble bed reactors, using molten salt cooling (as opposed to water in conventional commercial reactors). It uses a unique fuel that is contained in spheres known as pebbles. The reaction products are retained in the pebble by a hard coating. This is not the place to discuss the pros and cons of the TRISO fuel used, except to note that it utilizes high-assay low-enriched uranium (HALEU), much of which is currently imported from Russia. Google explicitly states that part of the motivation is to encourage the scaling of SMRs. This is exactly what is needed², especially for SMRs, whose promise of lower-cost electricity is largely premised upon economies of mass production replacing economies of scale of the plant.
Microsoft has taken the unusual (in my view) step of contracting to take the full production from a planned recommissioning of the only functional reactor at the Three Mile Island (conventional) nuclear facility. In my view, conventional reactors are passé and the future is in SMRs. The most recent new conventional ones commissioned are two in Georgia. The original budget more than doubled, and the plants were delayed by 7 years. Par for the course for nuclear plants. Microsoft is certainly aware of the importance of SMRs, in part because founder Bill Gates is backing TerraPower, with an advanced breeder-type reactor design using liquid sodium as the coolant and molten salt for storage. The “breeder” feature³ involves the creation of fissile Pu-239 when neutrons from an initial core of enriched uranium collide with U-238 in the surrounding depleted-uranium blanket. The reactions are self-sustaining, requiring no additional enriched uranium, and the operation can be designed to run for over 50 years without intervention. Accordingly, it could be underground. The design has not yet been permitted by the NRC but holds exceptional promise because the fuel is essentially depleted uranium (uranium from which most of the fissile U-235 has been removed) and the issue of disposal of spent fuel is largely deferred.
Impact on Green Hydrogen, Steel and Cement
Hydrogen as a storage medium, alternative fuel, and feedstock for green ammonia has a lot of traction. One of the principal sources of green hydrogen is electrolysis of water. But it is green only if the electricity used is carbon-free.
Similarly, steel and cement are seeking to go green because collectively they represent about 18% of CO2 emissions. Cement is produced by calcining limestone, and each tonne of cement produced causes about a tonne of CO2 emissions, nearly half of that from the fuel used. Electrically heated kilns are proposed, using carbon-free electricity. Similarly, each tonne of steel produced causes emission of nearly 2 tonnes of CO2. A leading means for reducing these emissions is the use of hydrogen, instead of coke, to reduce the iron oxide to iron. Again, the hydrogen would need to be green. Also, only high-grade iron ore is suitable for this route, and it is in short supply worldwide. A recent innovation drawing considerable investor interest is electrolytic iron production, which can use low-grade ore. But for the steel to be carbon-free, the electricity used must be as well.
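A rough sketch of what those per-tonne figures imply (using only the round numbers quoted above; global production tonnages are not considered here):

```python
# The round emission intensities quoted above, per tonne of product.

co2_per_t_cement = 1.0   # ~1 tonne CO2 per tonne of cement
fuel_share_cement = 0.5  # "nearly half" of cement emissions come from the fuel burned
co2_per_t_steel = 2.0    # ~2 tonnes CO2 per tonne of steel

# Portion of cement emissions that electrically heated kilns on carbon-free power
# could in principle eliminate; the remainder is process CO2 from calcining limestone.
addressable = co2_per_t_cement * fuel_share_cement
print(f"Cement: ~{addressable:.1f} t CO2 per tonne addressable by electric kilns")
print(f"Steel:  ~{co2_per_t_steel:.1f} t CO2 per tonne, targeted by hydrogen or electrolytic reduction")
```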
The world is increasingly electrifying. This runs the gamut from electrification of transportation to cryptocurrency to decarbonization of industrial processes. All of these either require, or aspire to have, carbon-free sources of electricity. Now, AI and its Gen AI variant are adding a heavy and increasing demand. Many of these uses share a common trait: the need for electricity 24/7/365. In recognition of the temporal variability of solar and wind sources, the big players are opting for firm carbon-free sources such as geothermal and SMRs. That is the good news, because they will enable the scaling of a nascent industry. The not-so-good news for all the rest is that these folks have deep pockets and are tying up supply with contracts. It remains to be seen how a startup in green ammonia or steel will compete for carbon-free electricity.
AI could well push innovation in industrial de-carbonization to non-electrolytic processes.
Vikram Rao
October 31, 2024
1 Masanet, E., et al. (2020). Recalibrating global data center energy-use estimates. Science, 367(6481), 984–986.
2,3 Pages 53 and 12, respectively, of https://www.rti.org/rti-press-publication/carbon-free-power
Fukushima and Beyond
April 27, 2011

[Figure omitted; source: Areva]
This is fairly representative of the discussions at the Breakfast Forum held on April 21.
The Fukushima Daiichi disaster is now classed with Chernobyl in magnitude. It is clear that the single most significant cause of the radiation leakage was the power blackout. The cascade of events began with the tsunami, which breached the seawall. Tsunamis are classed by the height of the wave upon shore impact. This one was judged to be 46 ft, and the breach occurred because the seawall was designed for 19 ft. Two previous tsunamis, in 1933 and 1896, had reportedly been substantially higher than this one. The numbers themselves may be somewhat suspect because these measurements are not standardized, and certainly were not a century ago. Nevertheless, the seawall appears to have been underdesigned.
The flooding caused what is known as a station blackout: no power of any sort. The primary reason for this appears to be that the backup diesel generators were in the basement and got flooded. The reactor had in fact been shut down well before the wave hit. But nuclear reactors continue to generate heat after shutdown, although this progressively decreases. Cooling water is required, in the absence of which the fuel melts and releases radioactivity. Due to the extended power failure the core sustained damage. The resulting reaction of the overheated fuel cladding with water produced free hydrogen. The systems designed to prevent explosion of this hydrogen did not function, in part due to the lack of power.
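As a rough illustration of why cooling remains essential after shutdown, a commonly used engineering fit for decay heat (the Way-Wigner approximation; a sketch for illustration only, not a safety calculation) gives numbers like these:

```python
# Decay heat after shutdown, per the Way-Wigner engineering approximation.
# Illustration only; real analyses use detailed isotope inventories.

def decay_heat_fraction(t_after_s: float, t_operation_s: float) -> float:
    """Decay power as a fraction of the pre-shutdown operating power."""
    return 0.066 * (t_after_s ** -0.2 - (t_after_s + t_operation_s) ** -0.2)

one_year = 365 * 24 * 3600  # assume roughly a year of prior full-power operation
for label, t in [("1 minute", 60), ("1 hour", 3600), ("1 day", 86400), ("1 week", 7 * 86400)]:
    pct = 100 * decay_heat_fraction(t, one_year)
    print(f"{label:>8} after shutdown: ~{pct:.1f}% of operating power")
# A few percent of full power in the first minutes, and still a fraction of a
# percent days later: many megawatts of heat for a large power reactor.
```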
The power loss was the single weakest link in the chain. The significance of this to the US situation is that spent fuel is stored in a water-cooled environment at 80 locations, and these are largely already at full capacity. This is occasioned by the policy decision not to store at Yucca Mountain. So, while tsunami-mediated flooding is not likely on the mainland, power loss due to other reasons, including sabotage, cannot be ignored.
Policy Implications: Each new disaster increases the understanding of some aspect of the cause of catastrophic events. Corrective actions are taken, and the next one likely has a different root cause. This does raise a question regarding the futility of attempting to prevent Black Swan or Perfect Storm (take your pick of popular descriptors) events. The generally held belief is that these are the results of combinations of very low probability events, so further decreasing the probability of each is possibly not terribly productive. But doing nothing in the face of calamity is simply not an option. Besides, in the case of Macondo, certain measures have emerged which definitely will improve overall safety. In this one, a clear finding with broad-scale implications is the loss of cooling and its implications for backup power and other safeguards. The applicability to spent fuel storage is direct and decidedly important. Were it not for this incident, the public certainly would have seen the spent fuel repository debate as relegated to the cognoscenti. Now the risks of distributed storage can be communicated to them with simplicity and accuracy. Maybe, just maybe, this particular ostrich will raise its head out of the sand and we will get a national plan that makes economic and environmental sense, and yet is responsive to nuclear proliferation concerns.
The time span between major energy-related incidents was 25 years in this case and 30 years for deepwater oil spills. One cannot help but wonder whether complacency is a factor. Certainly in the case of the Macondo spill, industry experts have acknowledged this as a factor. One theory advanced in our Breakfast Forum was that of worker training and longevity on the job. The Navy was cited as an organization that deals with reactors routinely but does so with a workforce that is exceedingly well trained and operates with strict safeguards. The energy industry tends to be cyclical, and profit motivation is definitely in play, as repeatedly alleged with regard to BP in the Macondo incident. Regulation, preferably self-imposed by industry, could address the matter.
Of interest is the observation to date that the earthquake per se did not damage the reactor, despite the fact that at magnitude 9 it is one of the largest ever recorded. California can take some measure of comfort, one assumes.
Consequences to Power Production: Germany has already reacted by withdrawing the permits to extend the lives of about 8 reactors, a reversal of an earlier decision. Switzerland is stopping issuance of new permits. Any country that takes measures such as these is left with few choices. The principal one is natural gas generation. In the case of Germany, this means increased reliance on Russian gas or LNG imports. Also, replacing nuclear with gas creates a carbon deficit. All previous scenarios for carbon mitigation relied heavily on nuclear as a zero-carbon source. To the extent that this is seriously compromised, the low-carbon future targets are in even greater jeopardy.
Wind is the leading candidate to provide a carbon credit. It is closer than solar to achieving parity with conventional sources. One feature in its favor is that it is highly modular; production can commence quickly even as more capacity is being added, provided the delivery infrastructure is in place. Aside from the fact that it too is buffeted by environmental and aesthetic opposition, the diurnal aspect appears to limit the total contribution in a given area. Many believe the cap to be around 20%, but credible studies supporting a particular number are not in evidence. Also, we don’t know the extent to which new storage measures and the smart grid could ameliorate this drawback.
The nuclear option faces an interesting dilemma. It is already burdened by high capital costs and long lead times to first production, and the additional perceived risk is bound to increase the discount rate, raising the cost of financing even more. Purely from the standpoint of economics, extending the permit life of existing reactors, albeit with improvements, will be the preferred choice. The irony here is that this will perpetuate older, and presumably somewhat less safe, technology.
All energy comes with a price. It is a question of choice. You can’t leave it, so best learn to love it.
Fukushima Fallout
April 12, 2011
Some preliminary thoughts as prelude to our upcoming Breakfast Forum
The Fukushima Daiichi disaster will undoubtedly have a marked effect on the energy policies of nations. There is something about nuclear fission accidents that evokes strong fears out of proportion with the actual threat to human well-being. People with anti-nuclear views will be emboldened, as has already happened in Germany.
Consider the German situation: a significant move away from nuclear is only possible with massive new natural gas-based capacity. This will apply elsewhere as well, as discussed later. Natural gas replacing coal gives a net improvement in carbon emissions; decidedly not so when it replaces nuclear. So, carbon mitigation targets will have to be met in other ways. The country has already placed a big bet on solar, but with programmed reductions in subsidies, the future is increasingly cloudy. The true elephant in the room is Russian gas. Further reliance on gas for power means increased reliance on either Russia or LNG imports.
An LNG Revival: If one builds on the premise that in the short term a nuclear future will at least be rendered bleaker, the only fast-response alternative is natural gas. Coal has a longer lead time and makes the carbon emissions situation decidedly worse unless carbon sequestration is accomplished. A scant five years ago, a massive shift from nuclear to gas would have been untenable because of the price explosion the spike in demand would have brought on. Today we know that U.S. gas supplies are abundant, and LNG originally destined for the U.S. may now be directed to countries such as Germany. Japan itself, although seemingly committed to a strong nuclear future, will be a big purchaser of LNG in the short term.
The sudden draw on natural gas supplies could have interesting consequences. As we previously posited, U.S. natural gas prices will stay in a band between $4 and $6.50 (per million BTU), with excursions to $8, for decades, due to the unique attributes of shale gas. The demand increase discussed here is unlikely to materially change that. But gas prices in Europe and Japan, to name just two, will undoubtedly see a sustained uptick. U.S. gas interests will therefore find a lucrative LNG export business hard to pass up. While production costs are not as low as in Qatar or Iran, the demand will likely support all sources. Also, western companies constructing LNG trains will be winners.
European shale gas exploitation will also pick up. The importance of this resource in reducing reliance on Russia has just escalated. We can also foresee increased efforts to exploit those conventional gas resources which are currently dormant due to high content of carbon dioxide (for example, in Malaysia), nitrogen (for example, in Saudi Arabia), or hydrogen sulfide. All of these require improvements in technology.
Effect on Renewables: Despite the initial flight to gas, the net effect on renewables will be positive, provided the world continues to believe that global warming due to carbon emissions is a concern. This is primarily because the replacement of nuclear with gas has a negative effect on carbon emissions, and means to ameliorate this will be ever more important. The need will put increasing pressure on enablers such as effective storage. In the near term, wind should be the winner because it is closer than solar to parity with conventional production costs. So a massive scale-up is feasible, but it is hampered by diurnality. Analysts believe that some wind-heavy parts of Europe are maxed out; a greater fraction from wind appears not easy to assimilate. Smarter grids allowing for better load leveling, and cost-effective storage, will take on greater urgency. An interesting possibility is that distributed power, including combined heat and power, may acquire greater currency. Policies governing utilities will need adjustment.
In fairly short order, the Macondo oil spill and the Fukushima Daiichi disaster have brought into focus the downsides of two major sources of energy. In each case, the reactions have been peremptory, and the voices against offshore drilling and nuclear energy loud. The nuclear substitute of shale gas has organized opposition on environmental grounds. Wind is buffeted by aesthetic arguments. Lost in the rhetoric is the realization that it is always going to be about choice; picking one’s poison, as it were.
Energy: we can’t live without it so we must learn to live with it.