MIT Natural Gas Report Glosses Over Environmental Issues

July 1, 2010

MIT’s most recent report on energy, on the Future of Natural Gas, follows similar reports on coal and nuclear energy.  It is co-edited by Ernest Moniz and Tony Meggs, the latter of whom recently left BP, where he was CTO.  As reported recently in Forbes, the report emphasizes the role of shale gas in enabling natural gas to substitute for coal.  The authors see this as a transitional strategy toward a low-carbon future.  We agree and have expressed similar ideas in the Director’s Blog.

However, the report is surprisingly reticent about the environmental issues facing shale gas exploitation.  While we believe these issues are indeed tractable, they merit much more discussion than they were given.  Accordingly, we repair some of that omission here.

The most significant issues center on three matters:  fresh water withdrawals, flow back water and collateral issues, and produced water handling and disposal.

Fresh Water Withdrawals and Flow Back Water:   A typical well uses between 3 and 5 million gallons of water, and industry practice has been to use fresh water as the base for fracturing fluid.  The water that returns to the surface after the fracturing step is known as flow back water.  Shale operations are unusual in that only about a quarter to a third of the injected water returns; the rest stays in the formation.  The flow back water is also usually more saline than the injected water, so in principle it cannot be re-used.

Handling salinity is the first step toward water conservation.  The key is the ability of the fracturing fluid to tolerate some level of chlorides.  Recent research has shown that this is not only possible but can be beneficial: the chlorides actually stabilize the clay constituents of the shale and improve production, although companion chemicals such as friction reducers need to be modified.  This has two implications for water withdrawals.  First, after some measure of treatment, the flow back water should be re-usable.  But because not all of the water returns, withdrawals for make-up water will still be necessary.  This is where the second implication comes in: moderately saline water from another source could serve as make-up water, since salinity is tolerable.  The most important consequence is that flow back water could, over time, be completely re-used, at which point its discharge ceases to be an issue.
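The water balance above can be sketched in a few lines.  The job volume and flow back fraction below are illustrative assumptions taken from the ranges quoted in this post, not field data:

```python
# Rough water balance for one fracturing job with full flow back re-use.
# 4 million gallons and a 30% return fraction are illustrative assumptions
# drawn from the ranges in the text (3-5 MM gal, one quarter to one third).

def water_balance(total_gallons, flowback_fraction):
    """Split the injected volume into re-usable flow back water and the
    volume retained in the formation (i.e. make-up water needed next job,
    assuming all flow back is treated and re-used)."""
    flowback = total_gallons * flowback_fraction
    retained = total_gallons - flowback
    return flowback, retained

flowback, makeup = water_balance(4_000_000, 0.30)
print(f"Re-usable flow back water: {flowback:,.0f} gal")
print(f"Make-up water needed:      {makeup:,.0f} gal")
```

Even with complete re-use, roughly two thirds to three quarters of each job must still come from a make-up source, which is why tolerance for moderately saline make-up water matters.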

So, now let us discuss numbers.  In current practice the tolerance for chlorides is likely about 40,000 ppm.  Flow back water with higher salinity will need to be desalinated to some degree, or diluted with fresh water.  In some parts of the country this may be viable.  Another option could well be sea water, where that is the water of convenience.  Sea water contains roughly 19,000 ppm chlorides (about 35,000 ppm total dissolved salts), which is already within the range of acceptability, with the possible removal of some minor constituents.  Finally, saline aquifers are a potential source.  These are in great abundance, with variable salinities.  Saline water wells drilled as companions to the gas wells are a likely option in areas where fresh water withdrawals compete with agriculture or other endeavors.  In general, if the shale gas industry can utilize water unsuited to agriculture and human consumption, it will be seen in a completely different light.
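The dilution option is a simple mass balance.  A minimal sketch, in which the 40,000 ppm tolerance comes from the text but the flow back and fresh water concentrations are assumptions for illustration:

```python
# Mass balance for blending saline flow back water with a low-chloride
# diluent to meet a target chloride concentration (ppm treated as mass
# fractions). The 40,000 ppm target is from the text; the 80,000 ppm
# flow back and 100 ppm fresh water figures are illustrative assumptions.

def dilution_volume(v_flowback, c_flowback, c_diluent, c_target):
    """Volume of diluent needed so the blend meets the chloride target."""
    if c_flowback <= c_target:
        return 0.0  # already usable, no dilution required
    if c_diluent >= c_target:
        raise ValueError("diluent itself exceeds the chloride target")
    # Solve (v1*c1 + v2*c2) / (v1 + v2) = c_target for v2
    return v_flowback * (c_flowback - c_target) / (c_target - c_diluent)

v = dilution_volume(1_000_000, 80_000, 100, 40_000)
print(f"Fresh water needed per million gallons of flow back: {v:,.0f} gal")
```

Note that halving the concentration roughly doubles the total volume handled, which is why desalination or a tolerant fluid chemistry is preferable to dilution at high salinities.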

Produced Water

Water associated with the gas is produced at some stage of recovery, usually towards the end of hydrocarbon production.  In some cases early production occurs because the fractures infiltrate an underlying saline water body, which is often present.  Whether from connate water or the water layers below, produced water will be very saline, in part because of the age of the rock.  Disposal of this water is a major issue, especially in New York and Pennsylvania, and can cost upwards of $10 per barrel, when it is possible at all.  Residents are highly concerned about illegal discharge.

The treatment of produced water represents a significant business opportunity.  Several outfits are developing forward and reverse osmosis schemes for desalination.  Others are working on bacteria eradication, heavy metal removal and the like, using methods such as membrane filtration and ion exchange.  Some of these are already in service on a limited basis.

Produced water offers the promise of being usable as make-up water after modest treatment.  The salinity may be directly tolerable, but bacteria would need to be removed prior to re-use, because many of them generate hydrogen sulfide downhole, which makes the gas less valuable and corrodes equipment.

Contamination of Drinking Water

There have been anecdotal reports of well water contamination by gas, most recently sensationalized by a documentary.  The popular literature offers two hypotheses for this phenomenon.  One is the migration of cracks from the fracturing operation up from the reservoir to the water body.  The other is gas leakage from the well.

Hydraulic fracture cracks will not propagate the significant distances needed to reach the aquifers; were they inclined to do so, they would heal under the earth’s closure stresses.  In terms of distance, the closest fresh water aquifers are about 5000 ft. and 3000 ft. away for the Barnett and the Marcellus, respectively.  So this mechanism really is not likely.

Gas leakage from the well is preventable if the well is drilled and completed correctly.  A fundamental feature of regulation has always been to design for isolation of fresh water in all petroleum exploitation, not just in shale.  Between the produced fluids and the aquifer lie two layers of steel casing encased in cement.  The cementing operation is designed to prevent fluid migration; tests are run to ensure the competence of the cement job, and remedies are available for shortcomings.  At these shallow depths the operation is extremely straightforward and amenable to regulatory oversight.

See Also: New York Times’ response to the study

A case for decision science research in energy

March 16, 2010

A sustainable low-carbon future is seen by most to center on breakthroughs in technology and the associated economics.  Most of the attention has been on carbon sequestration, biofuels, renewable sources of electricity and the like.  A number of states and countries have instituted policies to make some of these happen.  Many also see electrification of transportation as an avenue to zero-emission vehicles and to the energy security of net oil-importing nations.  All of these require people to make choices, in many cases involving changes in behavior.  Introducers of technology know that the barrier to wide-scale adoption is particularly high when it involves substituting for something familiar.  The science of why people make the decisions they do, especially those involving green alternatives, merits further investigation, if for no other reason than that it may guide product and process development into areas with higher rates of adoption.  It will undoubtedly also be effective in informing policy.  An example is solar energy: if the primary driver for adoption is being seen as green, then hiding photovoltaic devices inside shingles would be counterproductive, as would the policy of many neighborhoods disallowing visible solar panels on homes.

The International Energy Agency (IEA) has posited that for any reasonable 2050 targets for atmospheric carbon dioxide, nearly 40% of the mitigation has to come from energy efficiency.  Its most recent forecast calls for 57% of carbon mitigation by 2030 to come from energy efficiency (and, interestingly, only 10% from carbon sequestration).  Much of this will be accomplished with engineering designs that provide the same utility for less energy.  This has been the case with up to 90% reductions in the standby power of household appliances, achieved through the simple expedient of low-energy power supplies and modified circuitry.  Since standby power constitutes 10% or so of all electricity usage in IEA countries, this is a huge gain.  Energy Star and similar efforts have produced further results, although some of these fall into a different bucket: the same utility at a somewhat greater price.  In the case of compact fluorescent bulbs, the initial price is higher but the life cycle cost is lower.  This begins to get into the realm of decision science, because the consumer is required to understand and appreciate life cycle costing.  We are firmly in that realm for cases where the costs are substantially higher, as with hybrid vehicles.  Electric cars will get squarely into the behavioral arena through range anxiety, roughly defined as the fear of running out of charge.
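The two efficiency arguments above are simple arithmetic.  The standby figures (90% reduction, ~10% of usage) come from the text; the bulb prices, wattages, lifetimes and electricity rate are illustrative assumptions, not quoted data:

```python
# 1. Standby power: a 90% reduction applied to the ~10% of electricity
#    consumed by standby loads (both figures from the text).
standby_share = 0.10
reduction = 0.90
savings = standby_share * reduction
print(f"Savings as a share of total electricity use: {savings:.0%}")

# 2. Life cycle cost: CFL vs incandescent. All inputs below are
#    illustrative assumptions chosen only to show the comparison.
def life_cycle_cost(bulb_price, watts, life_hours, hours_used, rate_per_kwh):
    """Total cost of ownership over `hours_used`: bulbs bought plus energy."""
    bulbs = -(-hours_used // life_hours)            # ceiling division
    energy = watts / 1000 * hours_used * rate_per_kwh
    return bulbs * bulb_price + energy

hours = 10_000                                       # assumed usage horizon
cfl = life_cycle_cost(3.00, 14, 8_000, hours, 0.10)
incandescent = life_cycle_cost(0.50, 60, 1_000, hours, 0.10)
print(f"CFL: ${cfl:.2f}   Incandescent: ${incandescent:.2f}")
```

The CFL wins on life cycle cost despite the higher sticker price, which is precisely the calculation the consumer is being asked to internalize.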

Electrification of transportation is an RTEC priority because we see it as the fastest route to energy security, by making electricity fungible with oil.  Furthermore, the well-to-wheel efficiency of electric cars is about 45% better than that of conventional cars, and tail pipe emissions are zero, although the burden is shifted to the power producer, where it is more tractable.  Consequently, enabling public acceptance of electric cars is central to that priority.

Addressing range anxiety and other behaviors falls at least in part within decision science.  Some of it can be addressed with technology.  For example, Nissan’s introduction of the Leaf later this year will be accompanied by features such as remote monitoring of the battery’s state of charge and driver notification, including identification of the nearest charging station.  But in most instances, technical advances only take us so far.  When smart electricity meters are installed in homes, there is high variability in how homeowners use the data.  Behavioral studies are needed to guide such programs to the best results.  Non-price interventions that rely on behavioral proclivities, such as conformance to societal norms, can likely be used to advantage.

In its matrix of program thrusts, DOE’s newly formed unit ARPA-E has an element that intersects social science efforts with transportation.  RTEC believes this could be a fruitful area for RTI/Duke/UNC collaboration.  One possible project would combine conventional survey-based approaches with behavioral economics in addressing the electric car range problem.  At present, range expectations are based on guesswork premised on beliefs about consumer preferences when driving conventional cars; statements such as “the consumer expects a range of 300 miles” are rife.  A definitive study of driving distances in the metropolitan areas that are the initial targets of electric vehicle entry could be used to devise behavioral studies, the results of which could be expected to yield interventions, both price-based and not.  To aid this, the original study would be broken out by age, income and other relevant demographics.  Finally, the interventions themselves could be tested on a population.

The foregoing notwithstanding, RTEC believes that the greatest gains for society in the realm of sustainable energy are going to come from simply using less.  Consequently, a major focus will be to encourage and assist members in devising social science-based research with this goal in mind.

Can North Carolina be a domestic source for lithium for electric vehicle batteries?

February 14, 2009

Making transport fuel fungible with electricity offers options to net importers of oil such as the US.  As a state, North Carolina is in the unenviable position of importing all of its fuel from other states.  While biofuel will undoubtedly play a role in reducing this import, electrifying the fleet offers another avenue.  The primary mission of electric vehicles (EVs) would be the reduction or elimination of tail pipe emissions, notoriously the most difficult site for carbon dioxide capture, although a secondary mission may be to act as a storage medium for the grid.  The FRDM program, led by NC State University, targets creating all elements of a Smart Grid, which would be a key instrument in grid optimization.  So, North Carolina is already well placed to take a lead in electrifying the passenger vehicle fleet.

EVs such as GM’s plug-in hybrid (PHEV), the Volt, scheduled to be marketed in 2010, are intended to be charged from conventional electrical outlets, with a gasoline engine for charging the batteries when needed to go beyond the nominal range (40 miles in the case of the Volt).  Pure EVs, running solely on electricity, such as the one scheduled by Nissan for limited entry in 2010, are also likely to be part of the equation.  If such vehicles are to become a substantial portion of the passenger vehicle fleet, several economic hurdles will have to be crossed, some possibly requiring subsidies.  The principal hurdle is the expected higher cost of the vehicle (pure EVs, because of their simpler design, will cost somewhat less than PHEVs), driven largely by the cost of the battery.  Research to reduce cost and increase range is ongoing in this and other countries, and the current administration has announced its intent to fund this endeavor significantly as part of the Stimulus Package.

Batteries: The lithium-ion battery is the clear leader in this field, and many believe it will remain so for the foreseeable future.  Other refinements, such as augmentation with supercapacitors for short bursts of power, are expected to reduce the load on the batteries.  However, current unit costs are high, although high-volume production has not yet been established; one can expect costs to come down over time.  A point of note is that while the technology is domestic in many cases, all battery manufacture currently takes place in low-labor-cost countries.  However, as in the case of foreign-designed cars, domestic manufacture may become feasible.  Locating such capability in North Carolina would go hand in hand with any decision to make North Carolina a primary launch state for electric vehicles.

Lithium: A more pernicious issue is the sourcing of the critical commodity, lithium.  World reserves are considerable, but the majority are in Latin America, including countries such as Bolivia that are not in close alignment with the US.  There is thus the risk of trading foreign dependency on one commodity for another.  Unlike battery manufacturing, a mineral resource is tied to its location, as with oil.  North America does have sizeable reserves of lithium ore in the form of spodumene, a lithium aluminosilicate, but with current technology the processing costs are high compared to processing the brine-based deposits in other countries.  The vast majority of spodumene reserves in this country are in North Carolina, in an area northwest of Charlotte.

Call for Action: The spodumene processing technology deemed uneconomic is at least half a century old, and hints exist in the literature of more innovative methods.  In the national interest, a research program should be instituted to investigate the possibility of economic recovery of lithium from spodumene ore.  RTEC has commenced a scoping exercise in this area, currently a literature search, but a full-fledged investigation will require state or federal funding.

Flexi-Fuel Fairy Tale

December 11, 2008

The Utopian State, known the world over as the US, was in the throes of a dilemma.  Much maligned for not doing enough to limit carbon dioxide emissions, it developed a plan that seemingly, in one fell swoop, tackled the global warming associated with automobile emissions while reducing the import of oil from nations, some of which were deemed unfriendly, at least in the rhetoric of elections.

This solution was known as the 20/10 plan.  The goal, to replace 20 percent of gasoline with ethanol in 10 years, was seen as visionary, if for no other reason than that 20/10 was about as good as vision got.  However, even before vast quantities of alcohol had been consumed, a hangover of major proportions was in the making.  Therein lies the tale.

The Utopian State, as befitted its name, was inclined to believe that the public would recognize a really good thing when they saw it.  They especially believed in the maxim: If You Build it, They Will Come, because said maxim was irresistibly derived from the powerful combination of Kevin Costner, the National Sport and mysticism.

So they built it, a complex web of subsidies to farmers, automobile companies and refiners, and tariffs on imported ethanol, all designed to produce domestic ethanol to blend with gasoline, and vehicles that would run on the stuff.  In a nod to perceived consumer preferences, they incentivized the auto companies to make flexi-fuel cars, capable of using regular gasoline and also E85, a blend with 85% ethanol.

They even created demand for these cars by ordering their agencies to use them and mandating the use of the new fuel.  Waivers to the mandate were granted generously, no doubt in the Utopian belief that waivers would not be sought if not merited.  It seems that some of these agencies are seeing a net increase in gasoline usage (Washington Post: Problems Plague U.S. Flex-Fuel Fleet, Oct. 23, 2008), a result contributing in no small measure to the aforementioned hangover.

At the core of Utopian belief is the notion that folks will “do the right thing.”  So purchasers of flexi-fuel vehicles were expected to buy E85, even from filling stations some distance away, ignoring the fuel consumed getting there and back.  Then word filtered through that E85 delivered 28 percent fewer miles per gallon.  In short, it was more expensive to use and harder to find.  Drivers started filling up with regular gasoline, because the flexi-fuel vehicle allowed that; filling stations noted the drop in volume and stopped stocking E85.
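The economics the drivers noticed can be made explicit.  The 28 percent mileage penalty is from the text; the gasoline price is an illustrative assumption:

```python
# Cost-per-mile parity between E85 and gasoline, given E85's mileage
# penalty. The 28% penalty is from the text; $3.00/gal is an assumption.

def e85_breakeven_price(gasoline_price, mpg_penalty=0.28):
    """Price per gallon at which E85 matches gasoline on cost per mile:
    an E85 gallon carries you (1 - penalty) as far, so it must cost
    proportionally less."""
    return gasoline_price * (1 - mpg_penalty)

gas = 3.00
breakeven = e85_breakeven_price(gas)
print(f"E85 must sell at or below ${breakeven:.2f}/gal "
      f"to match gasoline at ${gas:.2f}/gal")
```

Unless E85 sold at roughly a quarter discount per gallon, the rational driver filled up with gasoline, which is exactly what happened.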

In time, it became apparent that federal policy and legislation underestimated, or ignored, the fact that even in the US only market-based policies function.  Into this nightmare scenario stepped Prof. Wunderbahr from a prestigious eastern university, with an engine design that delivered a small car running on E85 with the fuel economy and the muscle of a larger vehicle.  The design took advantage of the high octane number of ethanol (113, versus 87 for regular gasoline), which allowed effectively higher compression ratios, which in turn improved the efficiency of combustion.  The result was elimination of the gas mileage penalty from using ethanol, increased power for an engine of a given size, and retention of the improved emissions associated with ethanol use.

Auto makers vied with each other to retool and produce these cars without any federal incentive, because the public actually wanted them.  Fuel distributors rushed to install E85 pumps and realized this was simply achieved by eliminating one grade of fuel: all vehicles on the road today specify either 87 or 91 octane, so a third grade was not needed, and the third pump could be modified to dispense E85.  The US government, not wanting to be left out, set policies to further these steps.  Ethanol from sources not competing with agriculture became cheaply available.  All was well again.

And then they elected a new President who resolved never again to set policy that was not market-based.  The country united behind him on this and it was never quite the same again.  The country was henceforth known as the United States.

You are currently browsing entries tagged with carbon at Research Triangle Energy Consortium.