Thursday, February 25, 2010
A small start-up company just launched an amazing new product to much fanfare: a novel fuel cell device capable of running on natural gas and potentially small enough to fit in your basement and power your entire home, replacing the electricity you buy from the grid. The proposition looks so compelling that builders will start incorporating the units into new houses, with the cost of the fuel cell absorbed into the buyer's mortgage, making monthly power bills a thing of the past; you just pay your mortgage and your gas bill. Would it surprise you to learn that I'm not describing the "Bloom Box" fuel cell featured on last Sunday's "60 Minutes", but rather a home fuel cell designed in the late 1990s by Plug Power Inc., a small company in upstate New York? Plug Power is still in business, and they still sell fuel cells of various sizes, including a home model, but the revolution in distributed power that their device was expected to launch hasn't occurred, at least not yet. The reasons why might shed some light on the hype surrounding Bloom Energy.
The late-1990s arrival of residential fuel cells intrigued me in my professional capacity as a strategist and scenario planner for Texaco, Inc. Small fuel cells looked like a clever way to circumvent grid bottlenecks and reliability problems, on a platform that might eventually be cheaper to build and easier to maintain than either micro-turbines or the gasoline and diesel generators that dominated the small-generator market. They also had the potential to expand that market tremendously. (Rooftop solar was another attractive distributed power option, but without lots of expensive storage it wasn't, and still isn't, a recipe for 24x7 independence from the grid.)
I'm sure there are many explanations for the failure of home fuel cell sales to take off then or subsequently, including the high cost of the units, driven partly by the precious metals required for the proton exchange membranes at the heart of these small fuel cells--membranes similar to those being developed for cars. Bloom may have cracked this part of the puzzle by using lower-cost raw materials and choosing solid oxide fuel cell technology that can run directly on more complex fuels like methane, rather than requiring the fuel first to be reformed into pure hydrogen--a step that adds to investment and operating costs and consumes some of the energy in the fuel, reducing overall efficiency.
Another key element of the economics of fuel cells is their operating cost, chiefly fuel. This was a problem for Plug and it remains a problem for Bloom, particularly at the residential level. While industrial users and commercial sites can negotiate gas supply contracts at competitive long-term rates that should allow cost-effective power production onsite, residential customers pay somewhat more and are exposed to significant seasonal and annual price volatility--far more than they see in their electricity rates. Through November, the average US residential natural gas price last year was $12.86 per thousand cubic feet (MCF). That's close to the weighted average of $12.46 I paid last year, which was quite a bit lower than the $15.55 I paid in 2008, thanks to lower gas commodity prices. Based on that price and the unit's "heat rate"--the amount of gas required for each kilowatt-hour (kWh) produced--I can calculate the fuel cost of power. At the stated 6,610 BTU/kWh, and using last year's US average residential gas price, that works out to $0.085/kWh. So even if the device were free, that's the least I'd have paid for electricity coming out of it last year. If you live in California or on Long Island, that's pretty cheap power. However, if you live somewhere like Virginia, where my average electricity rate last year was just under $0.12/kWh all-in, the savings would be much smaller. At just under 10,000 kWh per year of usage, that would have saved me about $340, setting a pretty low upper limit on what I'd be willing to pay for a Bloom Box, even after factoring in the various federal and state tax credits available.
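For anyone who wants to check that arithmetic, here's a minimal sketch of the calculation in Python. The assumption that a thousand cubic feet of gas holds roughly one million BTU is mine (a common round-number approximation); the other figures are the ones quoted above.

```python
# Rough check of the fuel-cost arithmetic above (a sketch, not Bloom's published figures).
HEAT_RATE_BTU_PER_KWH = 6_610     # stated heat rate for the Bloom Box
BTU_PER_MCF = 1_000_000           # assumed heat content of 1,000 cubic feet of gas (approx.)
GAS_PRICE_PER_MCF = 12.86         # 2009 US average residential gas price, $/MCF
GRID_RATE_PER_KWH = 0.12          # my all-in Virginia electricity rate, $/kWh (just under)
ANNUAL_USAGE_KWH = 10_000         # household usage, kWh/year (just under)

fuel_cost_per_kwh = HEAT_RATE_BTU_PER_KWH / BTU_PER_MCF * GAS_PRICE_PER_MCF
annual_savings = ANNUAL_USAGE_KWH * (GRID_RATE_PER_KWH - fuel_cost_per_kwh)

print(f"Fuel cost of power: ${fuel_cost_per_kwh:.3f}/kWh")    # about $0.085/kWh
print(f"Annual savings vs. the grid: ${annual_savings:.0f}")  # about $350 with these round numbers
```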
Now, we can argue all of the benefits of producing your own power, particularly if you live in an area subject to power outages during storms or heavy snow. Self-sufficiency is an appealing idea for many. And there's clearly an emissions benefit here; just how large depends on your local generating mix. At 0.77 lb. of CO2 per kWh the Bloom Box beats the national average by about 40%, though it's hardly on par with rooftop solar or residential wind--a singularly expensive distributed energy technology--or indeed with what your regional grid emits if it includes a high proportion of hydro or nuclear power. Potential purchasers of Bloom Boxes will need to assess what such attributes are worth to them.
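To put a rough number on that comparison, the short sketch below assumes a US grid average of about 1.3 lb. of CO2 per kWh; that figure is my round-number assumption rather than one cited above.

```python
# Hedged sketch: the Bloom Box's stated CO2 intensity vs. an assumed US grid average.
BLOOM_LB_CO2_PER_KWH = 0.77
GRID_AVG_LB_CO2_PER_KWH = 1.3   # assumed round number; varies by year and region

reduction = 1 - BLOOM_LB_CO2_PER_KWH / GRID_AVG_LB_CO2_PER_KWH
print(f"Reduction vs. assumed grid average: {reduction:.0%}")  # roughly 40%
```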
The enthusiasm that surrounds a new (or at least new-seeming) technology such as this is understandable, and I can't help being infected by it to some degree. At a minimum, it reminds me of how jazzed I was about these possibilities the first time I encountered them more than a decade ago. However, for Bloom and other small fuel cell suppliers to fulfill that potential, a lot of things will have to break their way, including a rapid move down the cost curve to make these devices as cheap as possible, along with some good luck concerning the overall economy--particularly the new-construction segment of the housing market. Meanwhile, if the price of rooftop solar continues to fall, fuel cells could face stiff competition, while restrictions on the production of shale gas could boost natural gas prices and thus the net cost of electricity from a home fuel cell. I'll be watching Bloom Energy's progress with great interest as they attempt to develop this market.
Tuesday, February 23, 2010
Shale Gas and Drinking Water
Life is full of unintended consequences, and the energy industry is currently dealing with a significant one related to the step-change in US natural gas reserves and production made possible by exploiting gas resources locked up in deposits of a sedimentary rock called shale. The very success of these efforts has placed a decades-old, widely-used drilling technique called "hydraulic fracturing" at the center of a major controversy. In fact, it's hard to find references to fracturing (often called "frac'ing" or "fracking") that don't describe it as a "controversial drilling practice." As best I can tell from delving into the technology involved, the controversy around fracking is largely an artificial one, though that hasn't deterred Congress from holding hearings on it or introducing legislation to regulate it further at the federal level, on top of the state level, where it already appears to be well regulated.
I should preface my comments on fracking by pointing out that I haven't had any direct experience with the practice, either during my time at Texaco or in my studies of chemical engineering, a field that overlaps petroleum engineering extensively, though not in the specifics of this subject. My analysis and conclusions are the result of some research and a lengthy conversation with a former mentor who knows more about fracking from first-hand experience than most of us ever will.
The main concerns about fracking today involve its potential risk to our supplies of drinking water and the adequacy of current regulations to address this. Understanding whether these concerns are justified requires knowing a bit about how fracking works, as well as where drinking water comes from. I could fill up several postings exploring each of those topics, but for the purposes of this discussion let's take a quick look at one of the shale regions at the heart of this controversy, the Marcellus Shale in the Appalachian region of New York, Pennsylvania and the Virginias. In the course of my research I ran across a handy document on groundwater from Penn State. Aside from surface water (lakes, rivers and streams), it identifies the various aquifers in Pennsylvania by type in Figure 4. The key fact from the perspective of fracking safety is that the deepest of these aquifers lies no more than about 500 ft. below the surface, and typically less than a couple of hundred feet down. By contrast, the Marcellus Shale is found thousands of feet down--in many areas more than a mile below-ground--with a thickness of 250 feet or less. In addition, the gas-bearing layers are sealed in by impermeable rock, or the gas would eventually have migrated somewhere else. In other words, the shale gas reservoirs are isolated by geology and depth from the shallower layers where our underground drinking water is found.
Now consider what happens during drilling. As illustrated in this video from the American Petroleum Institute, the drill must go through the layers that might connect to a drinking water source on its way to the gas-prone shale far below. However, before the deeper horizontal portions of the well are fractured to create fissures in the shale through which the gas can flow, the vertical well is cased in steel pipe and cemented to the rock. This, by the way, is already required by law, and it seals off any possible connection with a drinking-water aquifer before the first gallon of fracturing fluid is pumped into the well. That fluid is mainly water, plus a few chemicals, such as surfactants (detergent) and gel to carry the sand used to prop open the fractured fissures. Some of that water remains in the reservoir--isolated from drinking water--and most of it is returned to the surface where it is captured for treatment and either disposal or re-use in another fracking job. As long as the well was completed in accordance with standard practices, the primary risk to water supplies is from surface activities that are already thoroughly regulated and have been for years. Accidental contamination of surface or groundwater would be handled by the appropriate authorities, and a driller would be liable for any damages.
The more I learned about fracking, the more puzzled I became that it has attracted so much criticism recently. After all, the practice was developed in the late 1940s and has been used since then in hundreds of thousands of wells to produce literally billions of barrels of domestic oil and trillions of cubic feet of domestic natural gas. That wouldn't be the case if this were some new, risky practice. In fact, it is an entirely mainstream industry practice that has become so vital to the ongoing production of oil & gas from the highly-mature resources of the United States that a study by Global Insight suggested that restrictions on fracking could cut US gas production by anywhere from 10-50% within this decade, depending on their severity. Similar consequences for oil production would follow. The only thing new here is the clever application of fracking with state-of-the-art horizontal drilling to shale reservoirs that couldn't economically produce useful quantities of gas without them.
The fracking controversy also involves a surprising irony: While many of us recall the old cliché about oil and water not mixing, it turns out that oil, natural gas and water are often found together deep underground--and this is not drinking water I'm talking about. Water is also routinely injected into producing oil & gas wells, either as liquid or as steam, in order to enhance recovery, and many wells produce a lot more water than oil. As a result, the oil & gas industry handles staggering volumes of water every day. By comparison fracking, in which water is only used to prepare a well and is not part of the ongoing production process, accounts for just a tiny fraction of the industry's involvement with water--all already regulated, I might add.
So how do we explain the current ruckus over hydraulic fracturing? Perhaps one reason this old practice is attracting new scrutiny is that it's being applied in parts of the country that haven't seen a drilling rig in decades, where it provokes a similar reaction to the arrival of 300-ft. wind turbines, utility-scale solar arrays, and long-distance transmission lines. But rather than just writing this off as yet another manifestation of NIMBY, I'm truly sympathetic to concerns about the integrity of our drinking water. My family drinks water out of the tap, and I would be irate if I thought we were being exposed to something dangerous. When you examine the science behind fracking and see that, if anything, these wells are drilled and isolated with more care than many water wells (which I understand often aren't cased and cemented to protect the water source from contact with other sedimentary layers), it becomes clear that the biggest potential exposure occurs not underground but at the surface, where fracking is just one of many regulated industrial water uses, and a fairly small one at that. Thus, whether intentionally or as a result of a basic misunderstanding of how this technology works, we are being presented with a false dichotomy concerning shale gas and fracking. The real choice here isn't between energy and drinking water, as critics imply, but between tapping an abundant source of lower-emission domestic energy and what looked like a perpetually-increasing reliance on imported natural gas just a few years ago.
Labels:
frac'ing,
fracking,
hydraulic fracturing,
lng,
natural gas
Thursday, February 18, 2010
The Challenge of Scale
This morning's Wall St. Journal featured a front-page article on small-scale nuclear power, highlighting how reactors a tenth the size of current commercial designs could significantly reduce the financial risks associated with these mega-projects. This is one example of the need to think in new ways about scale when addressing our energy challenges. In his talk at this year's TED conference in Long Beach, Bill Gates offered another surprising perspective on scale: "All the batteries we make now could store less than 10 minutes of all the energy [in the world]," he said. Framed by those two examples is the basic proposition that while solving our energy problems may require breaking them down into more manageable pieces, the pieces must still add up to mind-numbingly stupendous sums.
According to figures from the Energy Information Administration of the Department of Energy, in 2008 the US consumed 99.3 quads of primary energy--oil, gas, coal, nuclear power, hydropower, biomass and other renewables--down from 101.6 quads the year before. A quad is one quadrillion times the quantity of energy required to raise the temperature of a pound of water by one degree Fahrenheit, where a quadrillion is 1 followed by 15 zeroes (US definition). Can you picture that? I can't. If I convert that consumption to barrels of oil equivalent at the rate of 5.8 million BTUs each, we get a value of just over 17 billion barrels--a much more familiar unit, especially when we divide by 365 to get 47 million barrels per day. Millions are much closer to something we can grasp, and if we are familiar with energy data we know that's equivalent to a little more than half the amount of oil produced globally every day. It's still hard to picture, though, until you work out that if it were all put in one place in outer space, it would form a spherical blob roughly 800 ft. in diameter--over half as tall as the Empire State Building--and that's every day.
By comparison the daily output of a 3 MW wind turbine, converted to its energy-equivalent of oil (assuming it backs out natural gas from a gas turbine power plant) would form a ball about 7 ft. across. It would take 1,400,000 such balls to fill the big sphere. Of course we can't really compare the output of 1.4 million wind turbines to the total amount of energy we use each day, for many reasons, though it's a handy reminder of just how big the challenge is, and why building nuclear reactors in increments of 125 MW each might be a smart way to finesse this gap.
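For readers who want to retrace the arithmetic behind those two spheres, here is a sketch. The 30% wind capacity factor and the 9,000 BTU/kWh heat rate for the displaced gas-fired generation are my assumptions, chosen as plausible round numbers that roughly reproduce the figures above; everything else comes from the text.

```python
import math

# Sketch of the "energy blob" arithmetic. Capacity factor and gas heat rate are
# assumptions; the primary energy, barrel-equivalence, and turbine size are from the post.

QUADS = 99.3             # 2008 US primary energy use, quadrillion BTU
BTU_PER_BOE = 5.8e6      # BTU per barrel of oil equivalent
M3_PER_BARREL = 0.159    # cubic meters per 42-gallon barrel
FT_PER_M = 3.281

def sphere_diameter_ft(volume_m3):
    """Diameter, in feet, of a sphere holding the given volume."""
    return (6 * volume_m3 / math.pi) ** (1 / 3) * FT_PER_M

# All US primary energy for one day, expressed as a ball of oil
boe_per_day = QUADS * 1e15 / BTU_PER_BOE / 365
big_ball_m3 = boe_per_day * M3_PER_BARREL
print(f"{boe_per_day / 1e6:.0f} million BOE/day -> ~{sphere_diameter_ft(big_ball_m3):.0f} ft sphere")

# One day's output from a 3 MW wind turbine, as the gas-fired generation it displaces
kwh_per_day = 3_000 * 24 * 0.30                       # assumed 30% capacity factor
boe_displaced = kwh_per_day * 9_000 / BTU_PER_BOE     # assumed 9,000 BTU/kWh gas heat rate
small_ball_m3 = boe_displaced * M3_PER_BARREL
print(f"One turbine-day -> ~{sphere_diameter_ft(small_ball_m3):.0f} ft ball; "
      f"{big_ball_m3 / small_ball_m3 / 1e6:.1f} million balls fill the big sphere")
```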
A 125 MW reactor, if it operated with the same reliability that large nuclear plants have achieved, would produce as much power every day as 125 of those 3 MW wind turbines. And while we doubtless couldn't build these reactors as fast as wind turbines, I'll bet we could add nuclear power capacity faster in these increments than with 1,200-1,500 MW reactors, because of the advantages of being able to manufacture more of each facility in a factory, rather than constructing it on-site. Even if that translated into total project timelines only half as long as for the large-scale nuclear plants for which the administration just awarded federal loan guarantees, it could be worth a lot to the utilities and merchant generating companies building them. It would greatly reduce project risks of the kind that can ruin the economics of big investments--delays, cost over-runs, accidents--and that give companies' bankers and shareholders chills. These aren't the kind of risks the government is offering to defray, by the way.
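That equivalence is easy to verify, assuming roughly 90% availability for a small reactor and the same 30% wind capacity factor used in the sketch above (again, my assumptions rather than cited figures):

```python
# Check of the reactor-vs-turbines equivalence, assuming ~90% availability for a
# small reactor and a 30% wind capacity factor (both assumptions).
NUCLEAR_CF = 0.90
WIND_CF = 0.30

reactor_mwh_per_day = 125 * 24 * NUCLEAR_CF      # one 125 MW reactor
wind_mwh_per_day = 125 * 3 * 24 * WIND_CF        # 125 turbines at 3 MW each

print(reactor_mwh_per_day, wind_mwh_per_day)     # 2700.0 2700.0 MWh per day each
```

At those capacity factors both work out to 2,700 MWh per day; the comparison is obviously sensitive to the wind figure.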
Of course that doesn't make small nuclear an either/or proposition vs. large-scale nuclear, any more than wind and solar are an either/or proposition vs. oil & gas platforms or big gas-fired power plants that can operate efficiently 24/7. There's room--and need--in our national energy economy for all of these, as our energy diet shifts from a heavy reliance on fossil fuels to a lighter, more sustainable mix in the future. At the same time, it's clear that we can't fill the gap exclusively with small-scale energy sources without a sizable contribution from sources at least as big as these small reactors. "Drill, baby, drill" only captured one aspect of this concern. More accurately, our energy policy must deliver "scale, baby, scale."
Labels:
consumption,
energy diet,
natural gas,
offshore drilling,
oil,
renewable energy,
wind power
Tuesday, February 16, 2010
Shaken Consensus?
Since the publication of the hacked emails from the University of East Anglia's Climate Research Unit (CRU) last November, we've been inundated with news reports and opinion pieces questioning the scientific consensus behind climate change. An editorial in today's Wall St. Journal on "The Continuing Climate Meltdown" is just the latest example of this trend, following a weekend that saw the release of a remarkable BBC interview with the CRU's former director. The fact that all this coincides with a northern hemisphere winter that has deposited record snowfalls on regions that don't normally see much of the white stuff serves to reinforce the message that something is amiss with global warming theory. It has also had me wondering if I moved far enough south, as I cope with "ice dams", cabin fever, and other consequences of a pair of back-to-back blizzards in the D.C. area. While I agree that the recent revelations have given rise to an understandable wave of doubts regarding climate change, this may say more about the way that extreme climate predictions have been played up in the last several years than it does about actual climate change.
Even the most ardent adherents of the view that climate change is real, man-made to a significant extent, and extremely challenging for humanity must agree that the science supporting this perspective has had a rough couple of months--largely deserved. Whatever the "Climategate" emails said about the underlying analytical rigor of the dominant scientific interpretation of global warming, they revealed a worrying degree of defensive groupthink and gatekeeping among leading climate researchers. I'm pleased to see that an independent group has been set up to examine the practices at East Anglia-CRU, though the inquiry has already experienced controversies of its own.
Meanwhile the Intergovernmental Panel on Climate Change (IPCC), of Nobel Peace Prize fame, is under fire for incorporating unwarranted claims in its reports, including a shockingly sloppy assertion about the rate at which glaciers are disappearing. This has exposed a process that in some instances gave magazine articles and unpublished papers the same credence as peer-reviewed scientific papers in recognized journals. For all the vitriol I see directed against "climate skeptics", the climate change community should accept that these are mainly self-inflicted wounds, and that much of the current public doubt about climate change stems from the unraveling of exaggerated predictions that were expounded without a clear, accompanying explanation of the associated caveats and uncertainties, possibly to promote quicker action by governments.
In contrast, the BBC's interview with Dr. Jones is full of nuances and caveats--though hardly outright retractions, as some have characterized his remarks. I was particularly interested in his comments on the Medieval Warm Period. Although he appears not to have "told the BBC that the world may well have been warmer during medieval times than it is now," he did seem to suggest that we simply don't have sufficient data to determine whether the warming that led to the settlement of Greenland by the Vikings and the cultivation of wine grapes in England was confined to the northern hemisphere or global in extent. Instead of prompting an assumption that it wasn't global, this gap in our knowledge ought to galvanize the urgent gathering and correlation of paleoclimate data--samples of the kinds of proxies used to assess temperatures before instruments to measure them (or people to read the instruments) existed. That's because this isn't a quibble over some esoteric bit of history, but a crucial gauge of just how unprecedented the warming of the past several decades has been.
Then there's the temperature data itself. Although Dr. Jones concurred that global warming since 1995 has just missed being statistically significant, the data from the CRU and similar data from NASA do show that on average the last decade was warmer than the 1990s, which were in turn warmer than the 1980s. Despite all the talk of global cooling, the last two years still easily make the top 10 list for warmest years of the last century, and global temperatures currently average about 0.8 °F warmer than in the 1970s. But that doesn't mean that there aren't problems here, as well. Dr. Jones referred the BBC to a map of the weather stations providing the global temperature data compiled by the UK's Met Office (the national weather service) and processed by the CRU. It reveals such measurements to be very dense in the developed countries of the temperate zones and quite thin on the ground--or sea--in the tropics and the high latitudes that account for much of the earth's surface. And even the historical temperature data for the US are still subject to significant revisions, as I noticed yesterday when I rechecked the comparison between 1998 and 1934 that I wrote about several years ago.
So where does this leave us? From my perspective it requires us to think about the definition of a successful scientific theory as one that provides the best explanation for the evidence we see--even if that evidence is incomplete, as seems to be the case here. The fact that some scientists seem to have behaved badly or that others--mostly non-scientists--have promoted alarming-but-uncertain predictions as proven and now have egg on their faces doesn't alter the fact that "anthropogenic global warming" (AGW) based on greenhouse gas emissions still seems to explain more of what we observe going on than any other theory at this point. Hypotheses such as the one attributing warming to the influence of cosmic rays on cloud formation must go through a great deal more vetting before supplanting AGW.
While considerable progress has been made in the last decade solidifying the evidence supporting the AGW theory, significant uncertainty still remains about the future consequences it suggests, particularly as relates to regional impacts and changes in precipitation. A lot more also needs to be done to clarify the relationship between proxy data and instrumental temperature data, and to ensure that the latter are consistent and truly representative. However, I don't see these deficiencies as justifying complete policy paralysis, particularly when it comes to those actions that can be accomplished relatively cheaply, such as improved energy efficiency, or that offer substantial benefits for other concerns such as energy security, including expanding nuclear power, low-cost renewable energy, and R&D to bring down the cost of other renewables. As for whether the time is right to pursue more comprehensive measures, there's a legitimate debate to be had, but it shouldn't start from the false assumption that anthropogenic global warming has been disproved.
Labels:
climate change,
climategate,
global cooling,
global warming
Friday, February 12, 2010
Observing the Sun
The topic of space exploration has gotten much media attention lately, mainly focused on the uncertain fate of future US manned space efforts in light of the cancellation of NASA's Constellation program in the administration's new budget. After the current flight of the shuttle "Endeavour" and the four remaining shuttle missions this year, the fleet will be retired and transporting astronauts to and from the International Space Station will depend on Russia, or on unproven spacecraft from commercial start-ups like SpaceX and Blue Origin. Yet without diminishing the importance of these concerns for our long-term access to space, yesterday's delayed launch of the Solar Dynamics Observatory satellite deserved more attention than it got. The SDO mission is part of NASA's "Living with a Star" program, which is aimed at expanding our knowledge about how the sun affects life on earth, with implications for energy and our understanding of the environment, including climate change.
It's hard to think of anything we take more for granted than the Sun, yet as the material on the SDO mission website explains, we don't fully understand the variability and internal mechanics of our planet's primary source of light and heat--and thus directly or indirectly of all the energy we use except for that derived from nuclear and geothermal power. Variations in the amount of solar energy the earth receives as a result of the eccentricity of our orbit around the sun have long been understood to influence long-term climate patterns, including ice ages, while the impact of fluctuations in the sun's actual output remains more controversial. Climate skeptics have suggested that much of the warming of the last several decades, along with the recent temperature plateau, could be related to the approximately 11-year sunspot cycle. Meanwhile NASA scientists have assessed the impact of solar variability on climate to be significantly less than that from the accumulation of atmospheric greenhouse gases. SDO should improve our understanding of solar variability and its consequences here on earth. (I should mention that the observed recent short-term variability of a few watts per square meter isn't sufficient to have a noticeable effect on the power output of solar panels.)
The more immediate energy concern that SDO should help to clarify is the risk that currently-unpredictable solar activity, including strong solar flares and resulting geomagnetic storms, poses to power grids--smart and otherwise--and communications equipment on earth and in orbit. At the extreme, a solar flare of the magnitude of the Carrington Event of 1859 could disrupt critical energy infrastructure in much the same manner as an Electromagnetic Pulse (EMP) from a high-altitude nuclear explosion. As dependent as we all are on increasingly complex and inter-connected electrical and electronic systems, anything that improves the ability of scientists to forecast a sudden spike in solar radiation could be worth its weight in gold.
NASA's capacity to conduct missions with immediate benefits on earth, such as SDO and the forthcoming Glory mission to measure key aspects of the earth's energy balance, is crucial, but then so is building on the legacy of four decades of manned spaceflight. I have distinctly mixed feelings about the altered priorities in NASA's new budget, though I'm pleased that funding for space as a whole was preserved. The possibility that this shift will spur a vibrant private space industry that could significantly reduce the cost of reaching earth orbit is exciting, because among other things that could make large-scale space solar power practical and affordable. At the same time, I worry about ceding America's preeminent position in human space exploration at a time when other nations are setting ambitious goals in this arena.
Labels:
climate change,
smart grid,
solar power,
space program,
space solar power,
ssp
Wednesday, February 10, 2010
Another Energy Bill?
When the first flakes of the second major snowstorm in less than a week began to fall on Northern Virginia, it occurred to me that I might not be in a position to post for a couple of days. I had intended a longer posting covering all the topics mentioned in a renewable energy conference call that I dialed into yesterday, but I've written previously about most of them. The call was hosted by the American Wind Energy Association and its sister trade associations covering hydropower, biomass power, geothermal energy, and solar energy, for the purpose of laying out a joint "2010 Outlook for Renewable Energy," recommending a national renewable electricity standard (RES) along the lines of the Renewable Portfolio Standards already in place in a number of states. The groups also released a report from Navigant Consulting highlighting the green job-creation potential of such a policy.
All but one of the trade associations involved in the call are members of the larger Renewable Electricity Standard Alliance, so I wasn't surprised to hear them pushing this issue strongly. With cap & trade sidelined at least for now, there's a good deal of speculation about an energy-only compromise bill, presumably built around provisions like the RES. Much of the political popularity of the RES option relies on the fact that it could be implemented at minimal taxpayer expense. However, the real costs, which can be significant, are passed along to electricity ratepayers--though few of them would be able to spot them in their bills. I commented last spring on some practical concerns about how much new generation might be called forth in this manner in the context of the Waxman-Markey bill, which included a little-noticed RES provision. Since most of these technologies generate power on less than a full-time basis, the more ambitious the RES goal, the higher its hidden costs would tend to be.
What I didn't hear yesterday--though perhaps due to some level of multi-tasking distraction on my part--was any mention of a preferable low-emission electricity standard that would encompass not just renewables, but also nuclear power and any other technology that could generate electricity while emitting negligible quantities of greenhouse gases on a lifecycle basis--in other words much less than a fossil fuel power plant without extremely-effective carbon capture and sequestration. Given the increased emphasis on the potential contribution of additional nuclear power since the State of the Union Address, and the priority that the likeliest Republican participants in any bi-partisan energy compromise would place on nuclear, an "LEES" seems a logical policy evolution, even if many economists consider such standards to be less efficient and ultimately more expensive than setting a price on GHGs via either cap & trade or a carbon tax.
With regard to the report highlighting the potential to create 274,000 additional renewable energy jobs through enactment of a national RES, I noted the absence of any information on the impact on the broader economy from the higher electricity rates that would accompany such an effort. In addition, I continue to believe that much of the "green jobs" emphasis misses the primary role of energy in our economy, which is not to employ as many Americans as possible producing energy, but to produce as much energy as possible for the other industries and sectors that employ most Americans. When I heard the CEO of the Solar Energy Industries Association touting solar energy as creating more jobs per unit of output than any other energy source--at least that's what I thought I heard him say--I groaned (on mute, of course.)
It's anyone's guess whether the Congress will come up with a comprehensive energy & climate bill, a stripped-down energy-only bill, or any such bill at all this year. I can only hope that if it does, it emphasizes producing (or saving) as much domestic energy as possible, as cost-effectively as possible, and that creation of "green jobs" is not the primary policy-selection criterion. The purpose of energy legislation ought to be making the US economy as competitive as possible, and not just in clean energy as the industrial-policy fad of the moment, but in a way that will promote economic growth and job growth across the board over the long haul.
Monday, February 08, 2010
Super Bowl Diesel
In addition to a pair of well-matched teams and a sufficient dose of fourth-quarter suspense concerning the outcome, yesterday's Super Bowl was the first in several years to feature an ad meriting comment in an energy blog. The subject of the ad was the new Audi A3 TDI clean diesel car, which was recently named "Green Car of the Year" for 2010. I was intrigued by the ad's tagline of "Green has never felt so right", positioning the car as painlessly green. Having had the opportunity to drive one at the recent Washington Auto Show, I can attest that the A3's environmental credentials come wrapped in a very attractive package, requiring no sacrifice other than the sticker price. Even if the comparison to a variety of intrusive green practices, lampooned in reductio ad absurdum fashion, may have annoyed some observers, the positive side of the message seemed smart and timely: diesel cars are available now in appealing models delivering greatly-reduced fuel consumption and emissions, but without requiring major behavioral changes on the part of their owners.
Audi's "Green Police" ad, with a musical riff on Cheap Trick's classically-catchy "Dream Police" tune, was a marked contrast to the 2006 Super Bowl ads for Ford's Escape Hybrid and Toyota's Prius Hybrid, both of which appealed to green values of ecological and inter-generational responsibility. By contrast the A3 ad was consistent with the sharper edge of many others in yesterday's broadcast, which included several ads that pushed the boundaries of good taste. But while the New York Times found it "misguided"--heaven forbid that anyone poke fun at meticulously separating our recyclables and choosing the socially-correct shopping bags and energy-saving light bulbs--the ad showed up in at least one top-10 list and topped the voting on the Wall St. Journal's website as of this morning. Without digging a lot deeper, though, I can't tell if that's because it reached its intended audience with its messages that diesels are back, are much more refined than the soot-spewing diesels of the 1970s, and can now actually be considered green. Perhaps many viewers just thought it was clever, or resonated with its critique of some of the lifestyle changes we've been asked to make for the sake of the environment.
In any case, it's interesting to note that the US market share for light-duty diesel cars has been creeping up gradually, apparently matching or exceeding that of hybrids last year. The folks from Bosch, which supplies much of the high-tech gear for the advanced diesel engines under the hood of the Audi A3 TDI, VW Jetta diesel, and other, mostly European-based diesel models that have appeared in the US--including the awesomely-powerful BMW 335d that I also drove at the car show courtesy of Bosch--mentioned figures indicating that the new diesels beat most hybrids on lifecycle ownership costs, mainly due to higher resale value. (Diesel engines are usually good for hundreds of thousands of miles of use, and they don't require expensive battery pack replacement.) Their most obvious selling point is still fuel economy, with the A3 TDI rated at 30 mpg city/42 mpg highway.
That translates into significantly higher miles per dollar, even with diesel fuel selling for modestly more than regular gasoline. It's worth noting that the current diesel premium over unleaded regular of about $0.13 per gallon works out to about 5%, which is much less than the typical 30% fuel economy benefit for diesel relative to the comparable gasoline-powered model. That differential averaged $0.12/gal. for 2009, a far cry from the $0.57/gal. premium in 2008, when the tail end of the economic bubble pushed diesel up against its supply limits here and globally. However, even when the recovery picks up, we're unlikely to see that differential widen to anything like its former level, because the overhang in global refinery capacity has grown so large, and many of the new refineries and refinery expansions coming onstream, including the one at Marathon's Garyville, Louisiana plant, are focused on maximizing diesel production.
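For anyone who wants to see that math, here's a quick back-of-the-envelope sketch. The gasoline price, the size of the fuel economy gap, and the comparison vehicle are all illustrative assumptions on my part, not figures from Audi or EPA:

```python
# Back-of-the-envelope miles-per-dollar comparison, diesel vs. gasoline.
# Illustrative assumptions: regular gasoline at ~$2.60/gal (implied by a $0.13
# premium being roughly 5%), the A3 TDI's 42 mpg highway rating, and a
# hypothetical comparable gasoline model with ~30% lower fuel economy.

gasoline_price = 2.60                  # $/gal, assumed
diesel_price = gasoline_price + 0.13   # $/gal, using the ~$0.13 premium above

mpg_diesel = 42.0                      # A3 TDI highway rating
mpg_gasoline = mpg_diesel / 1.30       # assumed 30% fuel economy benefit for diesel

mpd_diesel = mpg_diesel / diesel_price
mpd_gasoline = mpg_gasoline / gasoline_price

print(f"Diesel:   {mpd_diesel:.1f} miles per dollar")
print(f"Gasoline: {mpd_gasoline:.1f} miles per dollar")
print(f"Diesel advantage: {mpd_diesel / mpd_gasoline - 1:.0%}")
```

On those assumptions the diesel delivers roughly a quarter more miles per dollar--essentially the 30% efficiency edge net of the ~5% fuel price premium.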
At a time when hybrids are still experiencing growing pains, and the market penetration of battery electric cars (EVs) and alternative fuels like E85 depends to a large extent on nearly non-existent infrastructure for recharging or refueling, diesel has a window of opportunity combining new technology with nearly-ubiquitous infrastructure. That same opportunity led to sales of diesel cars in Europe exceeding those of gasoline cars, until a presumably-temporary dip last year. It remains to be seen whether the same phenomenon will happen here, or if consumers will be content to stick with gasoline or jump directly to electricity. I also remain perplexed that neither Ford nor GM has brought any of its successful European diesel passenger car models to the US as a quick and cost-effective way to comply with the new fuel economy rules.
Audi's "Green Police" ad, with a musical riff on Cheap Trick's classically-catchy "Dream Police" tune, was a marked contrast to the 2006 Super Bowl ads for Ford's Escape Hybrid and Toyota's Prius Hybrid, both of which appealed to green values of ecological and inter-generational responsibility. By contrast the A3 ad was consistent with the sharper edge of many others in yesterday's broadcast, which included several ads that pushed the boundaries of good taste. But while the New York Times found it "misguided"--heaven forbid that anyone poke fun at meticulously separating our recyclables and choosing the socially-correct shopping bags and energy-saving light bulbs--the ad showed up in at least one top-10 list and topped the voting on the Wall St. Journal's website as of this morning. Without digging a lot deeper, though, I can't tell if that's because it reached its intended audience with its messages that diesels are back, are much more refined than the soot-spewing diesels of the 1970s, and can now actually be considered green. Perhaps many viewers just thought it was clever, or resonated with its critique of some of the lifestyle changes we've been asked to make for the sake of the environment.
In any case, it's interesting to note that the US market share for light-duty diesel cars has been creeping up gradually, apparently matching or exceeding that of hybrids last year. The folks from Bosch, which supplies much of the high-tech gear for the advanced diesel engines under the hood of the Audi A3 TDI, VW Jetta diesel, and other, mostly European-based diesel models that have appeared in the US--including the awesomely-powerful BMW 335d that I also drove at the car show courtesy of Bosch--mentioned figures indicating that the new diesels beat most hybrids on lifecycle ownership costs, mainly due to higher resale value. (Diesel engines are usually good for hundreds of thousands of miles of use, and they don't require expensive battery pack replacement.) Their most obvious selling point is still fuel economy, with the A3 TDI rated at 30 mpg city/42 mpg highway.
That translates into significantly higher miles per dollar, even with diesel fuel selling for modestly more than regular gasoline. It's worth noting that the current diesel premium over unleaded regular of about $0.13 per gallon works out to about 5%, which is much less than the typical 30% fuel economy benefit for diesel relative to the comparable gasoline-powered model. That differential averaged $0.12/gal. for 2009, a far cry from the $0.57/gal. premium in 2008, when the tail end of the economic bubble pushed diesel up against its supply limits here and globally. However, even when the recovery picks up, we're unlikely to see that differential widen to anything like its former level, because the overhang in global refinery capacity has grown so large, and many of the new refineries and refinery expansions coming onstream, including the one at Marathon's Garyville, Louisiana plant, are focused on maximizing diesel production.
At a time when hybrids are still experiencing growing pains, and the market penetration of battery electric cars (EVs) and alternative fuels like E85 depends to a large extent on nearly non-existent infrastructure for recharging or refueling, diesel has a window of opportunity combining new technology with nearly-ubiquitous infrastructure. That same opportunity led to sales of diesel cars in Europe exceeding those of gasoline cars, until a presumably-temporary dip last year. It remains to be seen whether the same phenomenon will happen here, or if consumers will be content to stick with gasoline or jump directly to electricity. I also remain perplexed that neither Ford nor GM has brought any of its successful European diesel passenger car models to the US as a quick and cost-effective way to comply with the new fuel economy rules.
Labels: alternate fuels, audi, diesel, fuel economy, miles per dollar
Thursday, February 04, 2010
EPA's New Biofuel Rules
Yesterday the administration issued an important set of new rules and proposals relating to energy, mainly dealing with expanded biofuel production and the biomass supply chains that must be developed to sustain it, as well as addressing carbon capture and storage (CCS.) There's far more here than I could cover in one posting, so I've chosen to focus on the EPA's finalized Renewable Fuels Standard (RFS) rules, which were first proposed last May and have been the subject of intense study and considerable controversy ever since. While the print edition of the Washington Post characterized these as "A boost for corn-based ethanol," I'm not so sure. In the process of laying out a roadmap for how new corn-based ethanol facilities can contribute to the expansion of biofuel in the US, the EPA effectively froze the output of a large number of older facilities, unless they invest in significant upgrades. It also raised big questions about the future of E85, a blend of 85% ethanol and 15% gasoline that has so far failed to attract much interest from consumers, while suggesting that ethanol might have to share the ultimate 36 billion gallon per year biofuel target for 2022 with large volumes of other, more advanced biofuels.
At the heart of the new biofuel rules, which are designed to implement the goals established by the Energy Independence and Security Act of 2007, is the assessment of lifecycle greenhouse gas emissions from biofuels, including the highly-controversial "indirect land-use impacts" first highlighted in a landmark paper published in Science two years ago and confirmed by subsequent research. Although the EPA's final interpretation of the science has not turned out to be quite the catastrophe that the corn ethanol industry feared--and based on the quote in the Post from the lead author of the relevant research, Dr. Tim Searchinger, might have gone easy on them--it nevertheless constrains the future role of ethanol produced from this source. While clearly stating that facilities producing ethanol from corn starch using natural gas or biofuel for process heat and employing other efficient technologies would qualify for the least-stringent category of renewable fuel, many existing facilities would qualify only under grandfathering that restricts their output to historical levels. That includes newer facilities that started construction by 12/19/07, and essentially all that use coal for heat or dry all their distillers grains byproduct.
In contrast the biodiesel industry, which has been suffering recently, got a shot in the arm with a ruling that qualifies most biodiesel produced from soy oil or waste cooking oil or grease for the tougher "biomass-based diesel" category, consistent with a 50% reduction in emissions. And the specific RFS quota for 2010 carves out a healthy 1.15 billion gallon target for biodiesel--including retroactive volumes from 2009 that could cause no end of confusion.
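For readers trying to keep the categories straight, here is my shorthand summary of the lifecycle greenhouse gas reduction thresholds behind them, relative to the petroleum baseline; treat it as a sketch of my understanding rather than a quotation from the rule:

```python
# My summary of the approximate lifecycle GHG reduction thresholds in the RFS2
# framework, relative to the petroleum baseline (not the regulatory text itself);
# grandfathered corn ethanol plants are exempt from the 20% hurdle.
rfs_thresholds = {
    "renewable fuel (e.g., corn ethanol)": 0.20,
    "advanced biofuel":                    0.50,
    "biomass-based diesel":                0.50,
    "cellulosic biofuel":                  0.60,
}

for category, cut in rfs_thresholds.items():
    print(f"{category:38s} requires >= {cut:.0%} lifecycle GHG reduction")
```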
Perhaps the most urgent aspect of the requirements for 2010 was the EPA's concession to reality on its cellulosic ethanol quota. The original targets set by Congress called for the use of 100 million gallons of biofuel produced from cellulosic sources this year, but as I've pointed out frequently, bleeding edge technology doesn't just appear on command. The EPA's estimate of how much cellulosic biofuel will actually be available in 2010--and thus mandated for use--is just 6.5 million gallons. And if fuel blenders aren't able to acquire even that much, EPA has provided the alternative of paying $1.56/gallon in penalties, instead. That sounds cheap until you realize that this only pays for an attribute; they still have to buy the gasoline or conventional ethanol on which to apply this Renewable Identification Number, or RIN. Based on current prices, the total cost for such virtual cellulosic ethanol could thus exceed $3.50/gal., compared to around $2.00 for wholesale (untaxed) gasoline.
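Here's the rough arithmetic behind that "virtual" gallon, with the wholesale gasoline price as an assumed round number:

```python
# Rough arithmetic on the "virtual cellulosic ethanol" option: pay the waiver
# credit and still buy a physical gallon of fuel to blend. The wholesale
# gasoline price is an assumed round number.
waiver_credit = 1.56        # $/gal cellulosic waiver price from the new rules
wholesale_gasoline = 2.00   # $/gal, assumed untaxed wholesale price

virtual_gallon = waiver_credit + wholesale_gasoline
print(f"Effective cost per 'virtual' cellulosic gallon: ~${virtual_gallon:.2f}")
print(f"Premium over plain wholesale gasoline: ~{virtual_gallon / wholesale_gasoline - 1:.0%}")
```

Either way, a compliance gallon ends up costing nearly 80% more than the gasoline it rides on.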
I confess I didn't make it through the entire 418-page "preamble" to the regulation, but what I found there was a fascinating picture of how much the official view of biofuels has evolved since the Congress set us on this path at the end of 2007. Then, hopes for E85 powering many millions of "flexible fuel vehicles" (FFVs) ran high. Today, reading between the lines, there are hints that EPA might regard E85 as a failed product that may no longer be necessary for pushing biofuel into the market. Their statistics on E85 paint a bleak picture. According to EPA, out of a total retail gasoline market of 138 billion gallons in 2008, E85 accounted for just 12 million gallons. Such low volumes are partially attributable to the fact that there are still only 2,100 retail facilities in the US with an E85 pump, and only 8 million FFVs on the road, out of a US vehicle fleet of 240 million or so. Yet after taking these constraints into account, the EPA calculated that FFV owners bought E85 just 4% of the time. They offer a variety of reasons for this, including concerns about reduced range on the lower-energy fuel, but mainly point to the much higher average price of E85 compared to unleaded regular on an energy-equivalent basis. In other words, consumers are choosing value and maximizing their miles per dollar. So making E85 successful wouldn't just require a big increase in the number of E85 pumps and FFVs; the product would also have to be priced a heck of a lot cheaper than it has been, which would further erode the incentive for dealers to sell what today is a very low-volume product. Catch-22?
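To put some rough numbers on that energy-equivalence problem--the 27% energy penalty is a commonly cited figure for E85, and the pump prices are purely illustrative, so take this as a sketch:

```python
# What E85 would need to cost to compete with gasoline on miles per dollar.
# Assumptions: E85 carries roughly 27% less energy (and thus mileage) per gallon
# than gasoline--a commonly cited figure--and pump prices are purely illustrative.
gasoline_price = 2.60       # $/gal, assumed
energy_penalty = 0.27       # fractional mileage loss per gallon of E85, assumed

breakeven_e85 = gasoline_price * (1 - energy_penalty)
print(f"Break-even E85 price: ~${breakeven_e85:.2f}/gal")

# If E85 instead sells at only a modest nominal discount, say $2.40/gal:
e85_price = 2.40
energy_equivalent_price = e85_price / (1 - energy_penalty)
print(f"Energy-equivalent E85 price at ${e85_price:.2f}: ~${energy_equivalent_price:.2f}/gal")
```

On those assumptions, E85 at $2.40 costs the driver the equivalent of roughly $3.29 gasoline, which goes a long way toward explaining why FFV owners fill up with it so rarely.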
How much of a problem this poses for ethanol producers depends on whether the EPA relaxes the 10% limit on ethanol blended into normal gasoline, as the ethanol industry has petitioned them to do, against most auto industry advice. It also depends on how quickly non-ethanol biofuels such as biobutanol and biomass-derived hydrocarbons--gasoline or diesel from algae, bacteria, or gasification--that would be fully compatible with current cars and infrastructure take off. It's worth noting that the new rules explicitly qualify biobutanol from corn starch in the same category of renewable fuel as the best corn ethanol pathways, and leave the door open to qualify these other fuels if they satisfy EPA's emissions framework. The preamble includes one scenario in which such fuels account for nearly as much of the 2022 biofuel target as corn ethanol.
Needless to say, I haven't had time to go through all the intricate details of the EPA's new RFS regulations. Their ultimate impact may depend as much on some of those nuances as on the big-picture elements I spotted in my cursory review, and I can easily picture a host of law firm, trade association, and energy company personnel poring over them for the next couple of weeks. Still, although what I saw was hardly the death-knell for the existing corn ethanol industry that some might have expected or hoped for, in the process of codifying the means for implementing the intent of Congress in its 2007 legislation the agency has laid out a vision of a much more diverse and competitive biofuel industry than the architects of that bill could have guessed just a couple of years ago.
Labels: algae, biodiesel, biofuel, butanol, cellulosic ethanol, renewable fuel standard, rfs
Wednesday, February 03, 2010
Deficits and Energy
After reading several articles about the administration's proposed 2011 fiscal-year budget, I decided to look through the figures myself. My primary interest was in finding indications of what might lie in store for energy-related taxes and incentives. However, once I noticed how the projected deficits accumulate and examined the assumptions behind them, it struck me that the larger concern for energy and everything else is whether this budget represents a reasonable and sustainable picture of our future national finances. The expected 10-year deficit for the 2009-2018 period appears to have grown by $1.5 trillion relative to last year's budget. And that's after counting roughly $2 T in newly-proposed spending reductions and tax increases, including higher taxes on the energy industry. Against that backdrop the extra few billion dollars for renewables and other favored energy technologies nearly get lost in the rounding.
As a veteran strategic planner, I started by examining the economic assumptions for the budget. While everyone hopes for a strong rebound that would boost tax revenues by moving millions of the un- and under-employed back onto the tax rolls, it seems overly optimistic to assume that on top of an expected 2.7% growth rate in real GDP for this year, real GDP growth would then average 4% per year from 2011-2015 (calendar years.) The last time we had a five-year growth spurt like that was in the late 1990s--thanks to the Tech Bubble--and prior to that in the late 1980s. Yet despite such strong projected growth and the addition of roughly $2 T in "savings" and new taxes, the Treasury would still need to borrow an additional $14 T over the next decade. Even less realistically, perhaps, given such robust growth and massive borrowing, the budget also assumes that consumer-price inflation will not rise above 2.1% for the next decade, while nominal interest rates go up only gradually, never averaging more than 5.3% for 10-year Treasuries.
All this suggests that the current budget might be merely a placeholder awaiting the recommendations of the proposed deficit-reduction commission, while generating a set of figures that just manages to keep the total federal debt level--Table S.14, not the same as the "debt held by the public" shown in summary table S-1--below around 106% of GDP. Of course, this hinges on achieving those higher tax revenues, some from growth and some from higher taxes, including the termination of the Bush tax cuts for "upper-income" Americans. Even if the Congress passed all the required tax legislation, which is not inconceivable since for the biggest portion they'd be voting for a tax cut for everyone except "upper-income" taxpayers, the chances of things turning out even this well seem low. If growth doesn't reach the projected levels and stay there for years, tax revenues will fall short, deficits will grow, and at some point interest rates will rise, requiring even bigger deficits to cover the cost of debt service that under this budget exceeds $800 billion a year by 2020.
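A simple sensitivity calculation shows why that debt-service number is so hard to escape; these are my own round figures for the debt stock and average interest rate, not numbers from the budget tables:

```python
# Simple sensitivity of annual federal debt service to the debt stock and the
# average interest rate. These are my own round numbers for illustration, not
# figures taken from the budget tables.
for debt_trillions in (16, 18, 20):
    for avg_rate in (0.04, 0.05):
        interest_billions = debt_trillions * 1_000 * avg_rate
        print(f"${debt_trillions}T of debt at {avg_rate:.0%}: "
              f"~${interest_billions:,.0f} billion/year in interest")
```

Once the debt stock reaches the high teens of trillions, even average rates well below the assumed 5.3% ceiling for 10-year Treasuries put annual interest in the neighborhood of $800 billion, and it climbs quickly from there if rates rise faster.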
Then there are the tax increases, starting with energy. The big difference vs. last year is the absence of $646 billion from cap & trade. Even if cap & trade is eventually enacted, it now seems likely that most of its proceeds would be rebated to taxpayers or spent on new energy programs, so it doesn't look like a way to close the budget gap. The proposed budget has roughly $3.6 billion per year in increased revenue from eliminating what the oil industry regards as appropriate tax benefits and the administration calls tax loopholes. Either way the budget would increase the cost of producing oil and gas in the US by around $0.60 per barrel of oil equivalent (BOE) after tax. While that won't break the industry, it also won't make US exploration and production any more attractive or competitive. In case you're wondering why we should care about that in light of our new emphasis on green energy, it turns out that the entire energy contribution of the record 10,000 MW of wind turbines installed in the US last year equates to about 100,000 BOE per day, the equivalent of one good-sized Gulf of Mexico oil platform or roughly 0.2% of our total energy consumption. We need more renewables and more conventional energy.
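For anyone curious how 10,000 MW of wind turns into roughly 100,000 BOE per day, here's the conversion I have in mind; the capacity factor and the fossil-equivalent heat rate are my assumptions:

```python
# Converting last year's 10,000 MW of new US wind capacity into oil-equivalent
# terms. The capacity factor and the fossil-equivalent heat rate are my
# assumptions; crediting wind only at its raw 3,412 Btu/kWh energy content
# would cut the result by roughly two-thirds.
capacity_mw = 10_000
capacity_factor = 0.30               # assumed
heat_rate_btu_per_kwh = 10_000       # assumed fossil-plant equivalent
btu_per_boe = 5.8e6                  # Btu per barrel of oil equivalent

kwh_per_day = capacity_mw * 1_000 * capacity_factor * 24
boe_per_day = kwh_per_day * heat_rate_btu_per_kwh / btu_per_boe
print(f"~{boe_per_day:,.0f} BOE per day")
```

Depending on the capacity factor and whether you credit wind at the fuel it displaces or at its raw energy content, that lands in the neighborhood of 100,000-125,000 BOE per day--a worthwhile contribution, but still comparable to the output of a single large offshore platform.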
The budget also includes roughly three-quarters of a billion dollars over 10 years in new fees on "non-producing oil and gas leases." Grounded in the mistaken notion of "idle leases," this was ill-advised last year and remains so, not just because oil companies don't bid on leases to take them off the market and keep them idle--they already pay rentals on any leases that aren't producing, and those leases revert to the government after 10 years--but because adding these fees will merely reduce the up-front bonuses companies would be willing to bid to get them in the first place. As a result, the net revenue from this item ought to be zero.
Of course in terms of total revenue all of this pales in comparison to what the administration expects to collect from upper-income Americans, who seem unlikely to get any more sympathy than the oil companies. (Ironically this segment probably includes the bulk of the potential early buyers for the advanced technology vehicles that the government is lending or granting carmakers billions to produce.) The budget includes about $700 billion of additional revenue over 10 years from reversion to the pre-2001 tax rates for this group, along with some less obvious increases involving phaseouts of itemized deductions and exemptions and the treatment of deductions for those in the new 39.6% federal bracket as though they were incurred in the 28% tax bracket. Together these features would impose effective marginal tax rates much higher than that notional 40% on the folks at the bottom of the new bracket, creating a heck of a disincentive on earning a little more once you're near that threshold. But aside from making additional work or investment unrewarding for those unlucky enough to qualify narrowly for this bracket, this approach increases our collective reliance on this group to fund our government. These folks were already paying 86.3% of the federal income tax before these increases, and that share would go up under this budget. I wouldn't call that either reform or a sound basis for responsible democracy.
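To make the marginal-rate point a bit more concrete, here's a stylized example; the dollar amounts and the 3% phaseout rate are illustrative assumptions of mine, not provisions I've verified line-by-line in the budget documents:

```python
# A stylized illustration of the deduction limitation described above; the
# numbers are mine, not the budget's. A taxpayer in the new 39.6% bracket with
# $20,000 of itemized deductions would get only a 28% benefit from them.
deductions = 20_000
benefit_at_bracket_rate = deductions * 0.396   # $7,920 if fully deductible
benefit_capped_at_28pct = deductions * 0.28    # $5,600 under the proposal
print(f"Extra tax from the 28% cap: ${benefit_at_bracket_rate - benefit_capped_at_28pct:,.0f}")

# Layer on a phaseout that trims deductions by an assumed 3 cents for each
# additional dollar of income, and each extra dollar earned carries roughly
# 39.6 + 0.03 * 39.6 = 40.8 cents of federal tax before anything else--already
# above the notional bracket rate.
extra_income = 1.00
marginal_tax = extra_income * 0.396 + (extra_income * 0.03) * 0.396
print(f"Effective marginal rate with phaseout: {marginal_tax / extra_income:.1%}")
```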
What we're left with, then, is a federal budget that even under a rosy set of assumptions expands the cumulative deficit and total US indebtedness into a range that greatly multiplies the large-scale uncertainties we face, while making minimal cuts to spending and increasing taxes only on unpopular corporations and upper-income Americans. Unfortunately, this scenario doesn't look conducive to generating the enormous private investments in new energy technology and infrastructure that will be necessary and that the government can't afford to make, particularly as mounting debt constrains its freedom of action. We seem to be stuck in a zone in which the only real solutions are unpopular, while most of the ideas that are popular wouldn't be real solutions.
Labels: deficit, natural gas, oil companies, renewable energy, tax
Monday, February 01, 2010
Advantage China?
A spate of articles on China over the weekend, including one in the New York Times entitled "China Leading Global Race to Make Clean Energy," got me thinking about our reaction to such reports. The Times article included some important insights about the role of relative scale and growth rates in fostering the emergence of global wind and solar power competitors from China. From a wider perspective, however, I worry that we're beginning to apply the same kind of mental inflation of competitor attributes that made "Japan, Inc." seem such an overwhelming juggernaut in the late 1970s and most of the 1980s, when it appeared that Japan would dominate every important industry and own every scrap of signature US real estate, starting with Rockefeller Center and Pebble Beach.
In the last decade or so I've watched attitudes toward China evolve from what I used to call "China Big"--an unprecedented opportunity for global companies due to the size of its emerging consumer and financial markets--to something like "China Smarter", which compares that country's growth and the policies that have sustained it to those that helped guide the mature US and European economies down the path of unsustainable asset bubbles. During this interval Chinese renewable energy firms have grown from low-cost suppliers of parts and raw materials for established EU and US equipment manufacturers into integrated competitors in their own right, capable of undercutting the German solar power industry in its home market--to choose just one example.
As the Times points out, China gains a big edge in renewable energy because its entire power sector must grow so rapidly to support economic growth that is expected to average 8% this year, after a decade of double-digit growth interrupted only by last year's dip to 6% or so. That means that while renewables are still more expensive than the coal power plants that have dominated the Chinese market, they don't have to compete head-to-head with them; there's enough growth for all. Contrast that to a US power market that has shrunk by an astonishing 6% since 2007, instead of continuing to grow at its formerly-dependable 1-2% per year pace. The size of China's domestic expansion and the urgency of keeping it going, together with the increasing sophistication of its low-cost manufacturing base, make it nearly inevitable that China would become a serious competitor in an industry for which the biggest factor governing market penetration--other than the degree of regulatory and subsidy support renewables receive--is making them more cost-competitive with traditional energy sources. The more that depends on experience-curve effects rather than technology breakthroughs, the more this competition will favor China, for now. Throw in concerns about access to the rare earths and metals required by much of this technology, and China's long-term advantage in renewables looks even bigger.
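Since so much hinges on those experience-curve effects, here's a quick sketch of how they work; the 20% learning rate is an assumed figure of the kind often cited for solar modules, not something from the Times article:

```python
# Experience (learning) curve: unit cost falls by a fixed fraction with each
# doubling of cumulative production. The 20% learning rate is an assumed figure
# of the sort often cited for solar PV modules, not a number from the article.
import math

def unit_cost(cumulative_units, base_units, base_cost, learning_rate=0.20):
    """Unit cost once cumulative output reaches cumulative_units."""
    doublings = math.log2(cumulative_units / base_units)
    return base_cost * (1.0 - learning_rate) ** doublings

# An 8-fold expansion of cumulative output (three doublings) cuts unit cost ~49%:
print(f"{unit_cost(8, 1, 1.0):.2f}")
```

Under that dynamic, whoever builds and installs the most, soonest, rides furthest down the cost curve--precisely the advantage that China's scale and growth confer.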
I don't want to seem blasé about the challenge this represents, but I also think we should keep it in perspective, as we often failed to do concerning Japan in the 1980s, when its keiretsu companies seemed 10 feet tall and business bestsellers touted Japanese management techniques and warned that Japan was on the verge of overtaking the US in the global economy. Again, consider renewable energy. In 2008 the value of all wind turbines installed globally was on the order of $70 billion and for grid-connected solar power hardware around $20 billion, out of global renewable energy investments of $120 billion. That puts global wind and solar equipment sales at roughly the level of US aerospace exports for 2008, and about half the size of the total US aerospace market. That's big enough to want to retain a meaningful share of the market, but not so big that the entire economy depends on it. Or does it?
The Times article included the worrying suggestion that the US might someday be as dependent on imported Chinese renewable energy gear as it currently is on imported oil from the Middle East--never mind that the latter made up just a fifth of net US oil imports and 12% of total US oil supplies in 2008. Yet even if that analogy were correct, there's a huge difference in the economic and security implications of these two positions. We understand from experience that even a partial suspension of US oil imports would create an immediate price spike and send a shock throughout the economy. It's hard to see how the impact of even a complete embargo on sales of wind and solar equipment from China to the US could ever approach that. Although curtailed renewable energy equipment imports might disrupt the activities of companies installing them and spoil the returns of those parties financing them, existing facilities would keep turning out power. Once you've imported a wind turbine or solar module and set it up, you own it and its output until it wears out. These risks simply don't equate in the manner the Times asserts. Moreover, they are naturally limited by the significant practical challenges faced by intermittent and cyclical power generation technologies. Just read the DOE's analysis of a 20% wind power scenario to see what's necessary to achieve even that threshold.
Unfortunately, concerns about China's advances in renewable energy carry extra weight, because they align with a larger pattern of China envy exemplified by the talk of a "Beijing Consensus" that Tom Friedman apparently encountered at the World Economic Forum in Davos. China's "Confucian-Communist-Capitalist" model certainly offers speed and clarity of purpose that our own system has matched only at times of immediate national crisis. However, it's worth recalling that in the 1930s the Soviet and Italian models had their admirers here, too, for their ability to get things done, compared to the messiness of a capitalist democracy. However discredited the US economy may look after a couple of bad years, I'll take that messiness, as long as we don't manage to kill the innovative spirit--and the incentives that drive it--that enabled us to adapt the best of Japan's ideas while continuing on a trajectory that eclipsed Japan's success over the last two decades, even when you factor in the Great Recession. I'm more worried about navigating the geopolitical challenges that China's rise will create over the next few decades, and ensuring that they don't end in the kind of confrontation that resulted from Germany's rise a century ago.
Labels: China, economic growth, japan, renewable energy, solar power, subsidy, wind power