Thursday, April 29, 2010

Energy from All Around Us

It's somewhat ironic that the long-awaited approval of the Cape Wind offshore wind project by the Department of Interior (DOI) should come in the same week that the nation's attention is focused on the problems of another, more traditional offshore energy project. Although the renewable electricity from the former scarcely substitutes for petroleum from the latter, Cape Wind is nevertheless emblematic of an intentional shift from energy sourced far away, in places like the deepwater Gulf of Mexico, to energy derived from sources all around us. If Secretary Salazar had turned it down, it would have cast serious doubts on the administration's entire clean energy agenda. However, concurrence with this one project doesn't answer all questions concerning the larger shift, of which it represents just a small component. Similar issues are bound to come up with increasing frequency as the transition to new energy continues.

Cape Wind and the Macondo prospect that the Deepwater Horizon rig was drilling into represent opposite poles of the energy spectrum, and not just because the latter is now leaking oil into the marine environment at a rate that the latest estimate puts at 5,000 barrels per day, much higher than initially thought. Cape Wind would tap into the clean and renewable, but extremely diffuse energy sources that surround us. After taking into account the restrictions imposed by DOI, its 130 turbines would on average generate as much electricity as a gas turbine power plant consuming a quantity of natural gas equivalent to 6,000 bbls/day of oil. In other words, it takes a very large array of offshore wind turbines to match the energy in the oil currently leaking from a single well. Platforms similar to what BP might have been planning to install after successfully completing the exploration of Macondo routinely produce up to 20 times that much oil.
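
For anyone who wants to check the scale of that comparison, here's a back-of-envelope sketch in Python. The turbine rating, capacity factor and gas-plant heat rate are my own round-number assumptions, not project figures:

```python
# Rough comparison: Cape Wind's average output vs. a gas-fired plant
# burning 6,000 bbl/day of oil-equivalent natural gas.
TURBINES = 130
MW_EACH = 3.6           # assumed nameplate rating per turbine, MW
CAPACITY_FACTOR = 0.37  # assumed offshore wind capacity factor
HEAT_RATE = 7000        # assumed combined-cycle heat rate, BTU/kWh
BTU_PER_BBL = 5.8e6     # standard energy content of a barrel of oil

avg_mw = TURBINES * MW_EACH * CAPACITY_FACTOR        # ~173 MW average
kwh_per_day = avg_mw * 1000 * 24
bbl_per_day = kwh_per_day * HEAT_RATE / BTU_PER_BBL  # oil-equivalent gas

print(f"Average output: {avg_mw:,.0f} MW")
print(f"Oil-equivalent gas displaced: {bbl_per_day:,.0f} bbl/day")
# ~5,000 bbl/day with these assumptions, in the same ballpark as the
# 6,000 bbl/day figure above
```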

The implications of this huge difference in energy density are clear. Without the energy concentration that nature has embedded in fossil fuels over many millennia, the hardware required to tap natural energy flows in real time becomes vast in extent. Generating 20% of US electricity needs from wind, which some see as just the beginning, will ultimately require more than 8 times as much wind capacity as the 35,000 MW installed as of the end of last year, even if US electricity demand remains static in the interim. Solar power, which last year generated just 0.02% of our electricity, would have to increase by a much larger factor. This is one of many reasons that increased reliance on nuclear power is such an important element of the transition to more sustainable energy sources, because nuclear--and to a lesser extent geothermal power--represents a critical source of highly-concentrated, low-emission energy. The more nuclear in the mix, replacing baseload coal, the less we must rely on distributed energy gathered in our immediate vicinity.
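
The same kind of rough arithmetic shows where the "more than 8 times" figure comes from; the total-generation figure and fleet capacity factor below are my assumptions:

```python
# Wind capacity needed to supply 20% of US electricity.
US_GENERATION_TWH = 3950  # approx. 2009 US net generation (assumption)
WIND_SHARE = 0.20
WIND_CF = 0.30            # assumed average fleet capacity factor
INSTALLED_GW = 35         # ~35,000 MW installed at end of 2009

needed_twh = US_GENERATION_TWH * WIND_SHARE            # 790 TWh
needed_gw = needed_twh * 1e12 / (WIND_CF * 8760 * 1e9)
print(f"Wind capacity needed: {needed_gw:,.0f} GW, or "
      f"{needed_gw / INSTALLED_GW:.1f}x the current installed base")
# ~300 GW, roughly 8.6x today's 35 GW, before any demand growth
```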

In any case, in order to obtain a much larger portion of our energy diet from sources like onshore and offshore wind and solar power, projects like Cape Wind must go from being rarities to ubiquitous features of our seascapes and landscapes. The opposition to Cape Wind that has delayed this project for years is focused on a central dilemma of that shift: Many of the same underlying trends that lead us to want to harness clean energy from wind, sunlight and geothermal heat have also increased our focus on the broadly-defined environmental impacts of doing so.

Our grandparents wouldn't have blinked at putting up tens of thousands of wind turbines, let alone the few hundred slated for Nantucket Sound. They'd have thought of them as signs of progress, just as they viewed oil derricks and power lines. It's incumbent on us to balance our more modern sensibilities related to the "viewscape" with fundamental environmental challenges of climate change and sustainability, as well as the need to sustain the energy supplies our civilization requires. Approving Cape Wind--whether it eventually gets built or not--is entirely consistent with those imperatives.

Tuesday, April 27, 2010

Assessing the Impact of Deepwater Horizon

As I noted Friday, it's still early to try to assess the full impact of the disaster that befell the Deepwater Horizon drilling vessel last week. Having known many people over the course of my career who were engaged in this kind of work, my first reaction is one of deep sympathy for the missing--now presumed lost--and their families, friends and co-workers, and for those who were injured. At the same time, speculation is mounting concerning the ramifications for US energy policy. For those of us who have supported expanded drilling off the US coastline, this accident comes as an unpleasant reminder that such activities can never be undertaken entirely without risk. If we are indeed going to resume drilling in regions beyond the central and western Gulf of Mexico, we should go into it with our eyes wide open: understanding the risks, but also putting them into perspective.

My grad school statistics professor always reminded us that whatever the odds of something happening might have been beforehand, after the fact they reduced to zero or one. If you were involved in a car accident today, it doesn't matter much to you that the chances were very low, or that a couple of hundred million other people weren't. But while reminders of the very low probability of accidents such as the destruction of the Deepwater Horizon (DH) drilling platform and the ensuing oil leak might seem a little lame at this point, it remains crucially important to any decisions on future drilling policies whether such events are rare or common. And the fact that a similar, though less catastrophic, accident occurred in the Timor Sea off Australia last summer doesn't make offshore drilling blowouts frequent occurrences. In fact, they are as rare as hens' teeth. According to the latest rig count data from Baker Hughes, there are currently 53 drilling rigs operating in US waters. Together with the more than 3,500 operating oil & gas platforms in the portions of the Gulf of Mexico tracked by the Minerals Management Service of the Department of Interior, that represents many thousands of offshore wells drilled, over many years, without an incident like the one we've just seen.

Another key aspect of assessing the risks of offshore drilling relates to the quantities of oil spilled. The well that DH was drilling now appears to be leaking around 1,000 barrels per day, or 42,000 gallons per day. If it has been leaking steadily at that rate since the platform sank last Thursday, then cumulative oil leaked so far is something like 5,000 barrels, or just under 210,000 gallons. That's hardly insignificant, particularly given the way even a small quantity of oil can spread out across a wide expanse of water as a thin film. Yet in relative terms, this is still modest, compared to other sources of marine oil spills. The shipping industry measures spills in tons of oil. On that scale, the volume that the DH well has presumably leaked so far works out to roughly 700 tons. The International Tanker Owners Pollution Federation Ltd. (ITOPF) has identified 444 oil spills of equal or greater magnitude since 1970, plus another 1,200 between 7 tons (50 bbls) and 700 tons. (The annual number of such spills has been declining steadily, even as the volume of ship-borne trade has grown.)
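
The conversions behind those figures are simple enough to verify; the barrels-per-ton factor below is an assumption for a typical medium crude:

```python
# Converting the leak estimate into gallons and tons.
GAL_PER_BBL = 42
BBL_PER_TONNE = 7.33   # assumed for a medium crude (~0.86 g/cm3)

leak_bbl_per_day = 1000
days_leaking = 5       # since the platform sank last Thursday
total_bbl = leak_bbl_per_day * days_leaking
print(f"{total_bbl:,} bbl = {total_bbl * GAL_PER_BBL:,} gallons "
      f"= {total_bbl / BBL_PER_TONNE:,.0f} tons")
# 5,000 bbl = 210,000 gallons = ~680 tons, i.e. roughly the 700 tons
# used to place this spill on ITOPF's scale (7 tons ~ 50 bbl)
```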

Nor are the production and shipping of petroleum the largest sources of oil leaked into the marine environment. According to a study reported by NASA in 2000, the annual oil seepage from natural sources in the Gulf of Mexico amounts to roughly twice the quantity spilled from the Exxon Valdez. On that basis, if the DH well leaked all year at its current rate, it would not equal the natural leakage rate in the Gulf. Now, it's a lot different having that much oil leak from a single point on the seabed than from hundreds of scattered spots, where natural processes can break down most of it before it reaches shore. Nevertheless, as confirmed by a 2003 study published by the National Research Council, from 1990-1999 oil platforms contributed only a tiny fraction of the oil leaked into the global marine environment, with most of it attributable to natural seeps, spills during oil consumption, and shipping.

However we assess the cost of such spills when they happen, the benefits of offshore drilling are still compelling. Despite a 9% reduction in demand since the start of the recession, the US consumes more than 18 million bbls per day of oil, and as of last year, 52% of that was supplied by net imports. When you back out the oil we receive by pipeline from Canada, that works out to more than 9 million bbl/day of imports arriving by sea--with all the risks of tanker spills that entails. Even with oil prices as low as they were last year, the cost of our net oil imports accounted for roughly 11% of our total import bill in 2009, and thus for at least that much of our trade deficit. Even vigorous efforts to reduce oil consumption (and its associated emissions) by improving fuel economy and ramping up renewable energy do not diminish the need to produce as much oil from domestic sources as we can. Biofuels still meet just 4% of our liquid fuels needs, and additional wind, solar and geothermal power don't displace any significant quantity of oil, because oil accounted for less than 1% of US power generation last year.

All of this is small consolation to the residents of coastal communities that are worried that an oil slick could come onshore within a few days. I am also sympathetic to concerns such as those expressed in editorials like this one in a Pensacola, FL newspaper. I grew up in a beach town in California, where memories of the much larger Santa Barbara spill in 1969 still affect the public's view of offshore drilling. But rather than seeing Deepwater Horizon as a reason to call a halt to President Obama's plans to expand offshore drilling, I view it as similar to an aviation accident: a tragedy that calls for improved technology and procedures, not an end to flight. It also reinforces my conviction that it is only right and fair for states like mine that are adjacent to the federal waters in which drilling would occur to share in the royalties this will generate. Politicians in inland states might see those royalties as belonging to the whole country, but their states won't bear any of the risk, no matter how small it might be before the fact.

Friday, April 23, 2010

Eating the Seed Corn

Some days it's hard to find a salient topic on which to blog. Today I'm spoiled for choice but wish I weren't, at least in the case of one of the three I considered. The full implications of the Deepwater Horizon disaster won't be known until rescue efforts end, the well is brought under control, and the resulting oil spill contained. That doesn't prevent speculation and knee-jerk responses, but I'll reserve my analysis until the facts are clearer. Meanwhile, the EU has been forced to release yet another study finding that many biofuels could be worse for the environment than the petroleum products they are intended to replace. I'll say more about the issues that raises, soon. For today, I want to focus on the challenge that Bill Gates highlighted in an op-ed published in today's Washington Post, concerning the need for significantly more energy R&D spending by the US government.

Since recently turning his attention to energy, Bill Gates has made some astute observations about it, while falling into few of the traps that await those attempting to transfer their high-tech experience to this much larger, more basic industry. Past remarks suggest he grasps the scale of the problem. His recommendation for more innovation, and his explanation of why energy R&D has been underfunded by the public and private sectors, are apt, though I'm less sure that the R&D investment rates of firms whose business is selling technology provide quite the right basis of comparison for an industry that produces vast quantities of interchangeable commodities. Nevertheless, he's right that discovering and developing revolutionary energy technologies is beyond the scope of even most companies that operate on a large enough scale to afford the sums required. Most R&D by major oil & gas or power generation companies is devoted to improving what they're already doing, for good reasons. Things that don't deliver prompt results inflate costs without providing immediately-offsetting benefits, making companies pursuing such efforts less competitive in the market and often less attractive to investors.

Government doesn't have these constraints, and historically it has been a relatively uncontroversial role of government, even in the US, to devote significant resources to long-term projects. (The old Bell Labs looks like an exception, until you consider that most of its truly ground-breaking work in basic science was undertaken when its corporate parent functioned as a tightly-regulated monopoly--effectively an extension of government.) Mr. Gates suggests that spending less than $3 billion per year on clean energy research is inadequate, and I must agree. However, Gates stops short of explaining that the federal government already spends much more than that on clean energy, but that most of it is focused on the deployment of current technologies. As of its most recent update, the US Treasury had issued more than $3 billion in Renewable Energy Grants to wind, solar, biomass and geothermal project developers under the stimulus, and this is just a fraction of what the government is now spending on direct and indirect subsidies, tax incentives, loans and loan guarantees to support the deployment of corn ethanol and advanced biofuels; wind turbines, solar panels and the factories to make them; factories to build electric vehicles and the advanced batteries to power them; as well as new nuclear power plants, to name a few.

The innovation imperative articulated by Bill Gates thus stands in tension with a range of policies focused on overcoming the market barriers that current alternative energy technologies face. For some of these technologies, the case for providing temporary subsidies to enable them to become sufficiently established to benefit from economies of scale and experience-curve effects is solid. The same can be said for assisting a new generation of nuclear power plants that are so expensive that no company can risk its entire future to build the first one. This logic is much less compelling for conventional biofuels that are still not competitive without subsidies that are now in their fourth decade, or federal loans to finance EV factories for companies that have made just a handful of hand-built cars in their entire existence.

Without a major new funding source dedicated to long-term federal energy R&D, and in an era of increasingly-unsustainable budget deficits, the choice between energy R&D and the deployment of existing energy technologies becomes a zero-sum game. Eating the seed corn in this manner is indefensible, particularly when we realize that deploying today's technologies is very unlikely to yield a technology breakthrough of the kind that real energy transformation will ultimately require. While hardly simple or easy, the remedy is relatively obvious. If we agree that we need more federal research on new energy technologies that could supplant conventional energy sources on the basis of superior performance, and not just lower emissions, then we must wean current alternative energy technologies off massive subsidies as quickly as possible--in a few years at most, except for those that have been feeding at the federal trough for decades. Those should begin to be phased out at once. And if we agree that climate change is a serious risk that we must address urgently, then putting a price on the emissions contributing to it would ensure that the companies making and installing today's wind turbines, solar panels, and other technologies would have a decent chance of surviving the elimination of the direct subsidies upon which they now depend--subsidies that will otherwise eventually squeeze out R&D spending.

Wednesday, April 21, 2010

Earth Day Cold Turkey

I was perusing the 2010 Earth Day website and ran across its online petition in support of comprehensive energy and climate legislation. Aside from the expected references to green jobs, energy independence and solving climate change, I was struck by the tone of the declaration, which called for "sending a powerful message to the polluter lobby: we've had enough of dirty power, the time for change is now." I might have had a different reaction to this on another day, but for some reason it occurred to me today to wonder what would happen if the energy industry immediately capitulated to this demand. What would it mean if every power plant burning coal, oil or natural gas shut down today and remained idle? The short answer is chaos and social collapse, but let's take a quick look at why.

Before getting into the underlying numbers I can't resist the opportunity to point out that this kind of language is an understandable consequence of the decision by the Supreme Court to label greenhouse gases as "pollution." Pollution is inherently awful and emotionally energizing, while emissions are, well, more complicated and nuanced. Unfortunately, while the risks that go with climate change pose very serious problems, the solutions are not nearly as simple as the solutions to the kinds of pollution that we've been accustomed to dealing with under the legislation and regulations that the first Earth Day and its anniversaries helped to trigger. I admit this distinction has become a lost cause, but no one should be surprised by the passion and vitriol concerning greenhouse gas emissions that has resulted from this choice.

So what if we took the Earth Day petition at face value and bypassed the whole decades-long transformation that cap & trade, a national renewable electricity standard, and various other pending emissions regulations would set in motion? What if we simply shut down every fossil-fuel-burning power plant today? After all, we have all these new wind turbines and solar panels--record amounts of which were installed here last year--and we still have thousands of hydroelectric dams and 104 nuclear power plants. Together they produce vast amounts of electricity, and surely with a bit more efficiency we could make do with that, while building more wind, solar and geothermal capacity as fast as possible to keep the economy growing. Well, as it turns out, all renewable sources plus nuclear generated a bit over 1.2 trillion kilowatt-hours (kWh) last year. That's certainly more than the entire electrical output of many other countries. According to data from the Energy Information Administration, it exceeds all the power generated in 2008 in Japan, or in France and Germany combined. Unfortunately, it's also less power than the US has generated in any year since 1966.

On the face of it, that's not a fair comparison. Although we still have plenty of room for improvement--energy efficiency remains one of the most promising sources of emissions reductions available--the US actually uses energy much more efficiently today than it did in '66. One measure of that is energy consumption per dollar of real GDP. On that basis, it takes just half as much energy to produce the same output as it did when "Eleanor Rigby", one of my favorite Beatles songs, debuted. If we adjust for energy:GDP, then 1979, with its net generation of 2.25 trillion kWh, looks like a more appropriate basis of comparison to the economic work that our current zero-emission power output could do. The problem is that the US population has grown by 84 million people since then, and our economy, expressed in constant dollars, is more than twice as big as in '79--even after last year's contraction. It might even be worse than that, because we've electrified a lot of things that were previously run directly by some sort of fuel, so that slashing our power output by 70% wouldn't just cut our economy by half; it would probably force us to choose among some very high-priority uses for power that might not include the PC or other device on which you're reading my words. To say that electricity rates would have to go up dramatically is an understatement, and we haven't even accounted for the intermittency of wind and solar, which can't replace baseload coal power or on-demand gas-fired power.
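
A quick sketch of that arithmetic; the figure for total 2009 generation is my assumption, while the others come from the paragraphs above:

```python
# The shutdown scenario in numbers.
TOTAL_2009_TWH = 3950     # approx. US net generation in 2009 (assumption)
ZERO_EMISSION_TWH = 1200  # renewables plus nuclear, from above
GEN_1979_TWH = 2250       # 1979 net generation, from above

cut = 1 - ZERO_EMISSION_TWH / TOTAL_2009_TWH
print(f"Shutting down fossil generation cuts supply by {cut:.0%}")  # ~70%

# With energy use per dollar of real GDP roughly halved, today's
# zero-emission supply does the economic work of about twice as much
# generation in the earlier period -- which is why 1979 is the benchmark:
print(f"Adjusted equivalent: ~{2 * ZERO_EMISSION_TWH:,} TWh "
      f"vs. 1979's {GEN_1979_TWH:,} TWh")
```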

There's no need to stretch this highly-simplified scenario any farther. Like it or not, we can't yet live without the power we get from all those nasty fuels we're still burning, and the day when we can is not just around the corner. After all, the wind, solar and geothermal power sources we've focused intensely on expanding accounted for just 2% of the electricity we used last year. Double them, and then double them again (10 years?) and that's still only 8%, compared to the 69% we got from fossil-based generation last year. The Senate climate bill that's expected to be released in the next week or so might well move us in the direction that the signers of today's Earth Day petition want, but that evolution can't happen nearly as fast as many of them have been led to expect.

Monday, April 19, 2010

Electric Cars and Natural Gas

Two items in the weekend Wall St. Journal caught my attention. The first concerned the mileage ratings of electric vehicles, with the EPA apparently reconsidering its initial methodology with an eye to making it better reflect reality. The second reported on a meeting of natural gas exporting nations, which seem to be backing away from notions of OPEC-style gas output cuts. While these stories appear entirely unrelated, at least in any cause-and-effect sense, they intersect in interesting ways. That's because natural gas has largely replaced fuel oil as the link between electricity markets and the world of hydrocarbons, while becoming a viable alternative vehicle fuel in its own right. Any shift away from oil-based transportation fuels toward either electric- or natural-gas-powered vehicles could be hindered, if gas prices started to behave like oil prices.

As the article on EV fuel economy reminds us, GM and Nissan made headlines last year with eye-popping mpg estimates for their Volt and Leaf electric vehicles, respectively. However, as I noted at the time, it is simply not realistic to apply a theoretical energy conversion equating the energy in a kilowatt-hour of electricity to the BTUs delivered by a gallon of gasoline without taking into account the means by which it was generated. According to the Journal, Nissan's 367 mpg claim was based on a calculation using 82 kWh/gal. That implies that it takes just 1,414 BTUs to generate each kWh of electricity used by the Leaf. Physics tells us that isn't possible, with 3,412 BTU/kWh as the theoretical minimum and real-world values much higher. Perhaps the earlier methodology reflected assumptions about the fraction of the time the Leaf might be expected to recharge on surplus wind or solar power, for which no fossil fuels are consumed. At this point any such assumptions look premature, at best.
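
That physics check takes one line of arithmetic, using the 116,000 BTU/gallon gasoline energy content cited later in this post:

```python
# Sanity check on the 367 mpg claim.
BTU_PER_GAL = 116000       # energy content of a gallon of gasoline
KWH_PER_GAL_CLAIMED = 82   # the conversion reportedly behind 367 mpg
PERFECT_CONVERSION = 3412  # BTU/kWh if generation were 100% efficient

implied_heat_rate = BTU_PER_GAL / KWH_PER_GAL_CLAIMED
print(f"Implied heat rate: {implied_heat_rate:,.1f} BTU/kWh")  # ~1,414.6
print(f"Physically possible: {implied_heat_rate >= PERFECT_CONVERSION}")
# False -- the claim implies less fuel per kWh than perfect conversion
```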

Several years ago, the Pacific Northwest National Laboratory evaluated US power generating capacity to determine the level of EV market penetration that could be accommodated without building more power plants. Their conclusion that 84% of the cars on the road could be electrified without exceeding the capacity of existing power plants surprised a lot of people, and it has been cited many times since--usually without attribution--as evidence that EVs are a practical alternative to imported oil. The aspect of the study's findings that often gets ignored is that the unused capacity available to power EVs came mainly from gas turbines that are used to meet peak power demand and back up the intermittent output of renewables such as wind and solar power, and are thus idle for many hours a day. Yet while wind and solar have both grown substantially since the 2006 PNNL study, their contribution to actual US net generation has still only increased from 0.6% to 1.8% of the total--not enough to alter the conclusion that for the time being any incremental power consumed by EVs will come mainly from natural gas and other fossil fuels.

In that light, realistic fuel economy estimates for EVs must incorporate reasonable estimates of the amount of gas needed to generate each kWh used. Depending on the applicable gas turbine configuration, which would vary by time-of-day and market, that could range from 7,000 to 12,000 BTUs or more. Even if we used a conservative figure of 8,000 BTU/kWh, that means that the amount of natural gas equivalent to one gallon of gasoline (carrying 116,000 BTUs) would generate at most 14.5 kWh of power. If the previous 367 mpg estimate for the Leaf was truly based on an assumption of 82 kWh/gal., then its effective fuel economy might actually be no higher than about 65 mpg. That's still impressive, and it would save a lot of oil, but does it represent enough of an improvement over a Prius-type hybrid--or compared to the Chevrolet Volt, which the Journal cites as getting 50 mpg on its range-extending generator after the initial battery charge has been depleted--to justify the lifestyle constraints of a 100-mile range and recharging times measured in hours? More fundamentally, is this even the best use of the natural gas involved, compared with backing out coal-fired power generation and its high CO2 emissions, or using the gas directly as a vehicle fuel, particularly for trucks and delivery vehicles, as proposed by Mr. Pickens?
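
Before turning to those questions, here's the arithmetic behind that ~65 mpg figure, using only numbers already cited here:

```python
# Effective fuel economy of the Leaf on gas-fired electricity.
BTU_PER_GAL = 116000
HEAT_RATE = 8000          # conservative gas-fired heat rate, BTU/kWh
MILES_PER_KWH = 367 / 82  # implied by the 367 mpg / 82 kWh-per-gal claim

kwh_per_gge = BTU_PER_GAL / HEAT_RATE  # kWh per gasoline-gallon-equivalent
effective_mpg = MILES_PER_KWH * kwh_per_gge
print(f"{kwh_per_gge:.1f} kWh per gallon-equivalent of gas")  # 14.5
print(f"Effective fuel economy: ~{effective_mpg:.0f} mpg")    # ~65
```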

While the answer to the latter question is neither trivial nor obvious, all of these options hinge on natural gas being both plentiful and cheap, especially relative to crude oil. You've heard a lot about the impact of the shale gas revolution on gas supply and pricing in North America. Because the US now needs less imported gas to meet demand, and because domestic gas looks plentiful for decades to come, commodity gas on the Gulf Coast now trades for just 1/20th the price of crude oil. That means that the natural gas energy equivalent of a barrel of oil is selling for just $23.50. Even at the roughly $6/MCF indicated for December 2010 gas futures, that's still just $35/bbl. However, the more we rely on gas to generate electricity--to meet incremental demand, including from EVs, and to back out higher-emitting sources like coal--and the more gas we put directly into vehicles, the likelier it is that we'll need to import LNG to balance supply and demand. If the international gas market were controlled by an OPEC-like cartel that was able to constrain output to put pressure on prices, then eventually this would translate into higher gas prices here--closer to crude oil's--and that would make both natural gas vehicles and EVs running on gas-generated power less competitive with fuel-efficient gasoline and diesel cars. So for both EVs and NGVs, it's good news that the gas producers meeting in Algeria seem unlikely to be able to match OPEC's market power any time soon.
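
For reference, here's the oil-parity arithmetic behind those dollar figures; the ~$4.05/MCF spot price is inferred from the $23.50 equivalence rather than quoted from the market:

```python
# Natural gas priced in oil-equivalent terms.
MMBTU_PER_BBL = 5.8   # million BTU per barrel of crude
MMBTU_PER_MCF = 1.0   # ~1 million BTU per thousand cubic feet (approx.)

for gas_price in (4.05, 6.00):  # $/MCF: inferred spot, Dec-2010 futures
    oil_parity = gas_price / MMBTU_PER_MCF * MMBTU_PER_BBL
    print(f"Gas at ${gas_price:.2f}/MCF = oil at ~${oil_parity:.2f}/bbl")
# ~$23.50/bbl and ~$35/bbl respectively -- far below prevailing crude prices
```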

Thursday, April 15, 2010

Drilling and Climate Change

For the last week or so I've been scratching my head over news that the Bureau of Land Management, an agency of the Department of Interior, had delayed an oil & gas lease sale in Montana and the Dakotas by at least five months, while it studies how oil field activities contribute to climate change. BLM's press release on the subject mentioned a lawsuit related to a previous lease sale in the area, and I finally got around to tracking down what's behind all this. On one level it might be viewed as an effort by concerned citizens to ensure that BLM complies with the law regarding the environmental impacts associated with its activities, and that oil & gas development takes place in the most efficient and environmentally-responsible manner possible. Yet after reading through the filing in protest of the planned April 13 lease sale from the environmental groups behind the earlier lawsuit, I see a much more worrisome strategy aimed at impeding not just one sale, but the broader development of oil and gas resources that are essential to the energy and economic security of the US.

Let's start by stipulating that when you drill for oil, the various processes involved in producing, transporting, and refining it for use as fuels for vehicles and feedstocks for industry emit greenhouse gases (GHGs). Thus the basic concern behind the objections of the Montana Environmental Information Center, Earthworks Oil & Gas Accountability Project, and WildEarth Guardians is simultaneously validated and rendered as a blinding glimpse of the obvious. It's also true--though you'd never know it from their filing--that the emissions from oil & gas exploration and production represent just a fraction of the lifecycle emissions associated with their use. How large a fraction depends on the specifics of composition, geology and location, but in general roughly 80% of the emissions occur during consumption, with refineries accounting for more of the "upstream" emissions than exploration, production and transportation do. According to Shell's 2008 GHG reporting, that company's emissions from exploration and production averaged 0.11 metric tons of CO2-equivalent (CO2e) per metric ton of production. Using basic conversion factors, that works out to around 0.8 lb of CO2e emitted per gallon, compared with roughly 20.4 lb of CO2e from burning a gallon of gasoline in your car.
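
Here's one way to reproduce that conversion; the crude density behind the barrels-per-ton factor is my assumption:

```python
# From Shell's upstream intensity to pounds of CO2e per gallon.
T_CO2E_PER_T_OIL = 0.11  # Shell's 2008 E&P intensity, per the reporting above
LB_PER_KG = 2.2046
BBL_PER_TONNE = 7.33     # assumed for a typical medium crude (~0.86 g/cm3)
GAL_PER_BBL = 42

lb_co2e_per_tonne = T_CO2E_PER_T_OIL * 1000 * LB_PER_KG  # ~243 lb/tonne
gal_per_tonne = BBL_PER_TONNE * GAL_PER_BBL              # ~308 gal/tonne
upstream_lb_per_gal = lb_co2e_per_tonne / gal_per_tonne
print(f"Upstream: ~{upstream_lb_per_gal:.1f} lb CO2e/gal")        # ~0.8
print(f"Share of lifecycle: "
      f"~{upstream_lb_per_gal / (upstream_lb_per_gal + 20.4):.0%}")  # ~4%
```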

In other words, except for emissions-intensive extraction processes like oil sands, the fraction of greenhouse gas emissions attributable to getting oil out of the ground and to a refinery amounts to no more than 3-5% of the total lifecycle emissions from petroleum. As a reality check on this I looked up the emissions for Production Field Systems within Petroleum Systems in the most recent EPA Greenhouse Gas Inventory. At 29 million tons per year of CO2e, including the methane emissions that the environmental groups objecting to the April 13th lease sale were so concerned about, this constitutes just 0.5% of total US energy-related emissions. Even if you cut them by half, the impact on global climate change would be negligible, and that's why the effort to derail the Montana and Dakotas lease sales makes me suspect it is aimed at more than just promoting lower-emitting drilling practices. When you include the groups' references to "climate tipping points" and the speculative impact of localized "CO2 domes", what these folks really seem to have in mind is shutting down large portions of US oil & gas production entirely.

The leasing delays these groups have already achieved and the prospect of further delays beyond September, whether in additional BLM reviews or in the courts, are of particular concern, because the leases in question fall mainly within the Williston Basin of Montana, North Dakota and South Dakota. Many of them appear to overlap the Bakken Formation, which the US Geological Survey estimates to contain undiscovered, technically-recoverable oil resources of up to 4 billion barrels. So they're not just holding back a few marginal wells in the middle of nowhere; they're impeding development of a region that, with help from improved technology, is becoming one of the most important domestic oil sources in the onshore lower-48 states--even without all the hype about the Bakken that has been circulating in the blogosphere and the email rumor mill.

BLM must follow the law in terms of evaluating the environmental impact of drilling activities on its oil & gas leases, but that surely does not extend to withholding leases from future sales on the basis that drilling them in any manner will make climate change worse. Whatever the motives of the groups attempting to block the legitimate exploitation of the oil resources of the Dakotas and Montana, they are tackling the climate problem from the wrong end. Impeding US production does little or nothing to reduce end-user consumption, where most of the GHG emissions in the oil and gas value chain occur. So while not reducing emissions in any globally- or even locally-meaningful way, they are making our trade deficit bigger. If successful, they might even inadvertently increase global GHG emissions, depending on where and how the additional oil we must import is produced. I hope that BLM can dispense with these misplaced efforts at obstructionism and reschedule the April 13th lease sale promptly.

Tuesday, April 13, 2010

Fueling Wind's Surge

Last week the American Wind Energy Association (AWEA) released its annual report on the US wind industry. By most measures it reflects another banner year. New wind power installations topped 10,000 megawatts (MW) for the first time, bringing cumulative capacity to just over 35,000 MW, roughly 93% of which was added in the last 10 years. Last year's increase was sufficient to expand US wind generation from 1.3% to 1.8% of total US net power generation, though a fifth of that gain in market share was the result of a 4% decrease in the denominator, due to the recession. The portions of the report that I was able to access provided much food for thought, particularly with regard to the government policies that helped the wind industry overcome the near-paralysis it faced when financial markets froze in late 2008. A crucial contributor to that recovery is now due to expire at the end of the year, and the industry's supporters are already calling for its extension.

Most of AWEA's 2009 report is only available to non-members for a fee. Some of the information I'll be referring to can be found in the media version, but not in the free "teaser" available on AWEA's website. One interesting comparison featured in both versions shows that wind accounted for 39% of new US power generation installed last year, having overtaken all other technologies except for gas-fired generation within the last few years. Coal was a distant third, and "other renewables" lagged far behind, despite the high profile of solar and geothermal energy. It's a toss-up whether wind can overtake gas on new additions anytime soon, considering that wellhead gas prices are running at less than half their level of just two years ago. That ultimately translates into lower levelized electricity costs for gas-fired generation, particularly compared with wind and other new sources vying for the non-baseload portion of the power market.

One of the other main uncertainties for wind energy concerns the financial and policy environment in which it operates, both of which were changed drastically by the recession and financial crisis. Prior to the demise of Lehman Brothers and the retrenchment of the entire banking sector, many wind projects relied on the "tax equity" market, in which the rights for future tax credit receipts were exchanged for prompt cash. The collapse of that market threatened to bring the US wind industry to a standstill, as developers without sufficient taxable earnings, or lacking the financial strength to wait to receive tax credits once their projects began to sell power, had few options for financing new activities. That would have left turbine manufacturers with unsold inventories or unacceptably risky receivables. Into this void stepped the US Congress with the Renewable Energy Grant program included in the stimulus bill. Importantly, in order to qualify for up-front cash grants in lieu of tax credits, a project must at least have begun construction by the end of 2010. After that, new wind projects would still be entitled to the Production Tax Credit (PTC), or to an Investment Tax Credit in lieu of the PTC, but as before the financial crisis they'd only receive it as a reduction to future income taxes. Two Senators introduced a bill last year to extend the grants for another two years, but it is apparently still in committee.

You might also recall that the grant program was the subject of considerable controversy, when it turned out that most of the money disbursed as of last fall had gone to non-US companies, mainly to purchase non-US wind energy equipment. A review of AWEA's 2009 statistics shows that the situation probably hasn't changed. Of the top 10 wind turbine companies in the US last year, only two, GE and Clipper--now part-owned by United Technologies--were notionally domestic. Together they accounted for just 45% of installations by capacity. The list of top wind farm owners also includes many foreign companies. Spain's Iberdrola and Germany's E.On, both of which were leading recipients of the Treasury grants last year, were in the top 5, along with the US subsidiary of EDP.

My intent here is not to vilify foreign wind companies, most of which either have US manufacturing or source significant portions of their wind turbine supply chains here, and without which US wind installations would slow dramatically. However, the structure of the US wind market does raise serious questions about who stands to gain the most from an extension of this temporary stimulus program beyond the end of 2010. That's especially pertinent, since most of the companies that now dominate US wind on both the manufacturing and project side--including GE, Siemens, UTC, Iberdrola, MidAmerican, and big utilities like FPL, Edison Mission, and Duke--are again strong enough financially to afford to wait for their tax credits as projects are completed and power is actually produced, as intended under the original PTC rules. Moreover, it's not clear how the tax equity market could be expected to revive fully--and the larger renewable energy financing business with it--as long as companies can continue to turn directly to the government for nearly a third of their project costs, paid up front.

This leaves the Congress and administration with two fundamental questions to address. First, as the economy recovers from the worst recession in decades, when do temporary measures to prevent the complete breakdown of business activity during a crisis begin to retard full recovery and become counter-productive? Second, how forcefully should the government be promoting wind power, which has developed into the most competitive and mainstream of our new energy sources? Fairly soon the focus should turn to how we might eventually wean wind power off subsidies, altogether, rather than continuing to accelerate them in a manner more appropriate to less well-developed, less-competitive energy sources. To complicate matters further, this must be considered against the backdrop of climate change policy, which is still very much in flux, and the growing debate over deficits. I'm sure we'll be hearing a lot more about this issue as the wind industry begins to consider projects that couldn't realistically start construction before the year-end deadline for the current grant program.

Friday, April 09, 2010

Delaware Refinery Swims Against the Tide

When I saw this headline in today's Wall St. Journal, "Governor Stays Closure of Delaware Refinery," the first thought that crossed my mind was of King Canute and his order to stop the tide. Valero Energy Corp., which owns the Delaware City refinery, had announced last fall that it would be shut down and dismantled. That was a pretty remarkable turn of events, considering that not very long ago refining margins were at all-time highs, boosting the fortunes of independent refiners like Valero and causing politicians and energy experts to despair that the US didn't have enough refinery capacity to keep pace with future demand. But while I understand the state government's desire to preserve the jobs and tax base involved, it's worth asking whether Governor Markell and the firm that appears ready to buy the refinery for $220 million are making a good bet or merely postponing the inevitable. The key fundamentals for this sort of refinery raise serious doubts.

More than 100 US refineries have closed in the last several decades, but few of those were as large or sophisticated as the Delaware City Plant (DCP), which was originally built by Getty Oil to process heavy oil from the Neutral Zone between Kuwait and Saudi Arabia. My former employer, Texaco, owned it for a while, as a result of its acquisition of Getty, before putting it into its refining and marketing joint venture with Saudi Refining Inc., which later included Shell. That JV sold DCP to Premcor, Inc., an independent refiner then run by the current CEO of PBF Energy Partners, LP, the company that is now buying it from Valero, which has owned it since its purchase of Premcor in 2005. The number of times it changed hands probably says more about the evolution of the US refining industry than about any inherent shortcomings of the facility, which is a complex machine for turning low quality crude into lots of gasoline and other valuable light products. Unfortunately, that description encapsulates the two biggest challenges its new owners, creditors and employees face.

Start with gasoline, which remains the most important product for most US refineries, accounting for about half of all US petroleum product sales and roughly 60% of refinery yield on crude oil input. Historically, US gasoline consumption rose by a steady 1-2% per year, and refineries often struggled to keep pace with demand, resulting in significant imports of gasoline and blending components. Two factors have altered that relationship, perhaps permanently. First, rapidly-increasing ethanol production, backed by subsidies and a steadily-escalating mandate, is eroding the market share of the gasoline that refiners make from crude oil. So now even when "gasoline" sales go up, they include an increasing proportion of ethanol. And as a result of the recession, total gasoline sales--including the ethanol blended in--fell by 3.2% between 2007 and 2008. When you factor out the ethanol, the drop was more than 5%. So because of weak demand and increasing ethanol use, refineries like DCP have experienced a shrinking market for their most important product.

Then there's the issue of refinery complexity, which is a two-edged sword. When both crude and product markets are tight, as they were in 2006 and 2007, complex refineries like DCP enjoy a cost advantage over less sophisticated competitors, because they can make the same products from cheaper, lower-quality crude oils--typically heavier and higher in sulfur and other contaminants. But when the global economy stalled in 2008 and oil demand plummeted, many of those low-quality crude streams were the first ones that producers cut back, because they yielded less profit at the well-head than lighter, sweeter crudes. With less supply, the discount for them relative to lighter crudes shrank, and with it the competitive edge of facilities like DCP. In the case of Saudi Heavy crude, it looks like that discount was cut in half starting in late 2008, which was probably the last time DCP made decent returns.

What must happen in order for DCP to become a viable proposition in the future, other than for PBF to buy the facility for a fraction of its replacement cost--even less than Premcor's Mr. O'Malley paid Motiva for it in 2004? Number one would be for light/heavy crude differentials to widen again. That could reasonably be expected to occur when the global economy grows by enough to bump up against OPEC's spare capacity limits, again. With spare capacity currently standing at more than 5 million barrels per day, that's unlikely to happen soon. However, even with a wide enough discount for its preferred crude supply, DCP will still be pushing gasoline into a weak market, thanks at least in part to continued expansion of ethanol. One indication of that comes from Valero's earnings report for the fourth quarter of last year, in which its ethanol business earned operating profits of $94 million, while its refining business, with more than 40 times the throughput, lost $226 million.

I would have been sorry to see the Delaware City Plant, with all its history, sold off for parts and scrap. After all, this is pretty much the kind of refinery that some were hoping the US would build, just a few years ago: large, complex, close to major markets and outside the hurricane belt of the Gulf Coast. However, the world changed in the interim. Will it change back enough to make DCP a going concern, again, or are the taxpayers of Delaware sinking more money into a facility that is destined to be a victim of Peak Demand, as more efficient cars and more prevalent biofuels squeeze enough petroleum products out of the market to ruin the economics of all but the most-efficient, lowest-cost refineries? We should know within a few years.

Wednesday, April 07, 2010

A Framework for Geoengineering

This week's Economist includes coverage of a recent meeting of scientists at Asilomar, in California, to discuss the ground rules for pursuing "geoengineering", the deliberate, large-scale modification of the earth's environment. The purpose of the geoengineering now under consideration is to limit or reverse the effects of climate change, presumably whether man-made or otherwise. This is a notion that provokes great anxiety or outright revulsion on the part of many who feel our only acceptable response to global warming is to return the planet to something approximating its pre-industrial state by eliminating the emissions and land-use changes that have accumulated over the last century or more. However, for those of us who doubt either the efficacy or achievability of such drastic changes in the economy and our lifestyles, geoengineering is at least a legitimate, complementary option along with mitigation, and potentially our last hope of averting a worst-case climate scenario, should one arise.

Anyone who is convinced of the dangers of global warming or climate change, whichever you prefer, implicitly accepts the potential of geoengineering, because anthropogenic global warming (AGW) ultimately amounts to an uncontrolled experiment in geoengineering on a global scale. The kinds of experiments proposed by researchers meeting at Asilomar--the site of other notable, long-view discussions in the past--would operate on a much smaller scale, at least initially, with the goal of either undoing or holding temporarily in abeyance the changes resulting from humanity's emissions of heat-trapping gases in excess of what the earth's massive natural GHG-recycling systems can absorb. For that matter, geoengineering might even be useful if it turned out that AGW was only one of several factors combining to shift conditions away from the benevolent state that has supported humanity's rise as the dominant species on the planet.

This is an issue that I've been following for a long time, though I haven't written about it very often here. My interest in geoengineering was piqued in the 1990s by proposals to sequester large quantities of CO2 in the oceans by stimulating plankton growth where there naturally wasn't much. That's only one of many possible approaches that fall into a broad family of carbon-removal strategies constituting one of the two main geoengineering categories The Economist considered. "Solar Radiation Management", the other category, includes strategies for reducing the amount of solar energy the earth receives or retains. That could range from putting large numbers of small particles into the upper atmosphere to orbiting giant mirrors that would deflect sunlight off into space. It might even be as simple as painting all rooftops white--a bit of a problem if they're all covered with dark solar panels.

The basic problem seems to be convincing everyone potentially affected--which of course might include everyone on earth, or at least their representatives--to trust researchers to keep the impact of their experiments strictly limited and under tight control. The session at Asilomar apparently endorsed a set of steps called the "Oxford Principles", which describe five key elements for gaining concurrence:

1. Geoengineering to be regulated as a public good.

2. Public participation in geoengineering decision-making.

3. Disclosure of geoengineering research and open publication of results.

4. Independent assessment of impacts.

5. Governance before deployment.

Now, these sound pretty good as a set of basic principles, particularly if your goal as a researcher, or as the institution or nation funding the research, is to get everyone onboard before you start. Among other things, that might avoid having someone turn up later to accuse you of making things worse, at least locally. Geoengineering liability is a serious concern at the individual and institutional level, and it could extend to being considered an act of war at the national level, if things turned out really badly. Unfortunately, when I consider how these principles might actually work--including stifling the involvement of for-profit companies in either the funding or actual R&D role--I believe they describe a likely path to doing nothing. Imagine having tried to get the delegates at Copenhagen to agree to let someone put finely-divided salt particles into the atmosphere over, say, the Arctic, to make clouds more reflective. Might as well have tried to sell them the Brooklyn Bridge at the same time.

That's the core of the problem as I see it: If we do end up needing to deploy geoengineering, it's likely to be precisely because we were unable to get every country on earth--or even just the small subset of large emitters--on the same page with regard to climate change, let alone establish a universally-trusted body to oversee their mitigation efforts. If we yoke geoengineering to the same UNFCCC/IPCC process that brought us the Copenhagen Climate Conference and the Kyoto Protocol, then we might as well forget it and try to figure out where to invest in the likely new beachfront property of the 2050s. In any case, as appealing as the Oxford Principles might seem from a stakeholder-engagement perspective for implementing large-scale geoengineering someday in the future, they look too unwieldy to guide the small-scale R&D efforts that would be needed to determine which, if any, of these schemes actually have merit.

One possible alternative would start with the same concept of climate forcing that underpins today's climate models. (And by the way, any serious geoengineering effort is going to require really good, trustworthy global and regional climate models, the inherent limitations of which are one of the main complaints of climate skeptics.) The observed increases in CO2 and other greenhouse gases equate to roughly an extra 2 watts per square meter of heat radiation retained by the earth, out of a total average influx of around 240 W/m2 at the earth's surface. So if 1% more radiation/retention is enough to cause the global warming we have observed, then what is the maximum equivalent level of geoengineering testing we'd be willing to tolerate to see whether any of these techniques might help? 0.01%, or 1/100th of the scale of the problem itself? And what would be the most any one experiment should be allowed to fiddle with? 0.0001%, or one part per million, allowing at least 100 small experiments under the overall limit? (For experiments dealing with carbon-removal, rather than radiation management, this forcing threshold could easily be converted to its tons-per-year of CO2 equivalent.) Whatever the level, the idea would be to keep any individual experiment, and all of them together, below the level at which they could make things noticeably worse by accident--with a healthy margin for error--without preventing any work from being done on this at all.
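
Expressed in absolute terms, here is what that sliding scale of limits would look like; all the inputs come from the paragraph above:

```python
# Sizing the proposed geoengineering experiment budget.
SURFACE_FLUX = 240  # average heat influx at the surface, W/m2
GHG_FORCING = 2.0   # extra retained radiation from GHGs, W/m2

print(f"Scale of the problem: {GHG_FORCING / SURFACE_FLUX:.2%} of influx")
total_cap = SURFACE_FLUX * 0.0001   # 0.01% overall testing budget
per_test = SURFACE_FLUX * 0.000001  # 0.0001%, one part per million
print(f"Overall testing cap: {total_cap:.3f} W/m2")               # 0.024
print(f"Per-experiment cap: {per_test:.5f} W/m2")                 # 0.00024
print(f"Experiments within the cap: {total_cap / per_test:.0f}")  # 100
```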

Some regard geoengineering as yet another outgrowth of our technological hubris and thus unworthy of further research. While I respect anyone's right to that view, I would also question their commitment to the survival of the human race. That's because I'm deeply skeptical that our current approach to climate change can work fast enough and on the necessary scale to avert the worst outcomes scientists suggest we face. We already live in a geoengineered world that couldn't support a fraction of its current population if we returned it all to its natural, pre-industrial state. That's not a license for unlimited tinkering with our environment, and perhaps that's the underlying concern: that the same techniques that might be applied to reduce the impact of climate change might eventually be employed in risky attempts to fine-tune an even more optimal climate than the one we inherited. Science is like that, as demonstrated by nuclear proliferation and questionable medical practices. But while I share those misgivings with respect to the potential misuse of geoengineering, I sure want us to have some of these options in our hip pocket if we ever really need them.

Monday, April 05, 2010

Mustangs and CAFE Standards

Over the weekend a review of Ford's new 6-cylinder Mustang in the Wall St. Journal included an interesting perspective on the contribution of stricter Corporate Average Fuel Economy (CAFE) standards to the production of a car that provides both better fuel economy and more horsepower than the preceding model, in the absence of market incentives like higher fuel prices or taxes. While I have some quibbles with the reviewer's interpretation of the sequence of events involved, he does clarify the choice we've made in pursuing vehicle efficiency gains through a mainly regulatory, rather than a more market-based route. That choice implicitly trades off obvious costs at the gas pump for hidden ones in the sticker prices of new cars, while providing nearly unlimited scope for tampering to promote specific, favored technologies, as exemplified in the joint EPA and Department of Transportation CAFE and tailpipe emissions rules that were finalized last week.

The review in question concerned the 2011 Mustang equipped with a Duratec V-6 engine developing 305 horsepower but still managing a respectable 31 highway miles per gallon, a 29% improvement over the current V-6 model and a nearly 35% improvement over the current base V-8 with which the performance of the new, more powerful six might reasonably be compared. With its 19 mpg in city driving, the effective overall 24 mpg of the new model hardly puts it into competition with efficiency leaders like the Prius or Ford's own 39 mpg Fusion hybrid, but then I'm not sure how much time the typical Mustang buyer would spend looking at such cars, even if they achieved 100 mpg. More importantly, the most cost-effective fuel savings--and thus reductions in both oil imports and greenhouse gas emissions--will for some time come from improving the fuel economy of ordinary, non-hybrid cars. Consider that the new Mustang will save the average driver 130 gallons of gasoline a year compared to the old one. Buying a hybrid Fusion instead of the regular 4-cylinder Fusion saves only 40 more gallons per year than that, though at an extra cost of at least $3,295 on the sticker price.
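
Here's a sketch that reproduces those savings figures; the 12,000 miles per year and the individual mpg ratings are my assumptions, tuned to be consistent with the numbers above:

```python
# Annual fuel savings: conventional redesign vs. hybrid premium.
MILES_PER_YEAR = 12000  # assumed annual mileage

def gallons(mpg):
    """Annual gasoline consumption at a given fuel economy."""
    return MILES_PER_YEAR / mpg

mustang_saving = gallons(19) - gallons(24)  # old vs. new V-6 Mustang
fusion_saving = gallons(25) - gallons(39)   # 4-cyl Fusion vs. hybrid
print(f"New Mustang saves ~{mustang_saving:.0f} gal/yr")   # ~132
print(f"Fusion hybrid saves ~{fusion_saving:.0f} gal/yr, "
      f"~{fusion_saving - mustang_saving:.0f} more")       # ~40 more
# The $3,295 hybrid premium thus buys only ~40 gal/yr beyond what a
# conventional redesign delivered
```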

It's debatable whether Ford would have produced a car like the 2011 V-6 Mustang without the tougher CAFE standards set by the US Congress in late 2007 and just finalized this April 1st. While the Wall St. Journal's new car reviewer sees clear cause-and-effect and wishes to "raise a cheer for government fuel economy regulations," I can't help wondering about the impact of gasoline price volatility during the product design cycle of this car. The last time I took a serious look at the subject, car companies spent three to four years creating a new model or major redesign of an existing model, tooling up to implement it, and then starting production. In 2007 US retail gasoline prices averaged $2.84/gallon and were coming off the first-ever summer in which monthly-average prices broke the $3.00 mark, on their way to $4.00 just a year later. I see as much causality in the arrival three years later of a 31 mpg Mustang as in the much less fortuitous arrival in 2007 and 2008 of various big SUVs and pickups that would have been designed in 2004-5, when gas prices averaged $1.89 and $2.31, respectively. Although I'm sure that the impending changes in CAFE standards influenced Ford's design department to develop products like the Fusion hybrid and the new Mustang, there's also good reason to suspect that Ford responded to changing fuel prices in much the same way that consumers did, albeit with an inherent lag of several years.

As long as it remains politically suicidal to take steps to increase fuel prices and provide consumers and carmakers with some certainty that they will remain high, we can't rely on a volatile fuel market to provide consistent signals favoring higher fuel economy. There are also solid arguments for holding down fuel taxes, unless their revenues are dedicated to improved highway maintenance or returned to taxpayers via rebates or breaks on other taxes. In the absence of higher gas taxes, however, the main policy levers available for reducing national fuel consumption are high taxes on gas guzzling cars, such as those levied on engine displacement in the UK and elsewhere in Europe, or the CAFE pathway the US has followed since the 1970s--and that unintentionally helped spawn the entire SUV fad through its infamous "SUV loophole."

In its latest incarnation CAFE treats SUVs less generously, but it still awards manufacturers credits for producing flexible-fuel vehicles capable of burning E85--vehicles consumers don't seem to want. Carmakers may count each such vehicle as though it used E85 half the time, even though 1% is more like it, and then count only the 15% gasoline content of the E85 consumed during that half. The new CAFE also treats plug-in electric vehicles as though they consumed no energy at all and somehow displaced two non-electric cars each. While the latter distortion might not turn out as badly as the SUV loophole, these rules--along with hefty EV subsidies for consumers--are certainly going to push carmakers toward making a smaller number of full EVs at the expense of a much larger number of non-plug-in hybrids, or even of modestly improved cars such as the new Mustang, which must have required a considerable investment in technology and production retooling. Stacking the deck in that manner looks like a very expensive way to reduce greenhouse gas emissions, compared to other options. I'd much rather have seen a simpler set of rules--spelled out in many fewer than 837 pages--that established the required mpg and emissions outcomes by year and left it to carmakers and consumers to work out how to achieve them.
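To see how potent that E85 accounting is, here is a sketch of the dual-fuel fuel-economy formula as I understand it--treat the details as my reading rather than a quotation of the rule. The car is credited as if it ran on E85 half the time, and only the 15% gasoline fraction of that E85 counts as fuel consumed:

```python
# Dual-fuel (flex-fuel) CAFE credit, as I read the rules; illustrative only.
def ffv_cafe_mpg(gasoline_mpg: float, e85_mpg: float) -> float:
    # Harmonic average of two assumed half-duty cycles. Dividing the E85
    # figure by 0.15 reflects counting only its gasoline content.
    return 1.0 / (0.5 / gasoline_mpg + 0.5 / (e85_mpg / 0.15))

# A hypothetical 25-mpg gasoline car that manages 18 mpg on E85 (ethanol
# carries less energy per gallon) would be credited at roughly:
print(f"{ffv_cafe_mpg(25.0, 18.0):.1f} mpg")  # ~41.4 mpg
```

In other words, offering a flex-fuel version of an ordinary 25 mpg car lets a carmaker book something like a 40-plus mpg vehicle for CAFE purposes, whether or not its owner ever buys a gallon of E85.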

It's easy to forget how much the fuel economy of comparable cars has improved during my lifetime. The Mustang review caught my eye because my first car was a used '65, a quintessential baby boomer car that defined its entire category. Yet even when driven conservatively, the best I could eke out of mine was about 14 mpg, and 12 wasn't an unusual result. You could run two of this year's model on the quantity of fuel my '65 consumed, in considerably greater comfort and with about 1% of the non-greenhouse emissions. How much of that improvement should be attributed to CAFE standards, how much to the general advance of technology over the intervening years, and how much to fuel prices finally surpassing the inflation-adjusted equivalent of the $0.60/gal. or so I was paying when I bought my first car?
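Both halves of that comparison are easy to check. A minimal sketch, assuming 12,000 miles a year per car and an approximate CPI multiplier of five between the early 1970s and today (both assumptions mine):

```python
# Two of this year's 24-mpg Mustangs vs. one 12-mpg '65, same annual miles.
miles = 12_000
old_gallons = miles / 12          # the '65: 1,000 gal/yr
new_gallons = 2 * (miles / 24)    # two 2011 models: also 1,000 gal/yr
print(old_gallons, new_gallons)   # 1000.0 1000.0

# Rough inflation check on early-1970s gasoline, using an assumed
# CPI multiplier of ~5: $0.60/gal then is about $3.00/gal now.
print(0.60 * 5)  # 3.0
```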

Thursday, April 01, 2010

Half Full and Half Empty?

Yesterday's announcement by President Obama that his administration would allow new offshore drilling on selected portions of the Outer Continental Shelf (OCS) that had formerly been off-limits yielded a variety of reactions. Energy industry leaders were cautiously optimistic, environmentalists were disappointed or "outraged", and the Washington Post's print-edition headline called it a "political maneuver." From my perspective, it constitutes a welcome concession to the reality that the day when renewable energy sources can pick up the entire load now carried by fossil fuels is a long way off--decades, not just years--and that until then we still have some important levers to pull in minimizing the amount of foreign oil we must import. Yet however it plays in the Congressional dance to devise a "comprehensive energy bill"--the current terminology for describing legislation regulating greenhouse gas emissions--it clearly falls short of what would be required to put the medium-term energy needs of the country on a truly secure footing.

On the positive side, yesterday's announcement sets the stage for oil producers finally to gain access to offshore acreage that had been off-limits for decades as a result of a combination of Congressional and Executive drilling moratoria. So while it does not strictly speaking open up these areas for drilling--that happened in 2008 when the previous bans expired or were lifted--the President made it clear that he will not reinstate a ban for the Atlantic coast south of New Jersey or for the Chukchi and Beaufort Seas off Alaska. If you are concerned about the energy security of this country and the enormous sums we pay to import oil from abroad, that is good news, even if it will take years to go through the process that Interior Secretary Salazar has outlined.

As usual, the traditional media has gauged the potential resources involved with little insight into how oil & gas are produced in the real world, comparing them to a few years of total US consumption. The subtext is clear: how much should we risk for a couple more years' supply of a depleting resource? The reality is quite different. Even at the 39 billion barrel low end of the recoverable oil estimates cited by Secretary Salazar, the new zones could eventually contribute several million bbl/day for a couple of decades. If ramped up quickly enough, that could overcome the underlying decline rate of current US output and add significant net production for a decade or two--precisely when competition for the oil we now import is likely to be fiercest: as Asia's growth continues and the domestic energy needs of exporting countries skyrocket, but before renewables, conservation and vehicle electrification can achieve their full impact.
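The underlying arithmetic is simple enough to sketch, assuming the low-end 39 billion barrel figure and steady output (a simplification, since real fields ramp up and decline):

```python
# How long 39 billion barrels lasts at assumed steady production rates.
RESERVES_BBL = 39e9  # low end of the range cited by Secretary Salazar

for rate_bpd in (2e6, 3e6, 5e6):  # barrels per day
    years = RESERVES_BBL / (rate_bpd * 365)
    print(f"{rate_bpd / 1e6:.0f} million bbl/day -> ~{years:.0f} years")
# 2 million bbl/day -> ~53 years
# 3 million bbl/day -> ~36 years
# 5 million bbl/day -> ~21 years
```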

Perspective is crucial in situations like this, so let's start with some figures already familiar to my regular readers. If 39-63 billion barrels of oil doesn't sound like much compared to the vast energy appetite of the US, which even in last year's recession-dampened economy consumed 18.7 million bbl/day of oil, or next to the enormous reserves of the Middle East, consider that cumulative US oil production stands at around 200 billion barrels, from reserves that at no point exceeded 39 billion barrels. If that sounds like a contradiction, it's because the industry has always found more oil and more ways to extract it than expected when the resources were first discovered. There is no reason to believe that won't still hold true, particularly for resource estimates based on technology that was current when PCs running Intel's 286 chip were cutting-edge and cellphones were scarce and looked like bricks.

It's also worth thinking about the prospect of an extra couple of million barrels per day of domestic oil in the context of how much renewable energy we'd have to produce to provide a similar quantity of energy. Wind turbines and solar panels don't even enter into this discussion, because they displace no meaningful quantity of oil: they produce electricity, and last year oil accounted for less than 1% of all the electricity generated in the US. On an energy-equivalent basis, each million barrels per day of additional oil production equates to the energy content of 27.9 billion gallons per year of ethanol, or more than 2.5 times last year's record US ethanol production. In terms of useful energy contributed, after accounting for the energy used to produce each fuel, that multiple grows to more like 5x: the equivalent benefit of more than 50 billion gallons per year of ethanol, or about half again the ultimate contribution of the entire 36 billion gallon federal Renewable Fuel Standard. And even if we threw away everything but the gasoline yield from this oil, it would still displace as much imported energy as 40 million plug-in electric vehicles--for which we'd still need to come up with an electricity source.
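The 27.9 billion gallon equivalence follows from standard heating values; here's the conversion, using roughly 5.8 million Btu per barrel of crude and 76,000 Btu per gallon of ethanol--typical figures, and my assumptions rather than numbers from any official source:

```python
# Energy equivalence: 1 million bbl/day of oil, expressed in ethanol terms.
BTU_PER_BBL_CRUDE = 5.8e6     # assumed typical heating value of crude
BTU_PER_GAL_ETHANOL = 76_000  # assumed lower heating value of ethanol

oil_btu_per_year = 1e6 * 365 * BTU_PER_BBL_CRUDE
ethanol_equiv_gal = oil_btu_per_year / BTU_PER_GAL_ETHANOL
print(f"~{ethanol_equiv_gal / 1e9:.1f} billion gal/yr of ethanol")  # ~27.9
```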

So if there's so much potential in the areas the President has offered up for drilling, why would anyone be disappointed or see this as a glass half empty? For starters, the plan imposes new drilling bans on the entire Pacific Coast and carves out of the eastern Gulf of Mexico some of its most prospective acreage, closer to the Florida coast, where large natural gas deposits have already been found. And of course it doesn't even mention the Arctic National Wildlife Refuge, which the USGS estimated to contain another 10 billion barrels, give or take a few billion. Simply put, outside of the Gulf of Mexico more acreage will again be placed off-limits than will be made available for drilling, and even the expansion into the eastern Gulf will require the approval of a Congress that has not looked favorably on drilling there since it imposed its own ban on that region in 2006. My disappointment at those limitations is mitigated by the knowledge that drilling in such places would be a political non-starter today; better to begin where state and local governments are willing, and some even eager. Closer to home for me, it appears that Secretary Salazar is postponing bidding on the Lease Sale 220 area off Virginia--which I blogged about a couple of weeks ago--from 2011 into 2012, holding up lease revenues my state badly needs to plug serious budget gaps. (Collecting them would also require Congressional approval of revenue-sharing for these bids and royalties, similar to what the Gulf Coast states currently enjoy.)

In his comments at Andrews Air Force Base President Obama made it clear that additional offshore drilling must be viewed in the context of a broader plan for addressing US energy needs. Yet because of the structure of our energy economy and the enormous relative impact of additional oil production compared to renewables at their current scale, only massive fuel economy improvements and conservation can contribute as much to reducing US oil imports, which even after last year's big drop still averaged 9.7 million bbl/day and cost approximately $210 billion. Opening up more of the OCS, which lies beyond visible range from the nation's shoreline, is a good step forward, and it is one that future administrations of both parties can build on.
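As a final consistency check on those import figures, dividing the annual bill by the barrels involved implies an average 2009 import price near $59 per barrel, consistent with that year's depressed crude market:

```python
# Implied average price of 2009 US oil imports.
imports_bpd = 9.7e6   # barrels per day
annual_cost = 210e9   # dollars
print(f"~${annual_cost / (imports_bpd * 365):.0f}/bbl")  # ~$59/bbl
```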