I'm still catching up on articles I missed during my recent vacation. A pair of them from MIT's Technology Review caught my attention, because they seemed to contradict each other. In this month's briefing on low-carbon electricity technologies, TR concluded that while the cost of solar power has declined significantly, it remains too expensive to compete with electricity from fossil fuels. What's needed, they indicate, is better technology. And yet in another article earlier in August TR reported that industry experts at a recent symposium argued that current solar technology had already achieved the necessary cost reductions to compete with conventional energy, and would become more competitive as it scales up. If even MIT's signature technology journal can't agree who's right on this point, how in the world should policy makers decide whether the priority for solar should be further R&D or deployment of the technology we've already got?
There's little debate that solar power is one of the most promising energy options available to us, at least for eventually replacing much of the electricity we currently generate from fossil fuels. The basic science has been well understood for a long time, whether in photovoltaic cells that convert sunlight directly into electricity or in concentrating systems that use solar radiation to raise steam for power generation. Unlike energy technologies such as biofuels from cellulose or algae, we don't have to wonder whether we can ever harness solar at a useful scale. Notwithstanding the serious challenges of transmission, distribution and storage, we know that if we covered a modest fraction of the surface area of the US with solar panels or concentrators, they could generate as much electricity as we currently consume, in contrast to the 2/100ths of a percent that solar contributed last year. So while there's still plenty of room to improve the technology for turning sunlight into electricity, the main obstacle we encounter is cost. If solar isn't quite cheap enough today, could merely scaling up the existing technologies make it truly cost-competitive with power from coal and natural gas?
Answering that question is complicated by the way we typically compare different power generating technologies on the basis of their "capacity" costs--what it costs to manufacture and install or construct them. For many years the solar industry has pursued a goal of reducing the manufacturing cost of a solar module below $1 per peak Watt, which would roughly match the installed cost of a gas turbine power plant and come in at around half the cost of a coal-fired power plant. Last year a company called First Solar announced that it had reached that milestone. Unfortunately, module costs are only half the story. A solar installation requires more than the bare solar module, which converts sunlight into DC power; it also needs inverters, mounting hardware, wiring and installation labor. A recent study by the Lawrence Berkeley National Laboratory showed that in the last decade these non-module costs had declined from around $6 per Watt to just under $4. So even if we extrapolate that trend to $3/W, the installed cost of the industry-leading solar technology would still be around $4/W, and many of the utility-scale solar projects I've been reading about come in around $5-6/W, far higher than the installed cost of a natural gas turbine.
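Here's that installed-cost arithmetic as a quick Python sketch; the $3/W non-module figure is the optimistic extrapolation described above, not a reported cost:

```python
# Rough installed cost for utility-scale PV, in $ per peak Watt.
module_cost = 1.00      # $/W: First Solar's reported manufacturing milestone
non_module_cost = 3.00  # $/W: optimistic extrapolation of LBNL's $6 -> $4 trend
installed_cost = module_cost + non_module_cost

print(f"Best-case installed cost: ${installed_cost:.2f}/W")  # ~$4/W
print("Typical utility-scale projects today: ~$5-6/W")
```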
Of course, gas turbines have a big hidden cost, too, in the form of a perpetual fuel requirement. If you do the math, though, even with fuel costs included, a gas turbine runs at around half the cost of currently deployed utility-scale solar power. To calculate this, you must make an assumption about how many hours per day the turbine will operate. For an apples-to-apples comparison, I chose six hours, which is roughly the number of peak-sun-hours that a solar array would get in a prime location in the Southwest. At a conservative heat rate of 10,000 BTU/kWh, the comparable fuel consumption for each Watt over 20 years would be around 440,000 BTUs. That sounds like a lot, but at recent natural gas prices it would cost around $2. Add another buck for the capacity cost, and we're under $3/W on an undiscounted basis. (Assume that future gas prices inflate at the discount rate, and the NPV would match this figure.) Even adding a $20/ton charge for CO2 emissions would only bring that up by about $0.50/W, based on average emissions for gas-fired power plants. That ignores maintenance and other costs, but then I've also ignored solar array maintenance and the gradual deterioration of solar cell output.
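For anyone who wants to check my math, here it is as a short Python sketch. The gas price is an input of mine, set to roughly the value implied by the ~$2/W fuel figure above, and the CO2 factor is the standard ~117 lb per million BTU for burning natural gas:

```python
# Undiscounted 20-year cost per Watt for a gas turbine run ~6 hours/day,
# matching the peak-sun-hours of a good Southwest solar site.
hours_per_day = 6          # peak-sun-hour equivalent
years = 20
heat_rate = 10000          # BTU per kWh, a conservative assumption
gas_price = 4.55           # $/MMBTU, roughly implied by the ~$2/W fuel figure
co2_factor = 117 / 2000    # tons CO2 per MMBTU of gas burned (~117 lb)
co2_price = 20.0           # $/ton CO2

kwh_per_watt = hours_per_day * 365 * years / 1000  # 43.8 kWh over 20 years
fuel_mmbtu = kwh_per_watt * heat_rate / 1e6        # ~0.44 MMBTU (~440,000 BTU)

fuel_cost = fuel_mmbtu * gas_price                 # ~$2.00/W
capacity_cost = 1.00                               # ~$1/W installed turbine
co2_cost = fuel_mmbtu * co2_factor * co2_price     # ~$0.50/W

print(f"Fuel burned:  {fuel_mmbtu * 1e6:,.0f} BTU per Watt")
print(f"Gas turbine:  ${capacity_cost + fuel_cost:.2f}/W before carbon")
print(f"With $20/ton: ${capacity_cost + fuel_cost + co2_cost:.2f}/W")
print("Utility solar: ~$5-6/W installed")
```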
In this simple comparison, at least, it appears that today's best solar technology is still somewhat more expensive than the fossil-based power it's likely to be displacing in a typical power grid, while most of the solar arrays now being installed reflect costs at least 40% higher than gas turbines, even after accounting for fuel and CO2 emissions. I'm skeptical that simple economies of scale beyond those already achieved could deliver that kind of improvement any time soon. That might explain the necessity for a 30% federal tax credit or grant on solar installations, along with generous state-level incentives and renewable portfolio standards--mandates on utilities for a targeted level of renewable power. Absent these, much of today's solar activity would probably grind to a halt.
In this light, answering the question we started with requires defining the basis on which we expect solar power to compete in the future. If we're satisfied with needing to apply a combination of incentives and utility mandates more or less indefinitely, in order to achieve the desired level of solar power deployment, then the current technology and its incremental evolution might be perfectly adequate to the task. If, on the other hand, we'd prefer to see solar and other renewables weaned off these subsidies and able to compete on a truly level playing field with conventional energy sources--after adjusting for emissions at market prices--then it looks like a lot more R&D is called for.