Last week, while most Americans were preparing for Thanksgiving, many of the world's leading countries were signing a long-awaited international agreement for an advanced nuclear fusion research program, aimed at producing a commercial fusion reactor within the next half-century. In describing this announcement, the Financial Times cited concerns from environmentalists that the dream of clean (essentially carbon-free, low-radiation) energy from fusion could distract us from the need to address the greenhouse gas emissions from conventional power plants. I suppose I can understand the logic that worries them: climate change is a long-term problem, and fusion is a long-term energy solution, so they might superficially seem well matched. However, that notion doesn't withstand scrutiny, either as a potential advantage of fusion or as a drawback of funding this research.
Despite decades of research, we're still a long way from replicating the proton-proton fusion reaction that powers the sun. Fusion researchers have instead focused on two easier reactions involving heavier isotopes of hydrogen: the deuterium-deuterium (D-D) reaction and the deuterium-tritium (D-T) reaction. The facility formerly known as the International Thermonuclear Experimental Reactor, which still bears the resulting ITER acronym, will employ the latter. The D-T reaction requires less input energy to ignite, but it still emits neutrons. So although the radiation from ITER will be much lower than from a conventional fission power plant, it won't be zero, and that will complicate the eventual permitting of any reactor designs that come out of this effort.
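For readers who want the specifics, the standard reaction equations make the neutron issue explicit. These are textbook figures, not anything unique to the ITER design:

```latex
% Deuterium-tritium fusion: the easiest reaction to ignite, but most of its
% 17.6 MeV yield leaves as a fast neutron that activates the reactor walls
D + T \;\rightarrow\; {}^{4}\mathrm{He}\ (3.5\ \mathrm{MeV}) \;+\; n\ (14.1\ \mathrm{MeV})

% Deuterium-deuterium fusion: two branches, roughly equally probable;
% one branch also produces a neutron
D + D \;\rightarrow\; {}^{3}\mathrm{He}\ (0.82\ \mathrm{MeV}) \;+\; n\ (2.45\ \mathrm{MeV})
D + D \;\rightarrow\; T\ (1.01\ \mathrm{MeV}) \;+\; p\ (3.02\ \mathrm{MeV})
```

Note that in the D-T reaction the neutron carries roughly 80% of the energy released, which is why even a "clean" fusion plant produces some activated material and can't claim zero radiation.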
Since at least the late 1960s, fusion power has always seemed thirty or forty years away. Producing commercial power from fusion is highly challenging because it requires solving fundamental problems of science and engineering, not just working the bugs out of existing technology. The nature of these problems makes the timing of a breakthrough impossible to predict and renders the subsequent development timeline highly uncertain. But even if we assume that a commercial fusion plant design will be available by 2045, it might take a further 30-40 years for fusion to displace all coal-fired power plants and their associated emissions. That's hardly a panacea for global warming.
To make matters worse, climate change involves a high degree of inertia. The carbon dioxide going into the atmosphere today will stay there for a century or more. The recent report from the UK government suggested that atmospheric CO2 concentrations are likely to rise to 550 ppm--double their pre-industrial level--by as early as 2035, without drastic action to mitigate emissions. (While I recently expressed some concerns about the Stern Report's longer-term conclusions, this prediction seems pretty solid.) That means that before nuclear fusion can even become a viable energy option to compete with solar power, wind power, or fission--let alone coal--we will already have emitted enough greenhouse gases to cause serious problems for our great-great-grandchildren. In other words, if we're going to solve our climate problem, we must be well on the way before fusion can become practical.
So why bother with fusion at all? Well, regardless of your views on Peak Oil, fossil fuels are ultimately finite and come with a lot of environmental baggage. While renewable energy sources such as biofuels, wind, and solar have tremendous potential, I don't see them replacing the entire energy pie, unless it's a much smaller pie than today's--not just per capita, but in absolute terms. Fusion and space-based solar power are our two most promising long-term, large-scale energy options that don't run into significant constraints of land use, water availability, soil depletion, or aesthetics. Given its very long gestation period and slow progress to date, pursuing fusion power ought to be as high a priority as possible, irrespective of our concerns about climate change.