Ever since reading about a nifty invention a couple of weeks ago in MIT's Technology Review, I've been thinking about its implications. An MIT physicist has come up with a practical way to recharge laptops, cellphones, digital cameras, etc. without plugging them in. Wireless is certainly all the rage in communications, so why not for power transmission, at least for some applications? After all, carrying around all those plugs and adapters is inconvenient, and inadvertently walking out of the house with an uncharged device is annoying or worse. There's no free lunch, however, and this piece of progress comes at a fairly hefty cost, with energy losses of 50% or more. As much as I love the idea of wireless power, I have a hard time justifying the price of this extra convenience at a time when we are trying to save energy, not waste it.
As the TR article suggests, the basic concept isn't new. Scientists have talked about broadcasting power ever since Tesla, and long-distance microwave power transmission is central to the possibility of space solar power. For that matter, there are already industrial applications of a similar concept. But Dr. Soljačić's approach to inductive recharging seems novel, with the power transferred by magnetic fields over a distance of a few meters, interacting only with devices attuned to receive it. Ignoring the extensive product safety testing that would have to take place before anything like this could be installed in homes or businesses--if cellphones emitting milliwatts concern us, immersing ourselves in a soup of tens or hundreds of watts ought to have us jumping out of our skins--we need to ask ourselves about the concrete benefits this technology could bring.
If all that wireless power offers is to replace something we can already do perfectly well with wires, is it worth the energy loss to gain a little extra flexibility? With energy efficiency experts campaigning to stamp out "vampires"--those devices in our homes and offices that are on even when they seem to be off--how can it make sense to add another 50% loss to the end of the transmission chain, cutting end-to-end energy efficiencies in half? Perhaps an analogy to the early days of the internet is appropriate. Some of the first applications of the web also took things we could do perfectly well in other media and simply adapted them. The real value of the web has come from applications that wouldn't have been possible otherwise, especially involving interactivity and immediacy.
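The halving isn't hyperbole; efficiencies along a power chain multiply, so bolting a 50%-efficient link onto the end of any chain cuts the whole chain's efficiency in half. A minimal sketch, using purely illustrative stage efficiencies I've assumed for the wired baseline:

```python
# Illustrative sketch: efficiencies along a power chain multiply.
# The stage numbers below are assumptions for illustration, not measurements.

def end_to_end_efficiency(*stage_efficiencies):
    """Multiply per-stage efficiencies to get overall efficiency."""
    total = 1.0
    for eff in stage_efficiencies:
        total *= eff
    return total

# Hypothetical wired baseline: grid transmission (~93%),
# AC adapter (~85%), battery charging (~90%).
wired = end_to_end_efficiency(0.93, 0.85, 0.90)

# Appending a 50%-efficient wireless link halves the result.
wireless = end_to_end_efficiency(0.93, 0.85, 0.90, 0.50)

print(f"wired: {wired:.0%}, wireless: {wireless:.0%}")
# -> wired: 71%, wireless: 36%
```

Whatever the real stage numbers turn out to be, the multiplication is the point: the wireless chain always ends up at exactly half the wired one.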
What could wireless power enable us to do that we can't do now? Most of the applications that occur to me off-hand are still in the realm of science fiction, such as "smart dust" surveillance/communications networks, or very large ultra-thin displays. On a larger scale, wireless power might overcome the range limitations that have prevented battery cars from going mass-market. That could take the idea of the plug-in hybrid car one step farther, by providing hundreds of miles of gasoline-free driving range on special highways offering inductive recharging.
One of the biggest challenges with this sort of technology is that the truly unique, compelling applications for it typically don't turn up until after the enabling technology is in place. That makes it hard to justify investing in the initial innovation, running the risk that there might never be such a killer app. In that case we could end up wasting megawatts of electricity and generating megatons of extra greenhouse gas emissions just to eliminate a few pesky power cords. I'd love to see what some really creative folks could do with this idea.