Researchers at North Carolina State University have discovered a way to crack hydrogen from water using heat that requires about half the energy of previous methods. The reason? Defective carbon nanotubes. The research is to be published in tomorrow's Physical Review Letters.
The team, led by physicist Dr. Marco Buongiorno-Nardelli, found that naturally occurring defects in carbon nanotubes could increase the rate of certain chemical reactions because the atoms forming the tubes are essentially "incomplete," making them more reactive.
Because of this, a temperature-based method of cracking hydrogen from water, which normally requires heating the water to 2,000°C, can take place at significantly lower temperatures.
“We studied water for many months and ran many different calculations, and we ended up showing that if you want to break a water molecule, you spend a lot less energy if you do it on this defective carbon material than if you do it by simply heating the molecule until it breaks,” Buongiorno-Nardelli said. “You can reduce the energy necessary by a factor of two – you can do it at less than 1,000 degrees.”
As is typical for such breakthroughs, what's demonstrated in the lab may never make it to real-world use. The Buongiorno-Nardelli method, although it requires less energy, cannot yet be carried out in commercially viable quantities. The next step for the NCSU team is to collaborate with engineers working on nanoscale devices to design and build "nanoscale reactors" that would allow cost-effective hydrogen production. In principle, however, this method would make it possible to crack water efficiently without needing extreme temperatures; high-temperature hydrogen extraction is sometimes cited as a justification for building more nuclear reactors.
Okay, all together: "Oh carbon nanotube, is there nothing you cannot do?"
(Via Green Car Congress)
Comments (4)
"In principle, however, this method would make it possible to crack water efficiently without needing the extreme temperatures"
Uh... Jamais... 1,000°C seems pretty extreme as far as temperatures go. I think, as you said in the beginning, it's better because of the lesser energy needed to do it. I may be wrong; is there anything substantially different between the 2,000°C and 1,000°C temps? For example, the equipment used to do it?
Posted by erik ehlert | September 30, 2005 2:49 PM
Yes. 2,000°C is most readily achieved using nuclear reactors; 1,000°C is within the capabilities of other kinds of heat sources (e.g., some fuel cells get up to the 800°C-900°C level during operation).
Posted by Jamais Cascio | September 30, 2005 3:00 PM
Let's put it this way... the difference in cost of the equipment is about the same as the difference between an M1 main battle tank and a Schwinn.
Posted by wintermane | September 30, 2005 3:19 PM
Did they demonstrate the effect in an experiment, or is this just a computer-modeling "result"?
Posted by Sergei | October 1, 2005 9:02 AM