Physicists have known for decades that, in principle, a semiconductor device can emit more light power than it consumes electrically. Experiments published in Physical Review Letters finally demonstrate this in practice, though at a small scale.
The energy absorbed by an electron as it traverses a light-emitting diode is equal to its charge times the applied voltage. But if the electron produces light, the emitted photon energy, which is determined by the semiconductor band gap, can be much larger. Usually, however, most electrons create no photon, so the average light power is less than the electrical power consumed. Researchers aiming to increase the power efficiency have generally tried to boost the number of photons per electron. But Parthiban Santhanam and co-workers from the Massachusetts Institute of Technology in Cambridge took a gentler approach, achieving power enhancement even though less than one electron in a thousand produced a photon.
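The per-photon energy argument above can be sketched numerically. In this sketch the photon energy, quantum efficiency, and bias are illustrative assumptions, not the experiment's actual parameters: an electron crossing the diode absorbs energy qV, while an emitted photon carries roughly the band-gap energy, so even a tiny photons-per-electron ratio can give light power exceeding electrical power when qV is small enough.

```python
# Illustrative sketch (assumed, not measured, parameters):
# an electron traversing the LED absorbs energy q*V; an emitted
# photon carries roughly the band-gap energy E_ph. Even if only a
# small fraction eta_q of electrons produce a photon, the light
# power can exceed the electrical power when E_ph/(q*V) is large.

E_ph = 0.5      # assumed photon (band-gap) energy, in eV
eta_q = 5e-4    # assumed photons per electron (fewer than 1 in 1000)

def wall_plug_efficiency(V, E_ph=E_ph, eta_q=eta_q):
    """Light power out / electrical power in at bias V (volts).

    Per electron: energy in is q*V, average light energy out is
    eta_q * E_ph (both in eV per elementary charge), so the ratio
    is eta_q * E_ph / V.
    """
    return eta_q * E_ph / V

# At a 0.1 mV bias, efficiency exceeds unity despite eta_q << 1:
print(wall_plug_efficiency(1e-4))  # 2.5
```

With these assumed numbers, each electron absorbs only 10⁻⁴ eV but one in two thousand emits a 0.5 eV photon, so the device radiates 2.5 times the electrical power it draws.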
The researchers chose a light-emitting diode with a small band gap and applied voltages so small that it behaved like an ordinary resistor. With each halving of the voltage, they reduced the electrical power by a factor of four, even though the number of electrons, and thus the light power emitted, dropped by only a factor of two. Decreasing the input power to tens of picowatts, the team detected more light power than the device consumed electrically. The extra energy comes from lattice vibrations, so the device should cool slightly, as occurs in thermoelectric coolers.
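The voltage-halving argument follows from Ohmic behavior and can be checked in a few lines. This is a sketch under the stated assumption that current scales linearly with voltage at very low bias; the resistance and light-per-current constants are arbitrary placeholders, not values from the experiment:

```python
# Scaling sketch, assuming the diode behaves ohmically at low bias:
# current I = V/R, so electrical power P = I*V scales as V**2,
# while the photon emission rate (hence light power) scales as I,
# i.e. linearly in V.

def powers(V, R=1.0, light_per_amp=1.0):
    """Return (electrical power, relative light power) at bias V.

    R and light_per_amp are arbitrary illustrative constants; only
    the scaling with V matters here.
    """
    I = V / R
    return I * V, light_per_amp * I

p_el_1, p_light_1 = powers(1.0)
p_el_2, p_light_2 = powers(0.5)

print(p_el_1 / p_el_2)     # 4.0 -- halving V cuts electrical power fourfold
print(p_light_1 / p_light_2)  # 2.0 -- but light power only twofold
```

Each halving of the voltage therefore doubles the ratio of light power to electrical power, which is why driving the device at ever smaller bias eventually pushes that ratio past unity.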
These initial results provide too little light for most applications. However, heating the light emitters increases their output power and efficiency, so they behave like thermodynamic heat engines, but with the fast electrical control of modern semiconductor devices. – Don Monroe