Monday, July 19, 2010
Make only photons you are going to use
kw: analysis, light, technology
I had a memory freeze yesterday while using a wind-up flashlight of mine that uses LED's. I also have a hand-squeeze generator flashlight that uses a tiny incandescent bulb, but it has to be squeezed constantly to make any light. The LED one gets wound up for half a minute or a minute, and then it works for about ten minutes.
I am glad for at least one kind of technological progress that makes better use of energy. While searching "lumens per watt", I tracked down the luminous efficacy article on Wikipedia, which contained the figures I needed to trace historical progress and to speculate on the future.
First, a lumen is a measure of the effective brightness of a light source. Its definition includes the spectral sensitivity of the human eye, which is at a maximum at a wavelength of 555nm, a yellowish-green. An ideal source, one that produced only 555nm photons and made them with 100% efficiency, would have a luminous efficacy of 683 lumens per watt. By contrast, the now-obsolete 100 watt incandescent bulb that produces 1,400 lumens is producing 14 lumens per watt (lm/w), a total efficiency of just over 2%. Older office fluorescent tubes are about 2.5 times as efficient, around 35 lm/w, which is why the "standard" fluorescent tube has been the 40 watt size: a 40 watt tube puts out roughly the same 1,400 lumens as the 100 watt bulb. Newer fluorescent tubes and compact fluorescent lamps (CFL's) are 5-6 times as efficient as the 100W bulb.
But I prefer a different standard of efficiency. The most efficient lamp available so far is the high-pressure sodium arc lamp, at 150 lm/w, or 22% total efficiency. But its strong, narrow-band yellow spectrum gives a scene a very ugly look, which is why such lamps are used only for street lighting. The eye prefers broad-spectrum light, preferably full-spectrum white, that fills the 400-700nm visibility window. The most efficient possible "white", one containing only photons in the 400-700nm range, would yield 251 lumens per watt. That is 37% of the efficiency of a 555nm monochromatic source, but it would be a lot easier on the eyes. Let us treat this source as the 100% standard in the discussion that follows.
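To keep the arithmetic straight, here is a minimal sketch in Python, using the lm/w figures quoted above, that converts a lamp's luminous efficacy into a percentage of either standard: the 683 lm/w monochromatic ideal or the 251 lm/w broadband-white ideal.

```python
# Efficiency of a light source, expressed against two ideals:
#   683 lm/W -- a perfect source emitting only 555nm photons
#   251 lm/W -- a perfect "white" confined to the 400-700nm window
MONOCHROME_IDEAL = 683.0   # lumens per watt
WHITE_IDEAL = 251.0        # lumens per watt

def efficiency(lm_per_watt, standard=MONOCHROME_IDEAL):
    """Return efficiency as a percentage of the chosen standard."""
    return 100.0 * lm_per_watt / standard

# Figures quoted in the text
sources = {
    "100W incandescent": 14.0,
    "40W fluorescent tube": 35.0,
    "high-pressure sodium": 150.0,
    "ideal 400-700nm white": 251.0,
}

for name, lm_w in sources.items():
    print(f"{name:22s} {lm_w:6.1f} lm/W "
          f"= {efficiency(lm_w):5.1f}% of 683, "
          f"{efficiency(lm_w, WHITE_IDEAL):5.1f}% of 251")
```

Running it reproduces the figures used below: the 100W bulb at just over 2% of the monochromatic ideal and 5.6% of the white standard, the sodium lamp at 22% and about 60%, and so on.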
By this standard, the old 100W incandescent bulb is 5.6% efficient. Although about 7% of the energy goes into visible photons, most of those are red, orange and some yellow, with very little green and blue, so the overall efficiency is less than if the bulb somehow produced an "equal energy white" spectrum. Incandescent technology got a small boost from the development of halogen bulbs, which are filled not with vacuum (hmmm, kind of an oxymoron, that) but with argon and a little iodine. This allows the lamp to have a useful life while burning a little hotter, and such bulbs reach 19 lm/w, an efficiency of 7.6%. That is almost 1.4 times as efficient as a plain 100W bulb. Now for some history.
The first light-producing technology was the campfire, soon followed by the candle. Both kinds of light source, converting BTU's to watts, produce about 0.3 lm/w, an efficiency well under 0.1%. Two discoveries improved upon this after the year 1800. First, the discovery of acetylene in the 1830s was followed within a generation by the development of gaslight for houses, which is three times as efficient as a candle or kerosene lamp. Then the gas mantle was developed in the 1880s, which is twice as efficient again. Until Edison came along with the first carbon filament bulb (marginally more efficient), that was it. Tungsten came into use for filaments early in the twentieth century, and the "early modern" incandescent bulb was on its way. Though fluorescent tubes came into office use from the late 1930s, they never became popular in homes because of their tendency to flicker. We just lived with lamps that wasted 94+% of their power.
CFL's of various shapes have about a 20-year history, in practical terms; it was about 1990 that they burgeoned into widespread use. They range from 4 to 5 times as efficient as 100W incandescents. I have very few tungsten bulbs left in my house.
It is six years since I bought my first LED flashlight. It is a police-sized model powered by four D cells, with 15 LED's. It is brighter than my old 2W flashlight, and the same four cells are still in it. The LED's are interesting. "White" LED's use a blue LED plus a yellow phosphor that is efficiently excited by the blue wavelength. The yellow emission, moderately broadband across the green and red, mixes with the remaining blue to produce a blue-white light. Newer screw-in LED bulbs in sizes from 4W to 8W, at prices of $70 or so, use the same technology.
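The energy cost of that phosphor conversion can be estimated from photon energies alone, since a photon's energy is inversely proportional to its wavelength. The specific wavelengths below (a roughly 450nm blue pump and phosphor emission centered near 570nm) are typical values I am assuming for illustration, not figures from any particular lamp.

```python
# Rough Stokes-shift estimate: when a blue photon is absorbed and
# re-emitted at a longer wavelength, the energy retained is just
# lambda_blue / lambda_emitted (since E = hc / lambda).
BLUE_NM = 450.0      # assumed blue LED pump wavelength
PHOSPHOR_NM = 570.0  # assumed center of the yellow phosphor emission

retained = BLUE_NM / PHOSPHOR_NM
print(f"Energy retained per converted photon: {retained:.0%}")
print(f"Energy lost to the Stokes shift:      {1 - retained:.0%}")
```

The Stokes shift alone costs roughly a fifth of the converted energy; imperfect phosphor conversion and losses inside the package push the total toward the larger figure mentioned next.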
There is a built-in inefficiency here, which I hope someone addresses. The conversion of blue to red and green entails a loss of about half the energy in the original blue light. I'd like to see an LED source that contained five or six LED's (or a multiple thereof), producing five or six wavelengths spread through the 400-700nm range. In other words, don't convert any photons; produce only photons you are going to use. I suspect a nicely white lamp built this way would have nearly twice the efficiency of current LED's, perhaps 120-140 lm/w. That is in the range of 10x as efficient as a 100W bulb. How 'bout that? Light up a room using only 8-10 watts!
Then there is a further refinement. Just as most of the light from the 100W bulb is red and orange, the output of an LED source could also be skewed, but with its peak in the yellow-green instead of in the red. Such a "modulated" source might approach or exceed 150 lm/w, an efficiency of 60% or better by the white standard above. That is probably close to the ultimate that can be achieved for a white-looking light. I can hardly wait.
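As a sanity check on those guesses, here is a sketch that scores a hypothetical multi-wavelength LED mix against the eye's standard photopic sensitivity curve. The five wavelengths, the power weights, and the 40% electrical ("wall-plug") efficiency are assumptions of mine for illustration, not figures from any real lamp.

```python
# Luminous efficacy of a hypothetical "make only the photons you use" lamp.
# V: CIE photopic sensitivity at the chosen wavelengths (peak = 1.0 at 555nm).
V = {450: 0.038, 510: 0.503, 555: 1.000, 600: 0.631, 650: 0.107}

def efficacy_lm_per_optical_watt(weights):
    """Lumens per optical watt for a spectrum made of discrete lines.

    weights: {wavelength_nm: relative optical power}
    """
    total = sum(weights.values())
    return 683.0 * sum(p * V[nm] for nm, p in weights.items()) / total

# Equal optical power at five wavelengths spread through 400-700nm
equal_mix = {nm: 1.0 for nm in V}
# The same lines, weighted toward the yellow-green where the eye is most sensitive
green_weighted = {450: 0.5, 510: 1.0, 555: 2.0, 600: 1.0, 650: 0.5}

WALL_PLUG = 0.40  # assumed electrical-to-optical efficiency of the LED's

for label, mix in [("equal mix", equal_mix), ("green-weighted", green_weighted)]:
    optical = efficacy_lm_per_optical_watt(mix)
    print(f"{label:15s} {optical:5.0f} lm per optical watt, "
          f"about {optical * WALL_PLUG:5.0f} lm/w at the plug")
```

With these made-up numbers, the equal mix lands right in the 120-140 lm/w range guessed above, and weighting toward the yellow-green pushes past 150 lm/w, which is exactly the kind of gain the "modulated" source is after.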