A flickering light will appear less bright than a steady one, because
the eye averages over the on and off cycles. There is an opposing
theory that this averaging is not perfectly linear (i.e., the on-cycle
is not given the same weighting as the off-cycle), but for most
practical purposes we can take it to be linear. So at 50% duty cycle
you get 50% brightness, 50% power consumption and 50% thermal heating.
Everything scales by the same ratio. You don't gain anything.
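A quick numeric sketch of that linear scaling (the 1 A peak current and
3 V forward voltage below are just assumed example values):

    # Average brightness, current and power all scale with duty cycle.
    PEAK_CURRENT_A = 1.0   # assumed peak LED current during the on-cycle
    FORWARD_V = 3.0        # assumed LED forward voltage

    for duty in (1.0, 0.5, 0.25):
        avg_current = duty * PEAK_CURRENT_A   # what the eye (and the heat) average to
        avg_power = avg_current * FORWARD_V   # average dissipation
        print(f"duty {duty:.0%}: {avg_current:.2f} A avg, "
              f"{avg_power:.2f} W avg, ~{duty:.0%} brightness")

At every duty cycle the three numbers move together, which is the
"you don't gain anything" point above.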
In highly nonlinear devices, where the light-vs-current curve is not a
straight line, such as laser diodes, PWM can actually increase the
brightness. But LEDs are fairly linear devices, so this effect does not
come into play.
In addition, pulsing a high-current LED may introduce other problems:
1 Amp switching on and off will generate all kinds of electrical noise
you may not want in an aircraft.
wrote:
Hi Andrew,
The way I understand it is that you get the same brightness with PWM,
but in bursts. You're still pulsing at 500 mA, which gives the same
brightness. The eye/brain won't see the pulses; the LED is only on "X"
percent of full duty but appears to be constant.
Kind of like a fluorescent light, which flashes at 60 Hz.
Are you increasing efficiency? Maybe not, but the duty cycle of the LED
is less, which may allow the use of a heat sink without a fan.
Is my understanding correct?
It's a very nice arrangement anyway!
Thanks for the ideas.
Andrew Sarangan wrote:
No, it won't. PWM can be used as a dimmer, but it does not change the
LED efficiency. Running an LED at 1 Amp half-duty PWM is the same as
running DC at 500 mA. Your eye integrates the signal. The only time
PWM can increase efficiency is in nonlinear devices, and an LED's
power-out vs. current-in curve is very linear. PWM is used where a
digital signal controls the LED brightness: it is often easier to turn
the LED on and off at a fixed current than to change the current.
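(A minimal sketch of that last point, in Python: software PWM that only
ever switches the LED fully on or fully off at its fixed drive current.
The set_led() function is a hypothetical stand-in for a real GPIO call,
and the 500 Hz period is just an assumed value fast enough for the eye
to average.)

    import time

    PERIOD_S = 0.002   # assumed 500 Hz PWM period

    def set_led(on):
        # Hypothetical placeholder for a real GPIO/driver call; the
        # driver supplies the same fixed current whenever the LED is on.
        pass

    def pwm_dim(duty, cycles):
        # Dim by varying on-time only, never by varying the current.
        for _ in range(cycles):
            set_led(True)
            time.sleep(duty * PERIOD_S)          # on for 'duty' fraction of the period
            set_led(False)
            time.sleep((1 - duty) * PERIOD_S)    # off for the remainder

    pwm_dim(0.5, 500)   # ~1 second at 50% duty: looks like half brightness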
wrote:
Won't pulse modulation take care of the heat problem?