When I want to know what value of current-limiting resistor to use with an LED, the easiest thing to do is assume a 5V supply / 15-20mA / old-school LED / 220-330Ω resistor, like we all learned from Forrest Mims in the 70s and 80s.
And then I tweak the resistor until the LED is as bright as I like, by trial and error.
Oh, most of the time I use my LED tester to estimate the current I need for the brightness I like. At $6-10, it’s indispensable. It also double-checks anode/cathode pinning and tells me if an LED is burned out.
But the tester runs on a 7.2V supply (a NiMH “9V” battery) with fixed current-limiting resistors sized for a 9V supply and low-forward-drop LEDs, so it’s not really representative of what happens when I put a newer-chemistry LED with a 3.5-4V drop into my 5V circuit. So it’s back to trial and error.
I want a microcontroller-based LED tester/calculator that:
- has a socket for the LED
- has controls to set the target-circuit supply voltage
- has a current knob to turn until the LED is at a brightness I like
- displays on an LCD the target circuit supply voltage, the LED voltage drop, the LED current, and the value of resistor to achieve that current with that supply voltage
(That is, all the facts about the LED I’m testing, and the resistor value I need to use in the target circuit)
- is about the size of my existing LED tester
Note that the tester need not be running at the target circuit’s voltage; we can calculate resistor values for arbitrary voltages once we know the desired LED current. It does need to have a higher supply voltage than the forward drop of new-fangled LEDs, and I’m inclined to run it on a 9V for compactness and simplicity.
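The calculation the microcontroller would do is just Ohm's law across the resistor: R = (Vsupply − Vf) / I, then rounding to a standard value you can actually buy. Here's a minimal sketch of that math in Python (the actual firmware would likely be C, and the function names here are just my own placeholders), snapping to the E24 series:

```python
import math

# E24 standard-value mantissas (5% resistors)
E24 = [1.0, 1.1, 1.2, 1.3, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.7, 3.0,
       3.3, 3.6, 3.9, 4.3, 4.7, 5.1, 5.6, 6.2, 6.8, 7.5, 8.2, 9.1]

def resistor_for_led(v_supply, v_forward, i_led_ma):
    """Exact resistance: R = (Vsupply - Vf) / I."""
    if v_forward >= v_supply:
        raise ValueError("supply must exceed the LED's forward drop")
    return (v_supply - v_forward) / (i_led_ma / 1000.0)

def nearest_e24(r):
    """Snap an exact resistance to the nearest E24 standard value."""
    exp = math.floor(math.log10(r))
    candidates = [m * 10 ** e for e in (exp, exp + 1) for m in E24]
    return min(candidates, key=lambda c: abs(c - r))

# Example: 5V target circuit, 3.6V white LED, 15mA
# -> (5 - 3.6) / 0.015 ≈ 93.3 ohms exact, 91 ohms nearest E24
```

Since the knob sets the current and the ADC measures the forward drop directly, the tester never needs a datasheet Vf; it computes the resistor from what the actual LED in the socket is doing.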
I’ve been kicking this around for quite a while, and I’d really like to make one. And publish the plans and code for DIYers and make circuit boards and kits for kitbuilders.
Is that the right feature set? I personally never put LEDs in series, but should the tester be able to calculate for that anyway? What else?
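If series support did make the cut, the math barely changes: forward drops add, so the resistor becomes R = (Vsupply − n·Vf) / I. A hedged sketch of that hypothetical extension (assuming identical LEDs in the string, which is the usual case):

```python
def resistor_for_series_leds(v_supply, v_forward, n, i_led_ma):
    """Series string of n identical LEDs: drops add, current is shared,
    so R = (Vsupply - n * Vf) / I. Hypothetical extension -- the tester
    as described measures a single LED in the socket."""
    total_drop = n * v_forward
    if total_drop >= v_supply:
        raise ValueError("supply too low for %d LEDs in series" % n)
    return (v_supply - total_drop) / (i_led_ma / 1000.0)

# Example: two 2.0V red LEDs on a 9V supply at 15mA
# -> (9 - 4) / 0.015 ≈ 333 ohms
```

The UI cost would be one more control (a count of LEDs in series), so it might be worth including even for people who, like me, rarely wire LEDs that way.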