Resistor calculation for 1.5V LEDs

Discussion in 'FAQs' started by bigdonnie, Dec 13, 2004.

  1. bigdonnie

    bigdonnie Member

    I bought a Radio Shack switchable DC power supply that (supposedly) can put out 1.5, 3, ... up to 12 VDC. I bought it to supply 1.5 volt Miniatronics LEDs, but unfortunately it puts out 3.5 volts instead of the 1.5 (whatever happened to the time when you could count on Radio Shack :mad: ).

    I need to calculate what resistance I need to add to limit the voltage getting to the LED to 1.5. I found a web site that has the following formula for doing this calculation:

    Resistance = (Supply voltage - Voltage drop across LED) / Desired Current
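
    That formula can be sketched in a few lines of Python (the 20 mA figure below is just an assumed typical LED current, not something from the post):

```python
# Series-resistor formula: R = (supply voltage - LED voltage drop) / current
def led_resistor(supply_v, led_drop_v, current_a):
    """Return the series resistance in ohms (current given in amps)."""
    residual_v = supply_v - led_drop_v
    if residual_v <= 0:
        raise ValueError("supply must exceed the LED forward voltage")
    return residual_v / current_a

# Example: 3.5 V supply, 1.5 V LED, assumed 20 mA (0.020 A) current
print(led_resistor(3.5, 1.5, 0.020))  # -> 100.0 ohms
```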

    Here is my question to those of you who are more electrically inclined than I --- what is the typical voltage drop across a white LED? The same web site I found for the formula says 3.6 volts, but this would give me a negative value.

    Any assistance would be greatly appreciated.

  2. Nscalemodeler

    Nscalemodeler Member

    You can use this link; I think it is a little better than what you may have found. Scroll to the section labeled "Current Limiting" --- it is about halfway down the page.

    The calculation that you are looking for is how many ohms the resistor should be to make up the difference between the LED voltage and the power supply voltage.

    The voltage that you said is used by the LED is 1.5v. The power supply provides 3.5v, so there is 2v of residual voltage that needs to be dropped across the resistor. You need to know the current draw of the LEDs in order to finish the calculation.

    Resistor Value = (residual voltage/current flow)


    Resistor Value = (2v/LED Current Flow)

    If you plug in the LED current flow in amps, you should get the resistor value in ohms. If you plug in the LED current flow in milliamps, which is probably how the LED is rated, then you will get the resistor value in kilo-ohms.
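
    That unit shortcut checks out numerically --- a tiny sketch, using the 2 V residual from above and an assumed 20 mA LED rating:

```python
# Unit shortcut: volts divided by milliamps gives kilo-ohms directly.
residual_v = 2.0      # 3.5 V supply minus the 1.5 V used by the LED
current_ma = 20.0     # assumed LED current rating, in milliamps
r_kohm = residual_v / current_ma   # value in kilo-ohms
r_ohm = r_kohm * 1000.0            # same value in plain ohms
print(r_kohm, r_ohm)
```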

    Then all you need to do is find a resistor with the closest standard value above the one you calculated, and a power (wattage) rating higher than what the LED current will dissipate in it, so as to not burn out the resistor.
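
    A quick way to check that the resistor won't burn out is to compute its power dissipation with P = I^2 * R --- a sketch, assuming the 20 mA and 100 ohm figures discussed in this thread:

```python
# Resistor power dissipation: P = I^2 * R
current_a = 0.020        # assumed 20 mA LED current
resistance_ohm = 100.0   # resistor value from the calculation above
power_w = current_a ** 2 * resistance_ohm
print(power_w)  # about 0.04 W, well under a common 1/4 W resistor's rating
```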

    Let me know if this doesn't make sense.:wave:
  3. Fred_M

    Fred_M Guest

    First, the power pack you purchased is a simple power pack, rated at 1.5 volts at a certain current draw, say 300 mA. Unless the load draws exactly 300 mA, it will supply a higher voltage (under 300 mA) or a lower voltage (over 300 mA). That's because it uses resistors rather than being a switching power pack, which is what you really wanted. Nothing wrong with the thing you purchased, just the wrong device for the application. Anyway, here's an online calculator for what you want, and if I punch in a 3 volt supply, 1.5 V LED forward voltage, and 20 mA current, it says 82 ohms. Fred
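
    That "computed 75 ohms, pick 82" step is the calculator rounding up to the next standard resistor value --- a sketch, assuming the common E12 series of preferred values:

```python
import math

# E12 series of standard resistor mantissas (one decade's worth)
E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def next_e12(ohms):
    """Smallest standard E12 resistor value >= the calculated ohms."""
    decade = 10.0 ** math.floor(math.log10(ohms))
    mantissa = ohms / decade
    for m in E12:
        if m >= mantissa - 1e-9:
            return m * decade
    return E12[0] * decade * 10.0  # roll over into the next decade up

# (3.0 - 1.5) V / 0.020 A = 75 ohms -> next standard value up is 82 ohms
print(next_e12(75.0))
```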
  4. N Gauger

    N Gauger 1:20.3 Train Addict

    Did you take that voltage reading right off the power supply, without anything attached to it? I was always taught to take a voltage reading across a resistor.

    A power supply will always give "fake" readings when not attached to anything..... (Unlike a "real" power supply, those plug-in adapters are really hard to get a good reading on.)

    I tried the same thing you are doing - I don't remember what size resistors I used - but I do remember it was a fairly huge resistance... I'll look when I get home :) :)
  5. bigdonnie

    bigdonnie Member

    Thanks for the feedback and the suggested web sites --- I always run across things I'm not certain about when I'm working with electricity/electronics (There was a good reason I only got 53% in electric circuits in university! :D ) --- now I've got some places to check out first.

    Mikey --- I think the direct voltage reading from the power source is 'real' --- the multimeter is acting as the 'resistor' if you will. Guys, please correct me if I'm wrong.

  6. trains1972

    trains1972 Member

    Taking a reading at the power supply terminals will give you the voltage there. Taking the voltage across the resistor will give you the voltage drop across the resistor. 3.5V from the power supply minus the 1.5 volts needed for the LED means the drop across the resistor has to be 2V. The formula for finding the resistance is Ohm's law: resistance = voltage/current.
  7. N Gauger

    N Gauger 1:20.3 Train Addict

    Yeah - That's it - The resistor voltage drop is what you get - I had it backwards :)
  8. Fred_M

    Fred_M Guest

    And if you use Ohm's law you get 75 ohms with a 1.5 volt drop at 20 mA, and since you always give a tad extra resistance you use an 82. Same answer. If you want to use the 2 volt drop from the original question (3.5 - 1.5 = 2) instead, you get 100 ohms. Fred
  9. Pete

    Pete Member

    A white LED is generally rated for 3.3 to 3.6 volts @ 20mA for maximum brightness and longevity, but they will start to light up with about 2.2 volts applied to them. The 3 volt output will give a nice bright light, and long life to boot.