Hello Community
I am new to this site but have found it to have some of the best content and member help. I have taken some classes in electronics and power management, and I feel I have a good understanding of most electrical concepts. I am very familiar with Ohm's law, power, current, etc. I have read many articles about LED 'current' drivers and constant-voltage power supplies, but I have yet to find a real-world situation I can relate to, so I am asking here, hoping someone can help me understand my exact power/current needs.
I ordered 3 (16') rolls of blue LEDs to build my 8' Christmas star. It has 5 arms and requires about 40' of total length, which works out to roughly 750 individual LEDs.
There is no identification or specification other than 12V on the packaging or the light strips themselves. The strips are typical (I believe), with 3 LEDs in series per group. That series group is repeated approximately 50 times per strip, with each group connected across the + and - outer rails.
This setup provides a nominal 4 V to each LED.
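As a sanity check on my own numbers, here is the rough arithmetic I am using (my assumptions: every group is 3 LEDs in series, and all groups sit in parallel across the 12 V rails; the 750 count is just my estimate for the full 40'):

```
# Back-of-the-envelope numbers for the star (assumed: 3 LEDs per series
# group, all groups in parallel across the 12 V rails).
TOTAL_LEDS = 750        # rough count for the full 40' of strip
LEDS_PER_GROUP = 3
SUPPLY_V = 12.0

groups = TOTAL_LEDS // LEDS_PER_GROUP       # parallel 3-LED groups
volts_per_led = SUPPLY_V / LEDS_PER_GROUP   # nominal split across each group

print(f"Parallel groups:   {groups}")           # 250
print(f"Nominal V per LED: {volts_per_led} V")  # 4.0 V
```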
This is where the current/voltage requirements get confusing, so please correct me where I am wrong.
I consider a 12 V battery a power supply that has neither constant voltage nor constant current (both decrease with use).
On paper, though: if I connected two 12 V, 2 A (reserve/manufacturer rating) batteries in series (the + of one to the - of the other), I would have a 24 V power supply with still only a 2 A reserve.
If I connected the same two batteries in parallel, I would have a 12 V power supply with 4 A of available current/reserve.
To get a 4 A reserve at 24 V I would need four of these batteries, setting them up first in series pairs (24 V, 2 A) and then connecting the pairs in parallel (24 V, 4 A).
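Writing the same thing out as a quick sketch (the 12 V / 2 A figures are just the example ratings from above, not a real part):

```
# Series/parallel combinations of the example 12 V, 2 A-reserve batteries.
BATT_V, BATT_A = 12, 2

two_series      = (BATT_V * 2, BATT_A)      # voltages add, reserve stays
two_parallel    = (BATT_V, BATT_A * 2)      # voltage stays, reserve adds
series_parallel = (BATT_V * 2, BATT_A * 2)  # two series pairs in parallel (4 batteries)

print(f"Two in series:           {two_series[0]} V, {two_series[1]} A")            # 24 V, 2 A
print(f"Two in parallel:         {two_parallel[0]} V, {two_parallel[1]} A")        # 12 V, 4 A
print(f"Four, series-parallel:   {series_parallel[0]} V, {series_parallel[1]} A")  # 24 V, 4 A
```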
An AC-to-DC transformer/power supply with a rated output of 24 V DC at 4000 mA is basically a more reliable and stable version of the same thing.
If I connected an item requiring 24 V that was consuming 50 watts, it would have a current requirement of approximately 2 amps.
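That figure is just the power formula rearranged (the 50 W load is the hypothetical from above):

```
# Current drawn by a 50 W load on 24 V: I = P / V.
P_LOAD = 50.0    # watts (hypothetical load)
V_SUPPLY = 24.0  # volts

current = P_LOAD / V_SUPPLY
print(f"Load current: {current:.2f} A")  # ~2.08 A
```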
Having 4 amps available just extends the duration the item can operate at the set voltage, correct?
The extra 2 amps of available current doesn't have any negative effect on any part of the circuit?
I don't understand how this relates to an LED getting too much current and 'burning out'.
If I connected my star to my vehicle battery (12 V nominal, 11.5-13.6 V actual), which has hundreds of amps available, it would run for hours before the voltage dropped below the threshold necessary to light the LEDs, and the available current would never be an issue.
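To put a rough number on "hours" (the ~50 Ah capacity is just my guess at a typical car battery, and ~3.5 A is my guess at the star's draw at 12 V, so this is only a ballpark):

```
# Rough run-time estimate on a car battery (capacity and draw are assumed,
# not measured, and this ignores how far a car battery should be discharged).
BATTERY_AH = 50.0     # Ah, guessed typical car battery capacity
STAR_CURRENT_A = 3.5  # A, guessed draw of the star at 12 V

hours = BATTERY_AH / STAR_CURRENT_A
print(f"Approximate run time: {hours:.0f} hours")  # ~14 hours
```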
I first connected the star to a 120 V to 12 V power supply rated at 2 A, and it looked great. Then, after about 30 minutes, I noticed through my window that there was a blue flashing light: the star was flashing on its own at varying intervals (between 1 and 5 seconds).
I connected my Kill A Watt meter to it and noticed the power draw was dropping from the normal 43 W to a low of 13-18 W. Figuring it was overheating, I let it cool and then connected a dimmer switch just to see what would happen, thinking that if it lasted half an hour it wasn't that overloaded. With the dimmer in any position it still used 43 watts, which surprised me, and once again it started flashing a little later.
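Working the numbers backwards from the Kill A Watt reading (assuming essentially all of the 43 W ends up on the 12 V side, which ignores losses inside the supply):

```
# Compare the measured 43 W draw against the ratings of the two supplies I tried.
P_MEASURED = 43.0  # W, from the Kill A Watt
V_OUT = 12.0       # V, supply output

load_current = P_MEASURED / V_OUT  # ~3.6 A
print(f"Estimated load current: {load_current:.1f} A")

for rating in (2.0, 4.0):  # the 2 A and 4 A supplies
    status = "over" if load_current > rating else "within"
    print(f"{rating:.0f} A supply: load is {status} its rating")
```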
Lastly, I connected a 12 V, 4 A power supply. It has been on for about 24 hours now with no issues. What should I be concerned about? What would an LED "driver" do here differently? I would have just started with the 12 V, 4 A supply, but I didn't want to "over-current" the LEDs.
Any info, specific or general, is appreciated.
Thank you,
Dustin