Interesting, it must be with the boards then, because that's in direct opposition to what was taught in class. Series or parallel, we were taught that as long as the voltage was proper (12V for me), wattage would not matter. The scenario you describe is exactly how halogen bulbs burned out. Cast was high-end stuff too; they used multi-tap transformers with 12V to 20V taps, and you measured at the splice and adjusted for the length of the run (i.e. if the splice measured 11V, you moved the wire to the 13V tap so the splice would then measure 12V).
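Roughly the logic I was taught, as a quick sketch (the tap list and the 12V target are just from my own setup, not anything standard):

```python
# Rough sketch of the multi-tap adjustment described above. Tap values and the
# 12V target are assumptions from my halogen setup, not a spec for any brand.

TAPS = [12, 13, 14, 15, 16, 17, 18, 19, 20]   # volt taps available on the transformer
TARGET = 12.0                                 # what you want to measure at the splice

def pick_tap(current_tap: float, measured_at_splice: float) -> float:
    """Pick the tap that compensates for the voltage dropped along the run."""
    drop = current_tap - measured_at_splice    # e.g. 12 - 11 = 1V lost in the wire
    wanted = TARGET + drop                     # tap needed to land ~12V at the splice
    # choose the closest available tap at or above what we need
    return min((t for t in TAPS if t >= wanted), default=max(TAPS))

print(pick_tap(12, 11))   # -> 13, i.e. move the wire from the 12V tap to the 13V tap
```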
I realize it's a different application, but figured the technology behind it was the same. Gonna have to do more research now on the drivers in our lights' LEDs vs. how the landscape stuff was handled (thinking perhaps cheap 12V drivers at the lights, and 12V is just the power source, much like 120V is for our lights?). Just 'cause I gotta know the difference / reasoning…
Yea, the reason is that the drivers are not fixed voltage but fixed current, which isn't as common. Fixed voltage is not considered safe for LEDs: unlike a lot of electronic components, when LEDs heat up they consume more power, so with a fixed voltage they will start drawing more and more current.
Instead, the power supply for LEDs keeps a fixed current. So when the LEDs heat up, the driver drops the voltage slightly to keep the current under control.
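A toy illustration of that feedback, with made-up numbers (forward voltage, tempco, and series resistance are all assumptions, just to show the direction of the effect):

```python
# Why a fixed-voltage supply runs away on LEDs: as the string heats up, its
# forward voltage falls, so all the extra headroom turns into more current.
# Numbers below are invented for illustration, not a real LED model.

VF_25C   = 12.0     # forward voltage of the string at 25°C, volts (assumed)
TEMPCO   = -0.02    # forward-voltage change per °C, volts (assumed)
R_SERIES = 0.5      # effective series resistance of the string, ohms (assumed)

def current_at(temp_c: float, supply_v: float) -> float:
    """Current the string pulls from a *fixed-voltage* supply at a given temperature."""
    vf = VF_25C + TEMPCO * (temp_c - 25)       # forward voltage drops as it heats
    return max(supply_v - vf, 0.0) / R_SERIES  # leftover voltage becomes extra current

# Fixed 12.5V supply: hotter LEDs pull more current, which heats them further.
for t in (25, 50, 75):
    print(f"{t}°C: {current_at(t, 12.5):.2f} A")   # 1.00 A, 2.00 A, 3.00 A

# A constant-current driver does the opposite: it holds the current (say 1A)
# and just lets its output voltage sag as the forward voltage drops.
```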
It depends on how the gear was designed. I have seen older circuits where constant current is used. Mainly when it is a packaged unit and the user is not going to do anything but plug in the light. A driver where you do not know what will be hooked up to it will be designed differently.