The reason current comes up is that it's how you'd optimally drive an LED.
You can do the same thing with a voltage mode supply, but you have to match the supply to the configuration of the LEDs so that both the voltage and the current stay within the desired range.
With a current mode supply, you essentially command the amount of current you want to feed through the LEDs and the voltage will vary (up to a maximum) to make it "happen". The voltage drop of the LEDs will vary depending on the current being driven and the supply will respond.
With a voltage mode supply, you set the voltage, which produces a current based on the configuration of the LEDs. This is trickier to get right. The voltage drop of the LEDs will vary as the current through them changes, and it will probably also drift over the lifetime and temperature of the LEDs.
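A quick sketch of why voltage mode is touchy: an LED's current rises exponentially with forward voltage. The ideal-diode (Shockley) equation below uses made-up parameters (`i_s`, `n`, `vt` are illustrative, not from any datasheet), but it shows how a small voltage change swings the current by a large factor:

```python
import math

def led_current(v, i_s=1e-12, n=2.0, vt=0.025):
    """Shockley-equation estimate of LED current at forward voltage v.
    The parameters are illustrative only, not from a real datasheet."""
    return i_s * (math.exp(v / (n * vt)) - 1.0)

i_at_3v0 = led_current(3.0)
i_at_3v1 = led_current(3.1)          # only 100 mV more
print(i_at_3v1 / i_at_3v0)           # current rises by roughly 7x
```

That exponential sensitivity is why a fixed voltage, even slightly off, can push the current far outside the safe range.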
For instance, if you put a 48V voltage source across a 3V LED: disaster. The current is effectively unbounded. If you put a 1A limited current source across an LED instead, the supply will raise the voltage until 1A flows. Assuming the LED is rated for 1A, the voltage will land within the proper range for the LED and all is well.
As another example, let's assume you have a 6 volt voltage mode supply. Across the supply you have two LEDs in series that just happen to have a 3V drop with 1A of current. Everything is kosher: 3V x 2 = 6V drop = 1A current. But what happens if one of the LEDs shorts out? The supply will try to maintain that 6V across the remaining LED, supplying whatever current the load draws (up to a maximum). Assuming the LED has a linear current/voltage relationship, the current will increase to 2A to reach 6V. If the single working LED is not rated for that current, it will consume itself.
On the other hand, let's assume you have a 1A current mode supply. Across the supply you have two LEDs in series that just happen to have a 3V drop with 1A of current. Everything is kosher: at 1A the string drops 6V, so the supply sits at 6V. But what happens if one of the LEDs shorts out? The supply will try to maintain that 1A through the LEDs. To do so, it will decrease the voltage until 1A flows. The voltage drops to 3V. A safer, but not foolproof, scenario.
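The two short-circuit scenarios above can be sketched numerically, under the same simplifying assumption the examples use: that each LED behaves like a linear 3-ohm load (3V at 1A). Real LEDs are not linear, so this is only a back-of-the-envelope model:

```python
R_LED = 3.0  # ohms; illustrative linear model (3V at 1A), not a real LED

# Voltage mode: supply holds 6V across the string no matter what.
v_supply = 6.0
i_before = v_supply / (2 * R_LED)   # two LEDs in series -> 1.0 A, fine
i_after = v_supply / (1 * R_LED)    # one LED shorts -> 2.0 A, overcurrent

# Current mode: supply holds 1A through the string no matter what.
i_supply = 1.0
v_before = i_supply * (2 * R_LED)   # 6V across two LEDs
v_after = i_supply * (1 * R_LED)    # 3V after the short; still only 1.0 A

print(i_after, v_after)
```

The voltage mode case doubles the current through the surviving LED, while the current mode case leaves it at its rated 1A.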
You can do either, voltage mode or current mode, but current mode is preferred for both ease and safety.
Constant power output probably means 1800W. So, if your voltage drop is 32V, the supply is capable of driving 56 amps as it attempts to push 48V across the 32V drop. And it will do so if allowed. Hence the potential need for inline power resistors to limit current, depending on the configuration.
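The arithmetic behind that figure, using the 1800W rating and 32V drop from the paragraph above:

```python
p_max = 1800.0    # assumed constant-power rating of the supply, in watts
v_string = 32.0   # example total LED string drop, in volts

# At constant power, the current the supply can deliver at this voltage:
i_available = p_max / v_string
print(i_available)   # 56.25 A -- far beyond what most LED strings can take
```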
It's a very nice supply on the cheap. I might buy one for other uses. Nothing wrong with what you are thinking, but it will take some care to use it for LEDs, I'd think.