Sorry, I missed this somehow.
The 1.4 A is probably the maximum current, and the 700 mA is the design or working current. For long life and maximum efficiency you want to run close to that 700 mA number, but the strips can be pushed to 1.4 A if you don't mind running less efficiently (more energy wasted as heat for each watt of input) and shortening their life to some degree. There should be another spec for voltage, and maybe a graph of voltage vs. current. The relationship is not linear, and generally there is a very narrow usable voltage band, but that depends on the exact diodes used in the strip.
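To put rough (entirely made-up) numbers on how narrow that band is: a strip rated for 700 mA at, say, 22.5 V might already hit its 1.4 A max around 23.5 V. A few percent more voltage can double the current, which is why you run these from a constant-current driver instead of just picking a voltage.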
Edit: sorry, that answer probably just made things seem more complicated and didn't help you decide anything.
I think I said this somewhere already, but you don't need to worry about the current that much. The driver will handle the current.
To keep it simple, you need to look at total voltage across the strips in series, and total current across the strings in parallel.
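For example, with made-up numbers: two 24 V strips in series and three of those series strings in parallel puts 48 V across the driver output, and each string draws its own current, so at 1.4 A per string the driver sees 3 × 1.4 A = 4.2 A.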
I would use the 1.4 A number. Multiply that 1.4 A by the total number of strings in parallel; that is the maximum current the driver will need to supply if you overdrive them. I would go maybe 5-10% higher to be safe, but Meanwell drivers are bulletproof, so just getting close is fine.
Then add up the voltages of all the strips in each series string, and that gives you the maximum voltage the driver needs to supply.
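If it helps, here's a minimal sketch of that arithmetic in Python. The strip voltage, the layout counts, and the 10% headroom are made-up placeholders, not from your spec sheet; swap in your real numbers.

```python
# Minimal driver-sizing sketch. The 24 V strip voltage and the layout
# counts below are made-up placeholders -- use your spec sheet values.

STRIP_MAX_CURRENT_A = 1.4   # max rated current per string (the 1.4 A spec)
STRIP_VOLTAGE_V     = 24.0  # assumed forward voltage per strip
STRIPS_PER_STRING   = 2     # assumed: strips wired in series per string
STRINGS_IN_PARALLEL = 3     # assumed: series strings wired in parallel
HEADROOM            = 1.10  # the 5-10% safety margin on current

# Currents add across parallel strings; voltages add along a series string.
max_current_a = STRIP_MAX_CURRENT_A * STRINGS_IN_PARALLEL * HEADROOM
max_voltage_v = STRIP_VOLTAGE_V * STRIPS_PER_STRING

print(f"Driver should supply about {max_current_a:.2f} A "
      f"at {max_voltage_v:.0f} V ({max_current_a * max_voltage_v:.0f} W)")
```

With those placeholder numbers it comes out to roughly 4.6 A at 48 V (about 220 W); your real layout will give different values.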
You might want to do a simple drawing of how you plan to wire the fixture and post a link to the specs or a pic of the spec sheet. Then we can double-check it for you.