Variable scaling
I just scaled FlexTool variables to be of similar magnitude, since some models took a very long time to solve using HiGHS. To my surprise, this improved LP solution times for FlexTool by orders of magnitude when using HiGHS. I did limited testing with CPLEX as well, and there was no clear impact for CPLEX; presumably CPLEX knows how to scale the problem itself. So, I suggest making sure Backbone models are scaled well.
Since integer variables are typically 0 or 1, and even in the case of multiple units typically not far from 1, all variables should be scaled to roughly that range (e.g. by multiplying the flow variable by the unit capacity in the constraints and then translating that back to the right numbers in the results). This can be difficult with investment variables, since the maximum is not known (or the user may set it very high to avoid hitting any limits). So, I decided to scale based on the 'virtual_unitsize' that I have used for integer startups. If it is not set, FlexTool uses the existing capacity, and if there is no existing capacity, it falls back to 1000 MW as a default assumption for the 'virtual_unitsize'.
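The fallback logic above could be sketched roughly like this (a minimal illustration, not FlexTool code; the function name and defaults are my own assumptions):

```python
def scaling_factor(virtual_unitsize=None, existing_capacity=None, default_mw=1000.0):
    """Pick the MW value used to scale a unit's flow variable towards [0, 1].

    Preference order (as described above): user-set virtual_unitsize,
    then existing capacity, then a 1000 MW default assumption.
    """
    if virtual_unitsize:
        return virtual_unitsize
    if existing_capacity:
        return existing_capacity
    return default_mw

# The solver sees the scaled variable; results are translated back
# by multiplying with the same factor.
factor = scaling_factor(existing_capacity=250.0)
flow_mw = 180.0
v_scaled = flow_mw / factor   # solver-facing value, here 0.72
recovered = v_scaled * factor # back to MW for reporting, here 180.0
```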
I also scaled the objective function. It seems not to be an issue unless the objective value differs from the other variables by many orders of magnitude, which can easily happen if nothing is scaled at all.
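To illustrate the idea (with made-up cost numbers and a hypothetical scale factor, not FlexTool's actual values): cost coefficients are divided by a common factor so the scaled objective lands near the magnitude of the scaled variables, and the reported objective is multiplied back afterwards.

```python
# Raw per-variable costs, many orders of magnitude above the ~[0, 1] variables.
costs_eur = [50_000.0, 120_000.0, 80_000.0]
obj_scale = 1e5  # chosen so scaled cost coefficients are O(1)

costs_scaled = [c / obj_scale for c in costs_eur]

# Pretend the solver returned these scaled variable values:
x = [0.5, 0.2, 0.9]
scaled_objective = sum(c * v for c, v in zip(costs_scaled, x))

# Translate the objective back to real units for the results.
true_objective = scaled_objective * obj_scale
```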
Sister issue in SpineOpt: https://github.com/spine-tools/SpineOpt.jl/issues/585