What is the minimal API for differentiable optimization?
- Attributes to pass forward and reverse input sensitivities.
  - Forward input sensitivities: numbers associated with the coefficients of the problem or with parameters.
  - Reverse input sensitivities: numbers associated with variables.
- Functions to perform the differentiation (analogous to `optimize!`).
- Attributes to query forward and reverse output sensitivities.
  - Forward output sensitivities: numbers associated with variables.
  - Reverse output sensitivities: numbers associated with the coefficients of the problem or with parameters.
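To make the four attribute categories concrete, here is a sketch of the forward and reverse passes as they look in DiffOpt today (attribute and function names follow DiffOpt's documentation; a final MOI-level API might rename them):

```julia
using JuMP, DiffOpt, HiGHS

model = Model(() -> DiffOpt.diff_optimizer(HiGHS.Optimizer))
@variable(model, x)
@constraint(model, cons, x >= 3)
@objective(model, Min, 2x)
optimize!(model)

# Forward mode: seed an input sensitivity on a problem coefficient
# (here, perturb the constraint function), differentiate, then query
# the output sensitivity on a variable.
MOI.set(model, DiffOpt.ForwardConstraintFunction(), cons, 1.0 * x)
DiffOpt.forward_differentiate!(model)
dx = MOI.get(model, DiffOpt.ForwardVariablePrimal(), x)

# Reverse mode: seed an input sensitivity on a variable, differentiate,
# then query the output sensitivity on the problem coefficients.
MOI.set(model, DiffOpt.ReverseVariablePrimal(), x, 1.0)
DiffOpt.reverse_differentiate!(model)
grad = MOI.get(model, DiffOpt.ReverseConstraintFunction(), cons)
```

Note the symmetry: each mode has a "set input seed" attribute, a differentiation function, and a "get output" attribute, mirroring the four bullet items above.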
Important aspects:
a. A function to reset all input sensitivities is extremely important in practice, as resetting manually is very inefficient.
b. Parameters are extremely useful here. Having used DiffOpt and talked to users, it is clear that touching coefficients directly is very error-prone (see also: https://arxiv.org/abs/2510.25986). POI is almost essential here.
c. Passing coefficients individually is suboptimal because it does not play well with bridges. Passing coefficient sensitivities as functions is key to making this work smoothly. For getters, lazy operations are important, since we are usually not interested in every single coefficient.
d. In practice, helper methods for objective sensitivity are a lifesaver.
e. Having a JuMP-level API is much more ergonomic than scattering `MOI.set` and `MOI.get` calls everywhere.
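As an illustration of points (a) and (b), here is a sketch of parameter-based reverse differentiation with DiffOpt + POI; the names follow DiffOpt's documentation for its parameter support, but treat the exact signatures (in particular `with_parametric_opt_interface` and `empty_input_sensitivities!`) as assumptions:

```julia
using JuMP, DiffOpt, HiGHS

# POI sits between DiffOpt and the solver, so sensitivities attach to
# parameters instead of raw coefficients (point b).
model = Model(
    () -> DiffOpt.diff_optimizer(
        HiGHS.Optimizer;
        with_parametric_opt_interface = true,
    ),
)
@variable(model, p in Parameter(2.0))
@variable(model, x >= 0)
@constraint(model, con, x >= p)
@objective(model, Min, 2x)
optimize!(model)

# Reverse mode: seed on a variable, differentiate, and read the
# sensitivity back on the parameter rather than on a coefficient.
MOI.set(model, DiffOpt.ReverseVariablePrimal(), x, 1.0)
DiffOpt.reverse_differentiate!(model)
dp = MOI.get(model, DiffOpt.ReverseConstraintSet(), ParameterRef(p)).value

# Point a: reset all input sensitivities in one call before the next
# differentiation, instead of clearing each seed manually.
DiffOpt.empty_input_sensitivities!(model)
```

The user never touches the constraint matrix directly, which is exactly what makes the parameter route less error-prone.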
We already have use cases for existing solvers:
- https://github.com/jump-dev/DiffOpt.jl/tree/jg/lpbasis (HiGHS, Gurobi, Xpress)
- MadNLP (https://github.com/MadNLP/MadDiff.jl/tree/main/ext/MathOptInterfaceExt)
There is a new solver that proposes to do this as well:
- Moreau (Adds support for differentiable-native solvers, DiffOpt.jl#344)
Additional points:
- In DiffOpt, we did not put much effort into sensitivities associated with duals. Attribute-wise, there is nothing special here, but we never had a concrete use case, so we did not venture into it.
Initial discussion in: jump-dev/DiffOpt.jl#344