
[Nonlinear] add SymbolicAD submodule #2624

Open · wants to merge 2 commits into master

Conversation

@odow (Member) commented Feb 13, 2025

This was prototyped in lanl-ansi/MathOptSymbolicAD.jl#39

I'm open to arguments against adding this. The evaluator could arguably remain in the separate MathOptSymbolicAD.jl package. But the simplification and symbolic derivative stuff is more widely useful. I occasionally get questions on the forum asking how to compute the derivative of a JuMP expression.

TODO

Related issues

This doesn't close any issues directly, but it relates to a number of open and previously closed issues:

It currently lacks support for multivariate user-defined functions, because these require #2402

Use with JuMP

We require only a few trivial helper methods in JuMP to link this all up:

julia> using JuMP

julia> function derivative(model::GenericModel{T}, f, x) where {T}
           df_dx = MOI.Nonlinear.SymbolicAD.derivative(moi_function(f), index(x))
           return jump_function(model, MOI.Nonlinear.SymbolicAD.simplify!(df_dx))
       end
derivative (generic function with 1 method)

julia> derivative(f, x) = derivative(owner_model(x), f, x)
derivative (generic function with 2 methods)

julia> function gradient(model::GenericModel{T}, f) where {T}
           g = moi_function(f)
           ∇f = Dict{GenericVariableRef{T},Any}()
           for xi in MOI.Nonlinear.SymbolicAD.variables(g)
               df_dx = MOI.Nonlinear.SymbolicAD.simplify!(
                   MOI.Nonlinear.SymbolicAD.derivative(g, xi),
               )
               ∇f[GenericVariableRef{T}(model, xi)] = jump_function(model, df_dx)
           end
           return ∇f
       end
gradient (generic function with 1 method)

julia> gradient(f) = gradient(owner_model(f), f)
gradient (generic function with 2 methods)

julia> model = Model();

julia> @variable(model, x[1:2]);

julia> f = sin(x[1]) + log(x[2])
sin(x[1]) + log(x[2])

julia> gradient(f)
Dict{VariableRef, Any} with 2 entries:
  x[1] => cos(x[1])
  x[2] => 1.0 / x[2]
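The univariate `derivative` helper defined above can be used the same way. A hypothetical continuation of the session (the exact printed form depends on SymbolicAD's simplification rules, so no output is shown):

```julia
julia> derivative(f, x[1])  # d/dx[1] of sin(x[1]) + log(x[2]); mathematically cos(x[1])
```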

@chriscoey (Contributor) commented:

Nice. A somewhat vague question: could the rewriting/simplification of nonlinear expressions be moved to where we do conceptually similar things for affine/quadratic functions? Then we would be simplifying ScalarNonlinearFunctions (SNFs), and perhaps in some cases the rewrites could simplify an SNF down to a ScalarAffineFunction (SAF) or ScalarQuadraticFunction (SQF).

@odow (Member, Author) commented Feb 20, 2025

Yeah, it's an open question.

For now I've kept things separate from where we canonicalize affine/quadratic functions, because we generally expect callers to use canonicalize in various places, expect it to be type stable, and expect it to have O(N) performance. We don't, for example, canonicalize a ScalarQuadraticFunction to a ScalarAffineFunction even if there are no quadratic terms.

simplify doesn't have these properties, and I don't think we want to start simplifying every nonlinear expression provided by the user (e.g., from JuMP). This really needs to be opt-in. And if it's opt-in, it kinda wants to be near the rest of the tools for working with symbolic expressions.
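To make the contrast concrete, here is a hypothetical sketch: `MOI.Utilities.canonical` is the existing cheap, eager path for affine/quadratic functions, while `simplify!` is the opt-in symbolic rewriting proposed in this PR (its exact return value is not shown, since it depends on the PR's rewrite rules):

```julia
julia> import MathOptInterface as MOI

julia> x = MOI.VariableIndex(1);

julia> # canonical is cheap and applied routinely: it merges duplicate affine terms.
       MOI.Utilities.canonical(
           MOI.ScalarAffineFunction(
               [MOI.ScalarAffineTerm(1.0, x), MOI.ScalarAffineTerm(2.0, x)],
               0.0,
           ),
       )

julia> # simplify! is opt-in: the caller explicitly asks for symbolic rewriting.
       f = MOI.ScalarNonlinearFunction(:+, Any[x, 0.0]);

julia> MOI.Nonlinear.SymbolicAD.simplify!(f)
```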

Perhaps in some cases the rewrites could simplify an SNF to a SAF/SQF

Yes, this is on my TODO list. We can certainly do better at detecting linear subexpressions within a SNF.

@joaquimg (Member) commented:
This would be very useful for jump-dev/ParametricOptInterface.jl#144, so that POI can support parameters in quadratic objectives.

POI only needs cubic function simplification, ideally into the form: sum of cubic terms + SQF.

Then, POI would be used in DiffOpt so that we can easily differentiate wrt coefficients in the quadratic objective function.

Successfully merging this pull request may close these issues.

Symbolic AD of ScalarNonlinearFunction