Add [Nonlinear.ReverseAD] submodule #1803
Conversation
This is too big to review. One possible split is: (1) introduce the nonlinear data structures into MOI; (2) add the ReverseAD backend.
Does this only introduce symbolic nonlinear and not oracle-based nonlinear?
Yeah. Now that it's in MOI, I'll break it back up again.
It is (just) a refactoring of JuMP's current nonlinear interface. But it's also designed to be extensible, so that SymbolicAD is now an add-on: lanl-ansi/MathOptSymbolicAD.jl#17
`optimize!(model; differentiation_backend = SymbolicAD.DefaultBackend())`
`optimize!(model; differentiation_backend = MOI.Nonlinear.SparseReverseMode())`
Having to interpolate can add some friction; we should offer a
That's JuMP's job. It doesn't matter if this interface has some friction.
Force-pushed from 2d301f5 to aabe99b
The majority of this development was carried out in the JuMP PRs:
* jump-dev/JuMP.jl#2939
* jump-dev/JuMP.jl#2942
* jump-dev/JuMP.jl#2943

`Nonlinear.ReverseAD` is a minor refactoring of code that previously existed in JuMP under the `_Derivatives` submodule, and prior to that in the ReverseDiffSparse.jl package.
create a [`Nonlinear.Evaluator`](@ref) object with
[`Nonlinear.SparseReverseMode`](@ref), and then query the MOI API methods.
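For context, the documented pattern looks roughly like the following sketch. (The `log` objective and the variable setup are illustrative choices, not taken from this PR; the call names follow the `MOI.Nonlinear` API introduced here.)

```julia
import MathOptInterface as MOI

# Build a symbolic nonlinear model over a single variable x.
x = MOI.VariableIndex(1)
model = MOI.Nonlinear.Model()
MOI.Nonlinear.set_objective(model, :(log($x)))

# Create an evaluator backed by the ReverseAD submodule.
evaluator = MOI.Nonlinear.Evaluator(
    model,
    MOI.Nonlinear.SparseReverseMode(),
    [x],
)

# Then query the standard MOI AbstractNLPEvaluator methods.
MOI.initialize(evaluator, [:Grad])
obj = MOI.eval_objective(evaluator, [2.0])      # log(2.0)
grad = [NaN]
MOI.eval_objective_gradient(evaluator, grad, [2.0])  # fills grad with 1 / 2.0
```

Because the backend is an argument to `Evaluator`, swapping in a different AD implementation (for example `SymbolicAD`) only changes that one argument.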
### Why another AD package?
I reject the premise of the question, because this code was one of the first implementations of reverse-mode AD in Julia. This discussion as worded doesn't really fit as package documentation. What would fit is a discussion of the design goals of ReverseAD. It's fair to mention the history of the code as well.
How's this now? I only made minor changes to split out the history and identify the design goals, but I can make larger changes if you want.
Co-authored-by: Mathieu Besançon <[email protected]>
Oops, I left a comment pending.
Testing this as a submodule of MOI to see what the impact would be on JuMP
Docs: https://jump.dev/MathOptInterface.jl/previews/PR1803/submodules/Nonlinear/overview/
Before merging:
- ~~Rename to `_Nonlinear`, or otherwise indicate that the API is experimental and subject to change.~~ Added a warning to the docs.
Once merged, open issues for:
- `lhs - rhs <= 0`: Nonlinear: change how constraints are parsed #1817