Conversation

@amontoison
Member

@amontoison amontoison commented Dec 1, 2025

Add flags in NLPModelMeta and NLSMeta to specify whether the gradient, sparse Jacobians, sparse Hessians, and operator-based products are available in a model.
In some models, we do not want to, or cannot, implement the complete NLPModels.jl API.

Examples:

  • ADNLPModels.jl: we do not want to set up some AD backends if they are not needed (see ADNLPModels.jl#360, "Disable gradient and Hessian backends for NLSModels (part 2)").
  • NLPModelsJuMP.jl: the user can specify from JuMP which subset of derivatives is needed, and the new VectorNonlinearOracle structure in MOI does not support operator–vector products.
  • Custom AbstractNLPModel implementations: at Argonne, we have some models involving neural networks where only the gradient is available (cc Sarah).

This is an issue because solvers such as MadNLP.jl or MadNCL.jl expect jtprod to be implemented but cannot easily know whether it is available before calling it.
A similar issue occurs with UnoSolver.jl, which relies on the BQPD subsolver by default and requires hprod.
The absence of the Lagrangian Hessian can also help solvers like NLPModelsIpopt.jl or NLPModelsKnitro.jl to automatically switch to quasi-Newton approximations.

Using these new attributes also helps an oracle choose the most appropriate solver, and ensures that a clean error is returned when a solver cannot be used with a given model (JSOSuite.jl?).
This is preferable to triggering a MethodError.

This addition should be non-breaking (the full API is considered available by default) and should resolve a number of issues in dependent packages.
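A minimal sketch of the intended usage (the `*_available` keyword and field names below follow the pattern discussed in the review and are illustrative, not the final API):

```julia
using NLPModels

# Model side: declare at construction time which parts of the API are
# implemented. The `*_available` keywords are illustrative; they would
# default to `true`, keeping the change non-breaking.
meta = NLPModelMeta(100;                   # 100 variables, unconstrained
                    jprod_available  = false,
                    jtprod_available = false,
                    hprod_available  = false)

# Solver side: test the flag instead of triggering a MethodError.
function require_jtprod(nlp::AbstractNLPModel)
  nlp.meta.jtprod_available ||
    error("this solver requires jtprod, which the model does not provide")
end
```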

@github-actions
Contributor

github-actions bot commented Dec 2, 2025

Downstream packages checked (latest / stable):
ADNLPModels
AdaptiveRegularization
AmplNLReader
BundleAdjustmentModels
CUTEst
CaNNOLeS
DCISolver
FletcherPenaltySolver
FluxNLPModels
JSOSolvers
JSOSuite
LLSModels
ManualNLPModels
NLPModelsIpopt
NLPModelsJuMP
NLPModelsKnitro
NLPModelsModifiers
NLPModelsTest
NLSProblems
PDENLPModels
PartiallySeparableNLPModels
PartiallySeparableSolvers
Percival
QuadraticModels
RegularizedProblems
SolverBenchmark
SolverTest
SolverTools

Member

@tmigot tmigot left a comment

So, the implementation of an NLPModel should set one of these flags to false if something is not implemented. We should also document it somewhere; if there is no obvious place in the NLPModels documentation, maybe in the docstrings of the functions to implement.

What is the motivation for this? Is it automatic solver selection?

In ADNLPModels, it can even be automatic by checking whether EmptyADBackend is used.
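A rough sketch of that automatic detection, assuming the model stores its AD backends in an `adbackend` container with per-operation fields such as `hprod_backend` (the actual ADNLPModels internals may differ):

```julia
using ADNLPModels

# Hypothetical helper: hprod is flagged as unavailable whenever its AD
# backend is the empty placeholder (field and type names are assumptions).
hprod_is_available(nlp::ADNLPModel) =
  !(nlp.adbackend.hprod_backend isa ADNLPModels.EmptyADbackend)
```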

@amontoison
Member Author

@tmigot The motivation is here: #524 (comment)

@amontoison
Member Author

The main issue is that more models than before cannot implement the full API (NN models, oracles in MOI), and solvers like MadNLP.jl or UnoSolver.jl cannot handle them.

@amontoison
Member Author

amontoison commented Dec 3, 2025

In ADNLPModels, it can even be automatic by checking whether EmptyADBackend is used.

Too specific; we don't want ADNLPModels.jl as a dependency of MadNLP.jl, NLPModelsIpopt.jl, NLPModelsKnitro.jl, or UnoSolver.jl.
That is what we would need if we wanted to check for missing functions on the solver side.

The idea of NLPModels.jl is to provide a unified API for all AbstractNLPModels.

@tmigot
Member

tmigot commented Dec 3, 2025

Filling the attributes in the meta can be handled differently by each implementation of NLPModels. For most, they will be set manually, but for ADNLPModels it can be automatic. It does not add a dependency on ADNLPModels later on.

@amontoison
Member Author

amontoison commented Dec 3, 2025

Yes, I agree.
I misunderstood your message: I thought you wanted the optimization solvers to detect the missing API for some ADNLPModels.

@amontoison
Member Author

amontoison commented Dec 3, 2025

So, the implementation of an NLPModel should set one of these flags to false if something is not implemented. We should also document it somewhere; if there is no obvious place in the NLPModels documentation, maybe in the docstrings of the functions to implement.

I think it should be in the docstrings of the functions; users will then know that nlp.meta.*_available can be checked before calling the related routine.
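To illustrate the pattern (the flag name is an assumption), a solver could guard on the flag and fall back to a quasi-Newton model from NLPModelsModifiers.jl, as mentioned in the description:

```julia
using NLPModels, NLPModelsModifiers

# If the exact Lagrangian Hessian is flagged as unavailable, wrap the
# model in an LBFGS approximation instead of failing later on.
prepare_model(nlp::AbstractNLPModel) =
  nlp.meta.hess_available ? nlp : LBFGSModel(nlp)
```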

@tmigot
Member

tmigot commented Dec 3, 2025

So, the implementation of an NLPModel should set one of these flags to false if something is not implemented. We should also document it somewhere; if there is no obvious place in the NLPModels documentation, maybe in the docstrings of the functions to implement.

I think it should be in the docstrings of the functions; users will then know that nlp.meta.*_available can be checked before calling the related routine.

If you don't do it in this PR, please open an issue about it, thanks!

@amontoison amontoison changed the title from [backport] Add availability flags in NLPModelMeta and NLSMeta to Add availability flags in NLPModelMeta and NLSMeta on Dec 9, 2025
@amontoison amontoison changed the base branch from 0.21.x to main December 9, 2025 13:50
@amontoison amontoison requested a review from tmigot December 9, 2025 17:38
@amontoison
Member Author

So, the implementation of an NLPModel should set one of these flags to false if something is not implemented. We should also document it somewhere; if there is no obvious place in the NLPModels documentation, maybe in the docstrings of the functions to implement.

I think it should be in the docstrings of the functions; users will then know that nlp.meta.*_available can be checked before calling the related routine.

If you don't do it in this PR, please open an issue about it, thanks!

Done, I added references everywhere in the docstrings and documentation.
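For reference, a sketch of the kind of note added to the docstrings (the exact flag name follows the `*_available` pattern and is illustrative):

```julia
"""
    Jtv = jtprod(nlp, x, v)

Evaluate the product of the transposed Jacobian at `x` with the vector `v`.
Check `nlp.meta.jtprod_available` before calling this function, since a
model is not required to implement it.
"""
function jtprod end
```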
