Releases: JuliaSmoothOptimizers/ADNLPModels.jl
v0.8.3
ADNLPModels v0.8.3
Merged pull requests:
- Add jprod benchmark (#256) (@tmigot)
- Add benchmark analyzer (#259) (@tmigot)
- Add gradient benchmark CI (#261) (@tmigot)
- Add more benchmark CI (#262) (@tmigot)
- Fix benchmarks (#263) (@tmigot)
- Add benchmark folder README (#265) (@tmigot)
- Faster Hessian package benchmark (#267) (@tmigot)
- Skip if too many constraints (for instance polygon) (#268) (@tmigot)
- Optimize sparse AD (#269) (@amontoison)
- Use a symmetric coloring for the computation of sparse Hessians (#271) (@amontoison)
- Update benchmark/Manifest.toml (#272) (@amontoison)
- Symmetric decompression (#273) (@gdalle)
- Don't test on nightly (#274) (@gdalle)
- [Benchmarks] Update Manifest.toml (#275) (@amontoison)
- [documentation] Use Documenter.jl v1.0 (#276) (@amontoison)
Closed issues:
- Symmetric coloring for efficient sparse hessian computation (#258)
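The sparse-Hessian PRs above (#269, #271, #273) change how the Hessian nonzeros are computed internally; the user-facing API is unchanged. A minimal sketch of that API, assuming the standard NLPModels.jl accessors (`hess_structure`, `hess_coord`) and using a Rosenbrock objective chosen purely for illustration:

```julia
using ADNLPModels, NLPModels

# Rosenbrock objective, for illustration only
f(x) = (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2
nlp = ADNLPModel(f, [-1.2; 1.0])

rows, cols = hess_structure(nlp)      # sparsity pattern (lower triangle)
vals = hess_coord(nlp, nlp.meta.x0)   # Hessian nonzeros evaluated at x0
```

The symmetric coloring and decompression only affect how `vals` is filled in, so existing solver code benefits without modification.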
v0.8.2
ADNLPModels v0.8.2
Merged pull requests:
- Full support of sparse AD for NLS models (#239) (@amontoison)
- Add basic gradient benchmark (#248) (@tmigot)
- [documentation] Explain coloring and detector kwargs in sparse backends (#249) (@amontoison)
- Add more benchmark: jac and hess (#251) (@tmigot)
- Pass empty function instead of unused objective in NLS backend (#253) (@tmigot)
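PR #249 documents the coloring and detector keyword arguments of the sparse backends. As a hedged illustration of the backend-keyword mechanism in general (the `gradient_backend` keyword and the `ReverseDiffADGradient` type follow the package's documented backend system; verify the names for your version):

```julia
using ADNLPModels, NLPModels

f(x) = sum(x .^ 2)
# Swap in a ReverseDiff-based gradient backend; other kwargs such as
# jacobian_backend or hessian_backend follow the same pattern
nlp = ADNLPModel(f, ones(2);
                 gradient_backend = ADNLPModels.ReverseDiffADGradient)
g = grad(nlp, nlp.meta.x0)   # gradient computed with the chosen backend
```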
v0.8.1
ADNLPModels v0.8.1
Merged pull requests:
- Replace ColPack with SparseMatrixColorings (#244) (@gdalle)
- [documentation] Fix the table of AD backends (#245) (@amontoison)
v0.8.0
ADNLPModels v0.8.0
Merged pull requests:
- Use SparseConnectivityTracer.jl (#230) (@amontoison)
- Add OptimalControl.jl in Breakage.yml (#233) (@amontoison)
- Upgrade to ColPack v0.4 (#235) (@gdalle)
Closed issues:
- Bug with upgrade to SymbolicUtils release 1.6 and 1.7 (#232)
v0.7.2
ADNLPModels v0.7.2
v0.7.1
ADNLPModels v0.7.1
Merged pull requests:
- Remove duplicate API for NLS (#200) (@tmigot)
- Remove Zygote in performance (#201) (@tmigot)
- Add new constructors for mixed models (#202) (@tmigot)
- Enable gpu for bound-constrained problems (#208) (@tmigot)
- [JSOTemplate.jl] Template compliance update (#211) (@abelsiqueira)
- Add CITATION.cff (#212) (@tmigot)
- Fix missing link in mixed.md (#213) (@tmigot)
- Bump to NLPModelsModifiers 0.7 (#214) (@tmigot)
- Extend show for NLS backends (#216) (@tmigot)
- Bump to NLPModels 0.21 (#219) (@tmigot)
- Version 0.7.1 (#220) (@tmigot)
v0.7.0
ADNLPModels v0.7.0
Closed issues:
- Improve docs (#32)
- Tutorial: get started with ADNLPModel (#49)
- Add documentation on sparse jacobian (#106)
- Add documentation on available backends and new backend system (#135)
- Warnings about unused type parameters (#156)
- WARNING: method definition ... declares type variable AD but does not use it (#159)
- Test using NLPModel as backend using NLPModelsTest (#178)
Merged pull requests:
- Add optimized reverse `jtprod` (#121) (@tmigot)
- Add test/Project.toml (#146) (@tmigot)
- Add timer to backend constructors (#147) (@tmigot)
- Split files for AD backends definitions (#149) (@tmigot)
- Specialize unconstrained backend (#150) (@tmigot)
- Add ForwardDiff optimized `jprod` (#151) (@tmigot)
- Add ReverseDiff generic gradient (#152) (@tmigot)
- Add optimized ForwardDiff jtprod (#153) (@tmigot)
- Add `Enzyme` gradient (#155) (@tmigot)
- Remove unused types in ADNLSModels (#157) (@tmigot)
- Dispatch `hprod` for `ci`, `lag` and `obj` (#158) (@tmigot)
- Add `hprod` with ForwardDiff [WIP] (#160) (@tmigot)
- Bump NLPModelsTest to 0.9 (#162) (@tmigot)
- Update computation sparse jacobian (#163) (@tmigot)
- pre-allocate result of direction derivative in sparse matrix comp (#164) (@tmigot)
- Use ColPack instead of SparseDiffTools for coloration (#166) (@tmigot)
- Optimize hprod in sparse hessian (#167) (@tmigot)
- [CI] Remove the build with Linux ARMv8 (#170) (@amontoison)
- Add reverse-forward hprod (#171) (@tmigot)
- Add SparseDiffTools backends (#172) (@tmigot)
- Add predefined backends (#174) (@tmigot)
- Add NLPModel instead of an AD-backend (#176) (@tmigot)
- Add Symbolics as optional dep (#177) (@tmigot)
- Add test for mixed models (NLP) (#180) (@tmigot)
- Add NLS tests for mixed models (#182) (@tmigot)
- Update readme and docs (#187) (@tmigot)
- Version 0.7.0 (#188) (@tmigot)
- Add predefined backend tutorial (#190) (@tmigot)
- Add performance tips tutorial (#191) (@tmigot)
- Build a hybrid model tutorial (#192) (@tmigot)
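PRs #174 and #190 introduce predefined backend sets selectable at construction time. A minimal sketch, assuming the `backend` keyword accepts a predefined key such as `:generic` (as described in the predefined backend tutorial; the exact keys may vary by version):

```julia
using ADNLPModels, NLPModels

f(x) = sum(abs2, x)
# :generic selects a predefined set of generic AD backends instead of
# the specialized (optimized) defaults
nlp = ADNLPModel(f, ones(3); backend = :generic)
g = grad(nlp, nlp.meta.x0)
```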
v0.6.2
ADNLPModels v0.6.2
Closed issues:
- Use SparseDiffTools with ADNLPModel? (#13)
- Support sparse Hessians and Jacobians (#44)
- "UndefVarError: `ZygoteAD` not defined" when trying to use Zygote as AD backend (#134)
Merged pull requests:
- Sparse hessian (#127) (@amontoison)
- Rename symbolic backends (#136) (@tmigot)
- Rename SparseForwardADJacobian backend (#139) (@amontoison)
- Fix bug for unconstrained ghjvprod (#142) (@tmigot)
- Compute coloration on the whole Hessian matrix instead of triangle (#143) (@tmigot)
- Version 0.6.2 (#145) (@tmigot)
v0.6.1
ADNLPModels v0.6.1
Merged pull requests:
- Update files related to sparse Jacobian (#125) (@amontoison)
- Sparse hessian with Symbolics.jl (#126) (@amontoison)
- Add NLS in-place constructors with no nonlinear constraints (#128) (@tmigot)
- Update SparseADHessian structure for an optimized hess_coord! (#130) (@amontoison)
- Fix API for sparse hessian (#131) (@tmigot)
- Version 0.6.1 (#132) (@tmigot)
v0.6.0
ADNLPModels v0.6.0
Closed issues:
- Use in place functions by default for constraints and residuals (#47)
- Unit tests segfault on M1 Mac (#57)
Merged pull requests:
- Setup in-place constructors (#101) (@tmigot)
- Add abstract AD types (#103) (@tmigot)
- Add in-place residual (#104) (@tmigot)
- Add SparseDiffTools jacobian (#105) (@tmigot)
- Update CommentPR.yml (#107) (@amontoison)
- Optimize src/ad.jl (#110) (@amontoison)
- [CI] Update .cirrus.yml (#112) (@amontoison)
- Refactor lagrangian function (#113) (@tmigot)
- Add sparse jacobian of residual (#114) (@tmigot)
- Prep jprod! (#116) (@tmigot)
- CompatHelper: bump compat for SparseDiffTools to 2, (keep existing compat) (#117) (@github-actions[bot])
- Increase coverage by removing out-of-place gradient function (#118) (@tmigot)
- Add optimized `jprod` with forward and reverse (#119) (@tmigot)
- Optimized `hprod` using SparseDiffTools (#122) (@tmigot)
- CompatHelper: bump compat for NLPModels to 0.20, (keep existing compat) (#123) (@github-actions[bot])
- Version 0.6.0 (#124) (@tmigot)
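PRs #101 and #104 set up in-place constructors and residuals, which became the default path for constraints. A minimal sketch of the in-place constrained constructor, assuming the `ADNLPModel!` form with a mutating constraint function `c!(cx, x)` (check the documentation of your version for the exact signature):

```julia
using ADNLPModels

f(x) = (x[1] - 1)^2
function c!(cx, x)            # writes constraint values into cx, no allocation
  cx[1] = x[1] + x[2] - 1
  return cx
end
x0 = zeros(2)
# Equality constraint x1 + x2 - 1 = 0 (lcon = ucon = 0)
nlp = ADNLPModel!(f, x0, c!, [0.0], [0.0])
```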