Releases: TuringLang/AdvancedVI.jl

v0.5.0

21 Oct 05:51
c2b65fa

AdvancedVI v0.5.0

Diff since v0.4.1

v0.5 introduces major breaking changes to the interface of AdvancedVI, along with tweaks to the default configurations. The changes are summarized below.

Default Configuration Changes

The default parameters for the parameter-free optimizers DoG and DoWG have been changed.
The new choice of parameters should be more invariant to the problem dimension, so convergence should be faster than before on high-dimensional problems.

The default value of the operator keyword argument of KLMinRepGradDescent has been changed from ClipScale to IdentityOperator. This means that, for variational families <:MvLocationScale, optimization may fail since nothing enforces the scale matrix to remain positive definite.
Therefore, when a variational family <:MvLocationScale is used in combination with IdentityOperator, a warning message instructing the user to use ClipScale will be displayed.
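For example, users of <:MvLocationScale families can restore the previous behavior by passing ClipScale explicitly. The sketch below assumes the AD backend is passed positionally and uses AutoForwardDiff purely for illustration:

```julia
using AdvancedVI, ADTypes

# Keep the scale matrix positive definite by passing ClipScale explicitly
# (the v0.5 default operator is IdentityOperator).
alg = KLMinRepGradDescent(AutoForwardDiff(); operator=ClipScale())
```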

Interface Changes

An additional layer of indirection, AbstractVariationalAlgorithm, has been added.
Previously, all variational inference algorithms were assumed to run SGD in parameter space.
This design, however, proved to be too rigid.
Instead, each algorithm is now assumed to implement three simple interface functions: init, step, and output.
As a result, the signature of optimize had to be revised, but it is now simpler, as most of the configuration resides in the fields of the <:AbstractVariationalAlgorithm object.
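As a rough sketch of what a custom algorithm now looks like (the argument lists below are assumptions for illustration, not the documented signatures):

```julia
using AdvancedVI, Random

# Hypothetical algorithm implementing the three-function interface.
struct MyAlgorithm <: AdvancedVI.AbstractVariationalAlgorithm end

# Build the initial state from the target problem and the initial approximation.
AdvancedVI.init(rng::Random.AbstractRNG, alg::MyAlgorithm, prob, q_init) = (q = q_init,)

# Perform a single iteration; return the updated state plus any diagnostics.
AdvancedVI.step(rng::Random.AbstractRNG, alg::MyAlgorithm, state, args...; kwargs...) =
    (state, nothing)

# Recover the final variational approximation from the state.
AdvancedVI.output(alg::MyAlgorithm, state) = state.q
```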

A new specialization of estimate_objective has been added that takes the variational algorithm alg as an argument.
Each algorithm should therefore implement estimate_objective.
This automatically chooses the right strategy for estimating the associated objective, without the user having to worry about internal implementation details.
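For instance, a single call along the lines of the following (the argument order after alg is an assumption for illustration) should suffice regardless of which objective the algorithm uses internally:

```julia
# Estimate the objective associated with `alg` for an approximation `q`
# on the target `prob`, without referring to the objective type directly.
obj_estimate = estimate_objective(alg, q, prob)
```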

Internal Changes

The state of an objective may now use a concrete type.
Therefore, to be able to dispatch on the type of the state while avoiding type ambiguities, the state argument of estimate_gradient! has been moved to the front.

Under the new interface AbstractVariationalAlgorithm, the algorithms running SGD in parameter space, currently KLMinRepGradDescent, KLMinRepGradProxDescent, and KLMinScoreGradDescent, are treated as distinct algorithms.
However, they all implicitly share the same step function in src/algorithms/common.jl and the same fields in their state objects.
This may change in the future.

Merged pull requests:

  • Add additional layer of indirection AbstractAlgorithm (#179) (@Red-Portal)
  • CompatHelper: bump compat for AdvancedVI to 0.4 for package bench, (keep existing compat) (#181) (@github-actions[bot])
  • CompatHelper: bump compat for AdvancedVI to 0.4 for package docs, (keep existing compat) (#182) (@github-actions[bot])
  • Make problem as part of the state of ParamSpaceSGD instead of an argument of estimate_gradient! (#185) (@Red-Portal)
  • Support Mixing AD Frameworks for LogDensityProblems and the objective (cleaned-up) (#187) (@Red-Portal)
  • Split AD tests into workflows (#190) (@Red-Portal)
  • Stop testing on DistributionsAD (#191) (@Red-Portal)
  • Fix bug in adtype check for Mixed-AD RepGradELBO (#192) (@Red-Portal)
  • update the example in the README (#193) (@Red-Portal)
  • Create docs/src/tutorials/ directory (#194) (@Red-Portal)
  • Add some basic tutorials (#195) (@Red-Portal)
  • Use concrete types for objective states (#196) (@Red-Portal)
  • Add minibatch subsampling (doubly stochastic) objective (2025 ver.) (#197) (@Red-Portal)
  • Fix coverage to aggregate results for all workflows (#198) (@Red-Portal)
  • Update the default parameter value for DoG and DoWG. (#199) (@Red-Portal)
  • Move q_init argument of init to the front (#200) (@Red-Portal)
  • Make IdentityOperator default for KLMinRepGradDescent (#201) (@Red-Portal)
  • Fix typos and improve clarity in README, docs and docstrings (#202) (@Copilot)
  • add missing Zygote integration (#203) (@Red-Portal)
  • Rename AbstractAlgorithm to AbstractVariationalAlgorithm (#204) (@Red-Portal)
  • Remove the type ParamSpaceSGD (#205) (@Red-Portal)
  • CompatHelper: bump compat for JSON to 1 for package docs, (keep existing compat) (#206) (@github-actions[bot])
  • add specialization of estimate_objective for algorithms (#207) (@Red-Portal)

Closed issues:

  • Minibatches (#38)
  • Understanding the AdvancedVI interface documented in the README (#173)
  • Feature request: (re)start from a given position (#174)
  • Rename AbstractAlgorithm to AbstractVI (#183)
  • ParamSpaceSGD is not a standard terminology (#184)

v0.4.1

06 Jun 17:10
8747e79

AdvancedVI v0.4.1

Diff since v0.4.0

Merged pull requests:

v0.2.12

29 May 12:30
f067a57

AdvancedVI v0.2.12

Diff since v0.2.11

This release has been identified as a backport.
Automated changelogs for backports tend to be wildly incorrect.
Therefore, the list of issues and pull requests is hidden.

v0.4.0

29 Apr 18:04
8fdff72

AdvancedVI v0.4.0

Diff since v0.3.2

Changelog

  • Added the proximal operator for the location-scale family proposed by Domke [1]. This proximal operator is supported for the optimization rules DoG, DoWG, and Descent.

Merged pull requests:

  • Proximal operator for the entropy of location-scale families (#168) (@Red-Portal)

[1] Domke, Justin. "Provable smoothness guarantees for black-box variational inference." International Conference on Machine Learning, PMLR, 2020.

v0.3.2

28 Mar 18:53

AdvancedVI v0.3.2

Diff since v0.3.1

Merged pull requests:

  • try rm ChainRulesCore (#167) (@yebai)
  • CompatHelper: bump compat for ForwardDiff to 1 for package docs, (keep existing compat) (#171) (@github-actions[bot])
  • bump ForwardDiff v1.0 (#172) (@Red-Portal)

Closed issues:

  • Missing API method (#32)

v0.3.1

07 Mar 21:02
f257bce

AdvancedVI v0.3.1

Diff since v0.3.0

Major Changes:

  • Previously, AdvancedVI directly called Enzyme through its official public interface. Now, it is called through DifferentiationInterface.
  • All AD calls are made with prepare_gradient.

Merged pull requests:

  • CompatHelper: add new compat entry for AdvancedVI at version 0.3 for package bench, (keep existing compat) (#154) (@github-actions[bot])
  • CompatHelper: bump compat for Zygote to 0.7 for package bench, (keep existing compat) (#156) (@github-actions[bot])
  • CompatHelper: bump compat for Zygote to 0.7 for package test, (keep existing compat) (#157) (@github-actions[bot])
  • Documentation and Turing Navigation CI improvement (#158) (@shravanngoswamii)
  • remove unnecessary weakdeps (#159) (@yebai)
  • simplify package extension for post Julia 1.10 (#161) (@yebai)
  • Update Enzyme usage in test files (#162) (@yebai)
  • Optimize code for better performance and maintainability (#163) (@yebai)
  • Fix ClipScale dispatch error (#164) (@yebai)
  • Remove the Enzyme extension, prepare gradient (#166) (@yebai)

Closed issues:

  • Enable prepare_gradient for DifferentiationInterface (#101)
  • Do we still need the Enzyme extension with DI? (#160)

v0.3.0

30 Dec 10:22
54dff15

AdvancedVI v0.3.0

Diff since v0.2.11

Breaking changes

  • Complete rewrite of AdvancedVI with major changes in the API. (Refer to general usage and the example.)

New Features

  • Added full-rank and low-rank covariance variational families. (See the docs.)
  • Added the sticking-the-landing control variate. (See the docs.)
  • Added the score gradient estimator of the ELBO with the leave-one-out control variate (also known as VarGrad).
  • Added parameter averaging. (See the docs.)
  • Added parameter-free optimization algorithms. (See the docs.)

Merged pull requests:

  • Minor Touches for ScoreGradELBO (#99) (@Red-Portal)
  • Drop support for pre-1.10 (#129) (@Red-Portal)
  • refactor interface for projections/proximal operators (#147) (@Red-Portal)
  • CompatHelper: bump compat for AdvancedVI to 0.2 for package docs, (keep existing compat) (#153) (@github-actions[bot])

Closed issues:

  • Set up unit tests for GPU support (#70)

v0.2.11

30 Nov 00:59

AdvancedVI v0.2.11

Diff since v0.2.10

Merged pull requests:

  • CompatHelper: bump compat for Bijectors in [weakdeps] to 0.14, (keep existing compat) (#137) (@github-actions[bot])
  • CompatHelper: bump compat for Bijectors to 0.14 for package bench, (keep existing compat) (#138) (@github-actions[bot])
  • CompatHelper: bump compat for Bijectors to 0.14 for package test, (keep existing compat) (#139) (@github-actions[bot])
  • CompatHelper: bump compat for Bijectors to 0.14 for package docs, (keep existing compat) (#140) (@github-actions[bot])
  • bump bijectors on v0.2 (#152) (@Red-Portal)

v0.2.10

29 Oct 18:07
f126a5d

AdvancedVI v0.2.10

Diff since v0.2.9

Merged pull requests:

v0.2.9

24 Oct 18:05
cda1e35

AdvancedVI v0.2.9

Diff since v0.2.8

Merged pull requests:

Closed issues:

  • are there any special purposes of having empty pkg extensions? (#100)