This repository has been archived by the owner on Dec 18, 2023. It is now read-only.

Releases: facebookresearch/beanmachine

v0.2.0

06 Sep 22:05

Full Changelog: v0.1.2...v0.2.0

New Features

  • Graduated VI from experimental (#1609)
    • Added ADVI and MAP as supported variational inference methods
  • Graduated NNC from experimental (#1618)
    • Uses functorch’s AOT compiler by default for HMC and NUTS
    • If you are working with a non-static model or encounter unexpected errors, you may need to manually disable the nnc_compile flag (see the sketch after this list)
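
  A minimal sketch of disabling NNC for a single run, assuming the nnc_compile keyword on the GlobalNoUTurnSampler constructor described above; the toy model and data are illustrative only:

    import beanmachine.ppl as bm
    import torch
    import torch.distributions as dist

    @bm.random_variable
    def mu():
        return dist.Normal(0.0, 1.0)

    @bm.random_variable
    def x(i):
        return dist.Normal(mu(), 1.0)

    # Turn off the functorch AOT compiler for this run (useful for non-static
    # models or when NNC raises unexpected errors).
    samples = bm.GlobalNoUTurnSampler(nnc_compile=False).infer(
        queries=[mu()],
        observations={x(0): torch.tensor(0.5)},
        num_samples=500,
        num_chains=2,
    )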

Changes

  • VerboseLevel in infer is deprecated. Users should use the new boolean argument show_progress_bar to control whether the tqdm progress bar is displayed (#1603); see the sketch below
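
  For example, reusing the toy model from the sketch above, the progress bar can be suppressed directly in the infer call (a sketch; the keyword name follows the bullet above):

    # show_progress_bar replaces the deprecated VerboseLevel argument.
    samples = bm.GlobalNoUTurnSampler().infer(
        queries=[mu()],
        observations={x(0): torch.tensor(0.5)},
        num_samples=500,
        num_chains=2,
        show_progress_bar=False,  # suppress the tqdm progress bar
    )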

Fixes

  • HMC/NUTS now throws an exception when the step size becomes zero (#1606)
  • Random variables now warn users against passing torch tensors as arguments (#1639)

Documentation

  • Added VI static documentation (#1613)
  • Added NNC static documentation (#1619)
  • Added VI PPCA tutorial (#1617)
  • Added VI tutorial demonstrating ADVI against Gaussian (perfect) and Gamma (approximation gap) targets (#1621)
  • Added VI tutorial replicating the Tensorflow probability GLMM tutorial (#1622)
  • Added VI tutorial demonstrating MAP on Bayesian linear regression and how it coincides with Tikhonov regularization (with a Gaussian prior) and LASSO (with a Laplace prior) (#1623)

v0.1.2

06 Jul 23:18

Full Changelog: v0.1.1...v0.1.2

New Features

  • Supports accelerated inference for HMC and NUTS with functorch’s Neural Network Compiler (NNC), which can be controlled by setting the nnc_compile flag when initializing an inference method (#1385) (Docs)
  • Supports parallel sampling when the number of chains is greater than 1, which can be controlled by setting the run_in_parallel flag when calling infer (#1369); see the sketch after this list
  • Added progress bar to BMGInference (#1321)
  • The MonteCarloSamples object returned from inference now contains the log likelihoods and the observations (#1269)
  • Reworked bm.simulate, which now also accepts a dictionary of posterior samples as input (#1474)
  • Binary wheels for M1 Apple Silicon and Python 3.10 are included in the release (#1419, #1507)
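
  A sketch of how the new flags might be combined on a toy normal-normal model; the nnc_compile and run_in_parallel keywords are as described in the bullets above, and the model itself is illustrative only:

    import beanmachine.ppl as bm
    import torch
    import torch.distributions as dist

    @bm.random_variable
    def theta():
        return dist.Normal(0.0, 1.0)

    @bm.random_variable
    def y(i):
        return dist.Normal(theta(), 1.0)

    observations = {y(i): torch.tensor(v) for i, v in enumerate([0.3, -0.1, 0.7])}

    # NNC-accelerated NUTS, drawing the four chains in parallel.
    samples = bm.GlobalNoUTurnSampler(nnc_compile=True).infer(
        queries=[theta()],
        observations=observations,
        num_samples=1000,
        num_chains=4,
        run_in_parallel=True,
    )
    # The returned MonteCarloSamples object now also carries the log
    # likelihoods and the observations used for inference.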

Changes

  • The default number of adaptive samples is now algorithm-specific. For most algorithms the default is still 0, but for HMC and NUTS it is now half the number of samples (i.e. num_samples // 2) (#1353); see the sketch after this list
  • In CompositionalInference, the default algorithm for continuous latent variables has changed to NUTS (GlobalNoUTurnSampler) (#1407)
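
  Continuing the sketch above, both defaults can be overridden explicitly; the num_adaptive_samples keyword name and the dict form of the CompositionalInference constructor are assumptions based on the bullets above:

    # Restore the old HMC/NUTS behavior of running no adaptive samples.
    samples = bm.GlobalNoUTurnSampler().infer(
        queries=[theta()],
        observations=observations,
        num_samples=1000,
        num_chains=4,
        num_adaptive_samples=0,
    )

    # Override the new NUTS default for a particular random variable family.
    compositional = bm.CompositionalInference(
        {theta: bm.SingleSiteAncestralMetropolisHastings()}
    )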

Fixes

  • Resolved deprecation warnings to support PyTorch 1.11 (#1378) (Note: PyTorch 1.12 is also supported now)

Documentation

  • Added a Bayesian structural time series tutorial (#1376) (link to tutorial)
  • Used the experimental NNC compile feature in supported tutorials (#1408)
  • Added MiniBM, a minimal and standalone implementation of Bean Machine in around a hundred lines of code (excluding comments) (#1415) (minibm.py)

v0.1.1

28 Jan 01:31

Full Changelog: v0.1.0...v0.1.1

Highlights

  • Bean Machine now supports Python 3.9 (#1302)
  • Added the missing C++ header files to the source distribution on PyPI (#1309)
  • In case of an invalid initialization, Bean Machine will attempt to re-initialize the inference and throw a ValueError if the model is misspecified (#1313)

v0.1.0.post1

13 Dec 07:01

Full Changelog: v0.1.0...v0.1.0.post1

Highlights

  • Fixed the order of samples returned from MonteCarloSamples.get_variables (#1253)

v0.1.0: Initial release

11 Dec 05:28
Initial commit
