Minor
- Added option for residual networks in coupling blocks using `residual=True` in the coupling-layer config.
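For illustration, a hedged sketch of how such a flag might be passed; the changelog only names the `residual` key, so the surrounding dict layout is an assumption and may differ between versions:

```python
# Hypothetical configuration sketch: only the `residual` flag is named
# in the changelog; the dict layout around it is illustrative.
coupling_settings = {
    "residual": True,  # use residual blocks inside each coupling net
}
```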
Major (Breaking)
- Coupling layers were refactored to ensure consistency. Old checkpoints may no longer load.
Minor (Features)
- Added `attention.py` module containing helper networks for building transformers.
- Added `SetTransformer` class in `summary_networks.py` as a viable alternative to `DeepSet` summary networks.
- Added `TimeSeriesTransformer` class in `summary_networks.py` as a viable alternative to `SequentialNetwork` summary networks.
- Added `plot_z_score_contraction()` diagnostic in `diagnostics.py` for gauging global inferential adequacy.
- Added `Orthogonal` in `helper_networks.py` for learnable generalized permutations.
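For illustration, a generic sketch of the posterior-contraction and z-score computations that underlie such a diagnostic, on toy data; the variable names and shapes here are illustrative, not the library's actual API:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy setup: true parameters drawn from a N(0, 1) prior, and posterior
# draws that shrink toward the truth.
num_datasets, num_draws = 500, 200
theta_true = rng.normal(0.0, 1.0, size=num_datasets)
post_draws = theta_true[:, None] + rng.normal(0.0, 0.3, size=(num_datasets, num_draws))

post_mean = post_draws.mean(axis=1)
post_var = post_draws.var(axis=1, ddof=1)

prior_var = 1.0  # variance of the N(0, 1) prior
contraction = 1.0 - post_var / prior_var            # near 1 when informative
z_score = (post_mean - theta_true) / np.sqrt(post_var)

# A well-calibrated, informative posterior shows contraction near 1 and
# z-scores centered at 0; the diagnostic plots one against the other.
```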
Major (Breaking)
1. Coupling layers have been refactored to ensure easy interoperability between spline flows and affine coupling flows.
2. New internal classes and layers have been added. Saving and loading of old models will not work, but the interface remains consistent.
3. Model comparison now works for both hierarchical and non-hierarchical Bayesian models. Classes have been generalized, and their semantics go beyond the `EvidentialNetwork`.
4. Default settings have been changed to reflect recent insights into better hyperparameter settings.
Minor
Features:
1. Added option for `permutation='learnable'` when creating an `InvertibleNetwork`.
2. Added option for `coupling_design in ["affine", "spline", "interleaved"]` when creating an `InvertibleNetwork`.
3. Simplified passing additional settings to the internal networks. For instance, you can now simply do `inference_network = InvertibleNetwork(num_params=20, coupling_net_settings={'mc_dropout': True})` to get a Bayesian neural network.
4. `PMPNetwork` has been added for model comparison according to findings in https://arxiv.org/abs/2301.11873
5. `HierarchicalNetwork` wrapper has been added to act as a summary network for hierarchical Bayesian models according to https://arxiv.org/abs/2301.11873
6. A publication-ready calibration diagnostic for the expected calibration error (ECE) in a model comparison setting has been added to `diagnostics.py` and is accessible as `plot_calibration_curves()`.
7. A new module `experimental` has been added, currently containing `rectifiers.py`.
8. Default settings for transformer-based architectures have been added.
9. Numerical calibration error is now available via `posterior_calibration_error()`.
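For illustration, a generic binned expected-calibration-error computation of the kind such diagnostics build on; this is a standard ECE sketch on toy data, not the library's exact implementation:

```python
import numpy as np

def expected_calibration_error(probs, labels, num_bins=10):
    """Binned ECE: mean |empirical frequency - confidence|, weighted by bin size.

    probs  : (N,) predicted probability for the chosen model
    labels : (N,) 1 if the chosen model is the true one, else 0
    """
    bins = np.linspace(0.0, 1.0, num_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (probs > lo) & (probs <= hi)
        if mask.any():
            conf = probs[mask].mean()   # mean predicted probability in bin
            acc = labels[mask].mean()   # empirical frequency in bin
            ece += mask.mean() * abs(acc - conf)
    return ece

# Toy check with perfectly calibrated predictions: outcomes are drawn
# with exactly the predicted probability, so the ECE is close to 0.
rng = np.random.default_rng(0)
p = rng.uniform(0.05, 0.95, size=20000)
y = (rng.uniform(size=p.size) < p).astype(float)
```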
General Improvements:
1. Improved docstrings and consistent use of keyword arguments vs. configuration dictionaries.
2. Increased focus on transformer-based architectures as summary networks.
3. Figures produced by `diagnostics.py` have been improved and prettified.
4. Added a module `sensitivity.py` for testing the sensitivity of neural approximators to model misspecification.
5. Multiple bugfixes, including a major bug affecting the saving and loading of learnable permutations.
- Bugfix in `SetTransformer` affecting saving and loading when using the version with inducing points.
- Bugfix in `SetTransformer` when using `train_offline` and batches result in unequal shapes.
- Improved documentation with examples.
- Bugfix in `SimulationMemory` affecting the use of empty folders for initializing a `Trainer`.
- Bugfix in `Trainer.train_from_presimulation()` for model comparison tasks.
- Added a classifier two-sample test function `c2st` in `computational_utilities`.
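For illustration, a generic classifier two-sample test on toy data, using leave-one-out k-NN accuracy as the classifier; the library's `c2st` may use a different classifier and interface:

```python
import numpy as np

def c2st_knn(x, y, k=5):
    """Classifier two-sample test via leave-one-out k-NN accuracy.

    Returns ~0.5 when x and y come from the same distribution and
    approaches 1.0 as the two samples become distinguishable.
    """
    data = np.concatenate([x, y])
    labels = np.concatenate([np.zeros(len(x)), np.ones(len(y))])
    # Pairwise squared distances; leave-one-out by masking the diagonal.
    d2 = ((data[:, None, :] - data[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)
    neighbors = np.argsort(d2, axis=1)[:, :k]
    preds = (labels[neighbors].mean(axis=1) > 0.5).astype(float)
    return (preds == labels).mean()

rng = np.random.default_rng(1)
# Same distribution -> accuracy near chance; shifted -> near 1.
same = c2st_knn(rng.normal(size=(300, 2)), rng.normal(size=(300, 2)))
diff = c2st_knn(rng.normal(size=(300, 2)), rng.normal(3.0, 1.0, size=(300, 2)))
```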
1. Added a `bidirectional` flag to `SequentialNetwork` and `TimeSeriesTransformer` for a potential performance improvement.
2. Deprecated the name `SequentialNetwork`; use `SequenceNetwork` instead to avoid confusion with `tf.keras.Sequential`.
3. Changed the default of `SetTransformer` to `use_layer_norm=False` due to superior performance on relevant exchangeable models.
- Fixed a bug that failed to propagate global context variables for model comparison.
- Major revamp of tutorials.
- Update dependencies and continuous integration.