Changes w.r.t. Version 0.7.1:
Baseline algorithms:
- Crop very large cases for nnU-Net and nnDetection training
- Allow skipping validation inference
- More flexible UNet training
- Docker container for UNet training
- Combine labels in workdir
- More flexible splits specification (see the sketch after this list)
- Bugfix for additional training arguments
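As a hedged illustration of the more flexible splits specification, below is a minimal sketch of what a custom per-fold splits file could look like; the file name `splits.json`, the dictionary layout, and the subject IDs are illustrative assumptions, not necessarily the project's actual format:

```python
import json

# Hypothetical per-fold split specification; subject IDs are illustrative.
splits = [
    {
        "train": ["case_0000", "case_0001", "case_0002"],
        "val": ["case_0003"],
    },
    {
        "train": ["case_0000", "case_0002", "case_0003"],
        "val": ["case_0001"],
    },
]

# Write the splits to the working directory so training can pick them up.
with open("splits.json", "w") as fp:
    json.dump(splits, fp, indent=2)
```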
SageMaker:
- Restructure training on SageMaker (#32):
- Split preprocessing and training into separate jobs (see the sketch after this list)
- Add bash scripts to test preprocessing and training locally
- SageMaker training pipeline for semi-supervised nnU-Net
- Additional SageMaker documentation
- SageMaker training update
- Preprocessing cleanup
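As a hedged illustration of running preprocessing and training as separate SageMaker jobs, below is a minimal sketch using the SageMaker Python SDK; the container image URIs, IAM role, S3 paths, entry-point script, and instance types are placeholders and not the project's actual configuration:

```python
from sagemaker.processing import ScriptProcessor, ProcessingInput, ProcessingOutput
from sagemaker.estimator import Estimator

role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # placeholder role

# Preprocessing job: runs a standalone script inside a custom container.
processor = ScriptProcessor(
    image_uri="<account>.dkr.ecr.<region>.amazonaws.com/preprocess:latest",  # placeholder
    command=["python3"],
    role=role,
    instance_type="ml.m5.xlarge",
    instance_count=1,
)
processor.run(
    code="preprocess.py",  # placeholder entry point
    inputs=[ProcessingInput(source="s3://bucket/raw/", destination="/opt/ml/processing/input")],
    outputs=[ProcessingOutput(source="/opt/ml/processing/output", destination="s3://bucket/preprocessed/")],
)

# Training job: consumes the preprocessed data from S3.
estimator = Estimator(
    image_uri="<account>.dkr.ecr.<region>.amazonaws.com/train:latest",  # placeholder
    role=role,
    instance_type="ml.p3.2xlarge",
    instance_count=1,
)
estimator.fit({"train": "s3://bucket/preprocessed/"})
```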
Preprocessing:
- Preprocessing flexibility (#34):
- Convert preprocessing scripts (for supervised and semi-supervised learning) to importable functions.
- Allow preprocessing of the data via the command line interface (see the sketch after this list).
- Allow any split when resampling annotations (#37)
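As a hedged illustration of the importable-function plus command-line pattern described above, below is a minimal self-contained sketch; the function name `preprocess`, its arguments, and the CLI flags are illustrative assumptions, not the package's actual interface:

```python
import argparse
from pathlib import Path


def preprocess(input_dir: Path, output_dir: Path) -> None:
    """Illustrative preprocessing entry point, importable from other code."""
    output_dir.mkdir(parents=True, exist_ok=True)
    for case in sorted(input_dir.glob("*.nii.gz")):
        # ... resample/convert `case` and save the result to `output_dir` ...
        print(f"preprocessing {case.name} -> {output_dir}")


def main() -> None:
    # Thin CLI wrapper around the importable function.
    parser = argparse.ArgumentParser(description="Run preprocessing from the command line.")
    parser.add_argument("--input", type=Path, required=True)
    parser.add_argument("--output", type=Path, required=True)
    args = parser.parse_args()
    preprocess(args.input, args.output)


if __name__ == "__main__":
    main()
```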