Releases: zhanghang1989/PyTorch-Encoding
Wasabi URL
Switch the model zoo to Wasabi URLs for hosting the pretrained models.
PyTorch Encoding v1.2.0
ResNeSt
ResNeSt is a Split-Attention Network, a new ResNet variant. It significantly boosts the performance of downstream models such as Mask R-CNN, Cascade R-CNN, and DeepLabV3.
| Model | crop size | PyTorch (top-1 %) | Gluon (top-1 %) |
|---|---|---|---|
| ResNeSt-50 | 224 | 81.03 | 81.04 |
| ResNeSt-101 | 256 | 82.83 | 82.81 |
| ResNeSt-200 | 320 | 83.84 | 83.88 |
| ResNeSt-269 | 416 | 84.54 | 84.53 |
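The following is a minimal, hedged sketch of loading one of these pretrained classifiers through `torch.hub`; the hub repo name `zhanghang1989/ResNeSt` and the `resnest50` entrypoint follow the ResNeSt project and should be checked against your environment.

```python
import torch

# Entry points 'resnest50' / 'resnest101' / 'resnest200' / 'resnest269' are
# published via torch.hub by the ResNeSt project (assumed here).
model = torch.hub.load('zhanghang1989/ResNeSt', 'resnest50', pretrained=True)
model.eval()

# Dummy batch at the 224x224 crop size listed for ResNeSt-50 above.
x = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    logits = model(x)              # (1, 1000) ImageNet class scores
print(logits.argmax(dim=1))
```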
Semantic Segmentation
- PyTorch models and training: Please visit PyTorch Encoding Toolkit.
- Training with Gluon: Please visit GluonCV Toolkit.
Results on ADE20K
| Method | Backbone | pixAcc% | mIoU% |
|---|---|---|---|
| DeepLabV3 | ResNet-50 | 80.39 | 42.1 |
| DeepLabV3 | ResNet-101 | 81.11 | 44.14 |
| DeepLabV3 | ResNeSt-50 (ours) | 81.17 | 45.12 |
| DeepLabV3 | ResNeSt-101 (ours) | 82.07 | 46.91 |
| DeepLabV3 | ResNeSt-200 (ours) | 82.45 | 48.36 |
| DeepLabV3 | ResNeSt-269 (ours) | 82.62 | 47.60 |
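As an illustration of how these ADE20K models can be used, here is a hedged inference sketch against the PyTorch Encoding model zoo; the zoo key `'DeepLab_ResNeSt50_ADE'` is an assumption and should be verified against the names accepted by `encoding.models.get_model` in your installed version.

```python
import torch
import encoding

# Assumed model-zoo key; check encoding.models for the exact name in your version.
model = encoding.models.get_model('DeepLab_ResNeSt50_ADE', pretrained=True)
model.eval()

# Dummy image batch; real use would normalize an RGB image with ImageNet statistics.
x = torch.randn(1, 3, 480, 480)
with torch.no_grad():
    outputs = model(x)
# The segmentation models typically return a tuple (main head, auxiliary head);
# keep only the primary logits. Adjust if your version returns a single tensor.
logits = outputs[0] if isinstance(outputs, (tuple, list)) else outputs
pred = logits.argmax(dim=1)        # (1, H, W) ADE20K class indices
```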
PyTorch Encoding v1.0.1
- Compatible with stable PyTorch 1.4.0.
PyTorch Encoding v1.0.0
- Compatible with stable PyTorch 1.0.
- Inplace ABN with SyncBN (a usage sketch follows below).
- ImageNet training script.
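A minimal sketch of dropping the synchronized BatchNorm into a model for multi-GPU training, assuming the `encoding.nn.SyncBatchNorm` and `encoding.parallel.DataParallelModel` APIs; exact constructor arguments may differ between releases.

```python
import torch
import torch.nn as nn
import encoding

class SmallNet(nn.Module):
    def __init__(self, num_classes=1000):
        super().__init__()
        self.conv = nn.Conv2d(3, 64, kernel_size=3, padding=1)
        # Synchronized BatchNorm: statistics are reduced across all GPUs
        # instead of being computed per device.
        self.bn = encoding.nn.SyncBatchNorm(64)
        self.relu = nn.ReLU(inplace=True)
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(64, num_classes)

    def forward(self, x):
        x = self.pool(self.relu(self.bn(self.conv(x))))
        return self.fc(torch.flatten(x, 1))

net = SmallNet()
if torch.cuda.is_available():
    # Replicate across GPUs so SyncBatchNorm has peers to synchronize with.
    net = encoding.parallel.DataParallelModel(net.cuda())
```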
PyTorch Encoding 0.4.3
- Segmentation examples and pretrained models.
- PyPI install via `pip install torch-encoding`
Encoding Layer & Sync BN
Make compilation easier for users.
Encoding Layer
Memory-efficient implementation of the Encoding Layer.
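A short, hedged usage sketch of the Encoding Layer, assuming the documented `encoding.nn.Encoding(D, K)` constructor and its accepted `(B, D, H, W)` input shape; verify both against your installed version.

```python
import torch
import encoding

B, D, H, W = 2, 128, 16, 16
# Aggregates dense features against K learned codewords, producing a
# fixed-size (K x D) encoding per image regardless of spatial size.
layer = encoding.nn.Encoding(D=D, K=32)

# A (B, D, H, W) feature map, e.g. the output of a CNN backbone.
features = torch.randn(B, D, H, W)
out = layer(features)
print(out.shape)   # expected: torch.Size([2, 32, 128]) -> (B, K, D)
```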