ATTSF: Attention! Stay Focus!

Solution for the Defocus Deblurring Challenge

by Tu Vo

Content

Getting Started

  • Clone the repository
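
The clone step can be done with git; the URL below is an assumption based on the author's name and should be checked against the actual repository location:

```shell
# Assumed repository URL -- verify before cloning.
git clone https://github.com/tuvovan/ATTSF.git
cd ATTSF
```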

Prerequisites

  • Tensorflow 2.2.0+
  • Tensorflow_addons
  • Python 3.6+
  • Keras 2.3.0
  • PIL
  • numpy
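
One way to install the prerequisites with pip, pinning the versions named in the list above (PIL is provided by the pillow package):

```shell
# Versions follow the prerequisites list above.
pip install "tensorflow>=2.2.0" tensorflow-addons "keras==2.3.0" pillow numpy
```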

Running

Training

  • Preprocess

  • Train ATTSF

    • change op_phase='train' in config.py, then run:

      python main.py

  • Test ATTSF

    • change op_phase='valid' in config.py, then run:
    python main.py
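
The steps above switch a single op_phase flag between training and validation. A minimal sketch of how such a dispatch could look — the function names and error handling here are assumptions for illustration, not the repository's actual code:

```python
# Hypothetical sketch of an op_phase dispatch (names are assumptions).

def train_model():
    """Placeholder for the training loop."""
    return "train"

def validate_model():
    """Placeholder for the inference/validation pass."""
    return "valid"

def run(op_phase):
    # op_phase would be set in config.py ('train' or 'valid').
    phases = {"train": train_model, "valid": validate_model}
    if op_phase not in phases:
        raise ValueError(f"unknown op_phase: {op_phase!r}")
    return phases[op_phase]()
```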
    

Usage

Training

usage: main.py [-h] [--filter FILTER] [--attention_filter ATTENTION_FILTER]
               [--kernel KERNEL] [--encoder_kernel ENCODER_KERNEL]
               [--decoder_kernel DECODER_KERNEL]
               [--triple_pass_filter TRIPLE_PASS_FILTER] [--num_rrg NUM_RRG]
               [--num_mrb NUM_MRB]

optional arguments:
  -h, --help            show this help message and exit
  --filter FILTER
  --attention_filter ATTENTION_FILTER
  --kernel KERNEL
  --encoder_kernel ENCODER_KERNEL
  --decoder_kernel DECODER_KERNEL
  --triple_pass_filter TRIPLE_PASS_FILTER
  --num_rrg NUM_RRG
  --num_mrb NUM_MRB
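
A parser matching the usage text above can be built with argparse; the types and default values below are assumptions, not the repository's actual defaults:

```python
# Sketch of the argument parser implied by the usage text; defaults and
# types are assumptions for illustration only.
import argparse

def build_parser():
    p = argparse.ArgumentParser()
    p.add_argument('--filter', type=int, default=64)
    p.add_argument('--attention_filter', type=int, default=64)
    p.add_argument('--kernel', type=int, default=3)
    p.add_argument('--encoder_kernel', type=int, default=3)
    p.add_argument('--decoder_kernel', type=int, default=3)
    p.add_argument('--triple_pass_filter', type=int, default=256)
    p.add_argument('--num_rrg', type=int, default=3)
    p.add_argument('--num_mrb', type=int, default=2)
    return p
```

For example, `python main.py --filter 32 --num_rrg 4` would override those two settings and leave the rest at their defaults.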

Testing

  • Download the pretrained weights here and put them in the ModelCheckpoints folder.
  • Note: as part of our research, the weight file has been withheld.

Result

    Left image         |       Right image         |        Output

License

This project is licensed under the MIT License; see the LICENSE file for details.

References

[1] Defocus Deblurring Challenge - NTIRE2021

Citation

@InProceedings{Vo_2021_CVPR,
    author    = {Vo, Tu},
    title     = {Attention! Stay Focus!},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
    month     = {June},
    year      = {2021},
    pages     = {479-486}
}

Acknowledgments

  • This work is heavily based on the code from the challenge host. Thank you for the hard work.