My code for learning attention mechanisms
Apply spatial attention to the CIFAR-100 dataset
Train the model:
$ python cnn-with-attention.py --train
Visualize attention map:
$ python cnn-with-attention.py --visualize
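For reference, below is a minimal sketch of a spatial-attention block for a CNN. It assumes PyTorch, and the class name `SpatialAttention` is illustrative; the actual layer in cnn-with-attention.py may be structured differently.

```python
# Minimal sketch of spatial attention for a CNN (PyTorch assumed; names are illustrative).
import torch
import torch.nn as nn

class SpatialAttention(nn.Module):
    """Produce an HxW attention map and reweight the feature map with it."""
    def __init__(self, in_channels):
        super().__init__()
        # 1x1 conv squeezes the channels down to one attention score per spatial location
        self.score = nn.Conv2d(in_channels, 1, kernel_size=1)

    def forward(self, x):                     # x: (B, C, H, W)
        scores = self.score(x)                # (B, 1, H, W)
        # normalize scores over all H*W positions so the map sums to 1
        attn = torch.softmax(scores.flatten(2), dim=-1).view_as(scores)
        return x * attn, attn                 # reweighted features + map for visualization

# Usage: insert after a conv stage and keep the returned map for plotting.
feats = torch.randn(8, 64, 8, 8)              # e.g. CIFAR-100 features after downsampling
block = SpatialAttention(64)
out, attn_map = block(feats)                  # attn_map: (8, 1, 8, 8)
```

The returned `attn_map` is what the `--visualize` mode would typically upsample and overlay on the input image.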
Apply temporal attention to sequential data, e.g. a sequence of length 20 whose output depends only on the 5th and 13th positions.
Train the model:
$ python rnn-with-attention.py --train
Visualize attention map:
$ python rnn-with-attention.py --visualize
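Below is a minimal sketch of additive temporal attention over RNN outputs. PyTorch and the scoring function (a single linear layer) are assumptions; rnn-with-attention.py may score time steps differently.

```python
# Minimal sketch of temporal attention over RNN outputs (PyTorch assumed; names are illustrative).
import torch
import torch.nn as nn

class TemporalAttention(nn.Module):
    """Score each time step, softmax over time, and return a weighted sum."""
    def __init__(self, hidden_size):
        super().__init__()
        self.score = nn.Linear(hidden_size, 1)   # one scalar score per time step

    def forward(self, rnn_out):                  # rnn_out: (B, T, H)
        scores = self.score(rnn_out).squeeze(-1)                 # (B, T)
        weights = torch.softmax(scores, dim=1)                   # attention over T positions
        context = torch.bmm(weights.unsqueeze(1), rnn_out).squeeze(1)  # (B, H)
        return context, weights                  # weights are the attention map to visualize

# Usage on the toy task: for a length-20 sequence where only steps 5 and 13 matter,
# the learned weights should concentrate around those two positions.
gru = nn.GRU(input_size=10, hidden_size=32, batch_first=True)
attn = TemporalAttention(32)
x = torch.randn(4, 20, 10)
out, _ = gru(x)
context, weights = attn(out)                     # weights: (4, 20)
```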
Included models:
- CNN+attention
- RNN+attention