
Add clipped relu to deepspeech2 #20

Open · wants to merge 2 commits into master

Conversation

delta2323 (Owner) commented Nov 8, 2016

  • Add the missing clipped ReLU after the convolution layers and the bidirectional RNNs
  • Concatenate the forward and reverse RNN outputs instead of adding them (see the sketch below)
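To make both changes concrete, here is a minimal Chainer sketch (not the PR's actual diff); the layer sizes and the clipping threshold z=20 are illustrative assumptions:

```python
import chainer
import chainer.functions as F
import chainer.links as L


class ConvBlock(chainer.Chain):
    """2-D convolution followed by the clipped ReLU min(max(x, 0), z)."""

    def __init__(self, in_channels, out_channels, ksize, stride):
        super().__init__()
        with self.init_scope():
            self.conv = L.Convolution2D(in_channels, out_channels,
                                        ksize=ksize, stride=stride)

    def __call__(self, x):
        # The clipped ReLU was previously missing after the convolution.
        return F.clipped_relu(self.conv(x), z=20.0)


class BiRNNBlock(chainer.Chain):
    """Bidirectional recurrent layer whose directions are concatenated."""

    def __init__(self, in_size, hidden_size):
        super().__init__()
        with self.init_scope():
            self.forward_rnn = L.LSTM(in_size, hidden_size)
            self.reverse_rnn = L.LSTM(in_size, hidden_size)

    def __call__(self, xs):
        # xs: list of (batch, in_size) arrays, one entry per time step.
        self.forward_rnn.reset_state()
        self.reverse_rnn.reset_state()
        hs_f = [self.forward_rnn(x) for x in xs]
        hs_r = [self.reverse_rnn(x) for x in reversed(xs)]
        hs_r.reverse()
        # Concatenate the two directions along the feature axis instead of
        # summing them, then apply the clipped ReLU.
        return [F.clipped_relu(F.concat((hf, hr), axis=1), z=20.0)
                for hf, hr in zip(hs_f, hs_r)]
```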

Currently there remains (at least) one difference from the torch implementation: BatchNormalization should also be applied inside the RNN units (Reference). Doing that requires modifying the LSTM link and the GRU link, so these changes would have to go into Chainer itself.
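As a rough illustration of why the links themselves need changing (hypothetical code, not a proposed patch): the normalization has to sit between the input projection and the recurrent update, a point the stock links do not expose. Shown here on a plain recurrent unit for brevity, with batch normalization applied only to the input-to-hidden term:

```python
import chainer
import chainer.functions as F
import chainer.links as L


class BatchNormRNN(chainer.Chain):
    """h_t = clipped_relu(BN(W x_t) + U h_{t-1})."""

    def __init__(self, in_size, hidden_size):
        super().__init__()
        with self.init_scope():
            self.w = L.Linear(in_size, hidden_size, nobias=True)
            self.u = L.Linear(hidden_size, hidden_size)
            self.bn = L.BatchNormalization(hidden_size)

    def __call__(self, xs):
        # xs: list of (batch, in_size) arrays, one entry per time step.
        h = None
        hs = []
        for x in xs:
            # Normalize only the input-to-hidden term; the recurrent
            # term U h_{t-1} is left untouched.
            pre = self.bn(self.w(x))
            if h is not None:
                pre = pre + self.u(h)
            h = F.clipped_relu(pre, z=20.0)
            hs.append(h)
        return hs
```

Doing the same inside L.LSTM or L.GRU means inserting the normalization between their input projection and their gate nonlinearities, which is why the change belongs in Chainer rather than in this repository.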
