OpenNMT-py v0.5.0
Ability to reset the optimizer when using -train_from
-reset_optim = ['none', 'all', 'states', 'keep_states']
none: default behavior (same as before)
all: reset the optimizer entirely; note that steps start at zero again
states: reset only the optimizer states, keeping all other parameters from the checkpoint
keep_states: keep the states from the checkpoint, but allow changing other parameters (learning_rate, for instance)
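As an illustrative sketch of the new option (the data, model, and checkpoint paths here are hypothetical placeholders), resuming training while resetting only the optimizer states might look like:

```shell
# Resume training from a checkpoint, but reset only the optimizer
# states (learning-rate schedule, step count, etc. are kept).
# All paths below are hypothetical.
python train.py \
    -data data/demo \
    -save_model demo-model \
    -train_from demo-model_step_10000.pt \
    -reset_optim states
```

With `-reset_optim none` (the default), the optimizer is restored from the checkpoint exactly as in previous releases.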
Bug fixes.
Tested with PyTorch 1.0RC; works fine.