Fixing the corner case when the optimizer has no trainable parameters (#619)

Summary:
Pull Request resolved: #619

We made the following changes:
1. We have fixed the corner case when the optimizer has no trainable parameters. This can happen when there is more than one optimizer and some of them are frozen during fine-tuning.
2. We have changed the "closure" logic in the "step" function in "ddpoptimizer.py" to make it consistent with "optimizer.py".

Differential Revision: D53055273

fbshipit-source-id: 4e8e1e6184f1c9d380da862f585bdad2d6c2bf55
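For context, a minimal sketch of what the two changes amount to, assuming a DPOptimizer-style wrapper around a regular torch optimizer. The class name, the `pre_step` hook, and the empty-parameter guard shown here are illustrative stand-ins, not the actual Opacus implementation:

```python
from typing import Callable, Optional

import torch


class DPOptimizerSketch:
    """Illustrative wrapper around a torch optimizer (not the real Opacus class)."""

    def __init__(self, original_optimizer: torch.optim.Optimizer):
        self.original_optimizer = original_optimizer
        # Trainable parameters may be empty, e.g. when this optimizer's
        # parameter group is frozen during fine-tuning.
        self.params = [
            p
            for group in original_optimizer.param_groups
            for p in group["params"]
            if p.requires_grad
        ]

    def pre_step(self) -> bool:
        # Change 1 (sketched): with no trainable parameters there is nothing
        # to clip or noise, so skip the underlying step entirely.
        if len(self.params) == 0:
            return False
        # ... per-sample gradient clipping and noise addition would go here ...
        return True

    def step(self, closure: Optional[Callable[[], float]] = None) -> Optional[float]:
        # Change 2 (sketched): evaluate the closure up front under enable_grad(),
        # the same way the non-distributed optimizer's step does, so the
        # distributed and non-distributed code paths stay consistent.
        if closure is not None:
            with torch.enable_grad():
                closure()

        if self.pre_step():
            return self.original_optimizer.step()
        return None
```

Under these assumptions, an optimizer whose parameter group is fully frozen simply skips its step instead of failing, and both step paths handle the closure the same way.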