Fix issue with setting of param_group for the DPOptimizer wrapper #660

Closed
wants to merge 1 commit

Commits on Aug 1, 2024

  1. Fix issue with setting of param_groups/defaults/state for the DPOptimizer wrapper (pytorch#660)
    
    Summary:
    Pull Request resolved: pytorch#660
    
    Fix for GitHub issue [#649](pytorch#649)
    
    **Background**: DPOptimizer is a wrapper for the original non-DP Optimizer selected by the user. `param_groups`, `state`, and `defaults` are attributes of DPOptimizer that store all parameters related to the learning algorithm, including privacy-related parameters.
    
    **Issue**: Previously, DPOptimizer exposed `param_groups`, `state`, and `defaults` simply as attributes set by reference at construction time. As a result, another object could update `param_groups` on the DPOptimizer while neglecting to update the same parameters on the original Optimizer. The issue surfaces, for example, with a learning rate (LR) scheduler: the learning rate appears to be updated on the DPOptimizer, but it is not actually updated on the original Optimizer (the one that matters). A minimal repro is sketched below.
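    
    The following is a minimal, hypothetical repro of the failure mode described above; `NaiveWrapper` is an illustrative stand-in for the pre-fix behavior, not the actual Opacus implementation. Rebinding `param_groups` on the wrapper changes only the wrapper's attribute, leaving the wrapped optimizer untouched:
    
    ```python
    import torch
    
    model = torch.nn.Linear(2, 1)
    base_optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    
    class NaiveWrapper:
        def __init__(self, optimizer):
            self.original_optimizer = optimizer
            # Attribute is bound once at construction time.
            self.param_groups = optimizer.param_groups
    
    wrapped = NaiveWrapper(base_optimizer)
    
    # Something scheduler-like builds new groups and assigns them back to the wrapper:
    new_groups = [dict(g) for g in wrapped.param_groups]
    new_groups[0]["lr"] = 0.01
    wrapped.param_groups = new_groups
    
    print(wrapped.param_groups[0]["lr"])         # 0.01 -- looks updated
    print(base_optimizer.param_groups[0]["lr"])  # 0.1  -- the optimizer that actually steps is unchanged
    ```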
    
    **Fix**: In this fix, we use the `property` decorator so that the three attributes (`param_groups`, `state`, `defaults`) always remain synchronized between DPOptimizer and the original Optimizer.
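    
    A minimal sketch of the property-based delegation, assuming a simplified wrapper (`DPOptimizerSketch` is a hypothetical name, not the actual Opacus class); reads and writes on the wrapper are forwarded to the wrapped optimizer so the two can never diverge:
    
    ```python
    import torch
    
    class DPOptimizerSketch:
        def __init__(self, optimizer: torch.optim.Optimizer):
            self.original_optimizer = optimizer
    
        @property
        def param_groups(self):
            # Always read straight from the wrapped optimizer, never a stale copy.
            return self.original_optimizer.param_groups
    
        @param_groups.setter
        def param_groups(self, param_groups):
            # Writes (e.g. from an LR scheduler) land on the wrapped optimizer.
            self.original_optimizer.param_groups = param_groups
    
        @property
        def defaults(self):
            return self.original_optimizer.defaults
    
        @defaults.setter
        def defaults(self, defaults):
            self.original_optimizer.defaults = defaults
    
        @property
        def state(self):
            return self.original_optimizer.state
    
        @state.setter
        def state(self, state):
            self.original_optimizer.state = state
    
    # Usage: a change made through the wrapper reaches the wrapped optimizer.
    model = torch.nn.Linear(2, 1)
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    dp_opt = DPOptimizerSketch(opt)
    dp_opt.param_groups[0]["lr"] = 0.05
    assert opt.param_groups[0]["lr"] == 0.05
    ```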
    
    Differential Revision: D60453849
    iden-kalemaj authored and facebook-github-bot committed Aug 1, 2024
    Commit 91c9b1b