
Support for lecun normal weight initialization #2311

Open
wants to merge 1 commit into master

Conversation

@RohitRathore1 commented Aug 11, 2023

feat: Add LeCun normal weight initialization

  • Implemented a lecun_normal function that initializes weights from a normal distribution scaled by the layer's fan-in.
  • Provided a default-RNG method so the initializer can be used without explicitly passing an RNG.
  • Marked lecun_normal as non-differentiable using ChainRulesCore.

Resolves #2290.

[Edit: closes #2290, closes #2491, closes #2299]
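For reference, a minimal sketch of what such an initializer looks like, assuming plain N(0, 1/fan_in) draws and the Flux convention that a Dense-style weight of size (out, in) has fan_in equal to its last dimension. The helper name lecun_normal_sketch is hypothetical, and Flux's actual implementation may differ (for example, by sampling from a truncated normal):

```julia
using Random

# Hypothetical sketch, not Flux's exact code: LeCun normal initialization
# draws weights from N(0, 1/fan_in). For a Dense-style weight matrix of
# size (out, in), fan_in is the last dimension (the number of inputs).
function lecun_normal_sketch(rng::AbstractRNG, dims::Integer...)
    fan_in = dims[end]
    stddev = sqrt(1 / fan_in)
    return randn(rng, Float32, dims...) .* Float32(stddev)
end

# Default-RNG method, so the initializer can be used where no RNG is passed.
lecun_normal_sketch(dims::Integer...) = lecun_normal_sketch(Random.default_rng(), dims...)

# In Flux, the initializer would additionally be marked non-differentiable, e.g.
#   ChainRulesCore.@non_differentiable lecun_normal_sketch(::Any...)
```

For example, lecun_normal_sketch(64, 256) returns a 64×256 Float32 matrix whose entries have standard deviation close to 1/sqrt(256) = 0.0625.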

PR Checklist

  • Tests are added
  • Entry in NEWS.md
  • Documentation, if applicable

codecov bot commented Oct 10, 2024

Codecov Report

Attention: Patch coverage is 0% with 5 lines in your changes missing coverage. Please review.

Project coverage is 77.42%. Comparing base (1348828) to head (56f36c2).
Report is 160 commits behind head on master.

Files with missing lines | Patch % | Lines
src/utils.jl             | 0.00%   | 5 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master    #2311      +/-   ##
==========================================
- Coverage   79.47%   77.42%   -2.05%     
==========================================
  Files          31       31              
  Lines        1749     1750       +1     
==========================================
- Hits         1390     1355      -35     
- Misses        359      395      +36     


Successfully merging this pull request may close these issues.

Flux has no Lecun Normalization weight init function? Add support for lecun normal weight initialization