Split-Attention Module in PyTorch #66
@zhanghang1989 Thanks for sharing the split-attention module implementation. Can we integrate this with an FPN module instead of the ResNet backbone?
Yes, that should work.
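For concreteness, here is a minimal sketch (not from this repo; `FPNLevel` and its parameters are illustrative) of one FPN merge step. The 3×3 smoothing conv applied after the lateral-plus-top-down sum is a natural drop-in point for a split-attention conv such as the one posted later in this thread:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FPNLevel(nn.Module):
    """One top-down FPN merge step (illustrative)."""
    def __init__(self, in_channels, out_channels=256):
        super().__init__()
        # 1x1 lateral conv to match channel widths
        self.lateral = nn.Conv2d(in_channels, out_channels, 1)
        # 3x3 smoothing conv; this is where a split-attention
        # conv (e.g. SplAtConv2d from the sketch below) could go
        self.smooth = nn.Conv2d(out_channels, out_channels, 3, padding=1)

    def forward(self, c, top_down):
        lat = self.lateral(c)
        # upsample the coarser top-down feature to the lateral's size
        td = F.interpolate(top_down, size=lat.shape[-2:], mode="nearest")
        return self.smooth(lat + td)

# e.g., merging a C4 backbone feature with the P5 top-down feature
level = FPNLevel(in_channels=1024)
c4 = torch.randn(1, 1024, 32, 32)
p5 = torch.randn(1, 256, 16, 16)
print(level(c4, p5).shape)  # torch.Size([1, 256, 32, 32])
```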
@zhanghang1989 Are there any example references you can point to?
@Jerryzcn, you have done something similar, right?
Hi! I have some questions about the `channels` parameter of the `fn1` layer in the `Splat` class. First, what does the `channels` parameter mean? Second, should the input channels of the `fn1` layer be `channels // radix` or `channels`?
Just in case someone wants to use the Split-Attention Module, it is provided here:
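(The original code block appears to have been lost; below is a sketch reconstructed from the ResNeSt paper's description of split attention. Names such as `SplAtConv2d`, `fc1`, `fc2`, and `RadixSoftmax` follow the reference implementation, but details like the `reduction_factor=4` default should be treated as assumptions.) Note that in this sketch `fc1` takes `channels` input channels, not `channels // radix`, because the radix splits are summed before global pooling, which also speaks to the question above.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class RadixSoftmax(nn.Module):
    """Softmax over the radix dimension (sigmoid when radix == 1)."""
    def __init__(self, radix, cardinality):
        super().__init__()
        self.radix = radix
        self.cardinality = cardinality

    def forward(self, x):
        batch = x.size(0)
        if self.radix > 1:
            # group channels as (batch, cardinality, radix, -1),
            # then softmax across the radix splits
            x = x.view(batch, self.cardinality, self.radix, -1).transpose(1, 2)
            x = F.softmax(x, dim=1)
            x = x.reshape(batch, -1)
        else:
            x = torch.sigmoid(x)
        return x


class SplAtConv2d(nn.Module):
    """Split-Attention Conv2d (sketch following the ResNeSt paper)."""
    def __init__(self, in_channels, channels, kernel_size=3, stride=1,
                 padding=1, groups=1, radix=2, reduction_factor=4):
        super().__init__()
        inter_channels = max(channels * radix // reduction_factor, 32)
        self.radix = radix
        self.cardinality = groups
        # grouped conv producing radix * channels feature maps
        self.conv = nn.Conv2d(in_channels, channels * radix, kernel_size,
                              stride, padding, groups=groups * radix, bias=False)
        self.bn0 = nn.BatchNorm2d(channels * radix)
        self.relu = nn.ReLU(inplace=True)
        # fc1 maps the pooled, summed splits (channels) to inter_channels
        self.fc1 = nn.Conv2d(channels, inter_channels, 1, groups=self.cardinality)
        self.bn1 = nn.BatchNorm2d(inter_channels)
        # fc2 produces radix * channels attention logits
        self.fc2 = nn.Conv2d(inter_channels, channels * radix, 1,
                             groups=self.cardinality)
        self.rsoftmax = RadixSoftmax(radix, groups)

    def forward(self, x):
        x = self.relu(self.bn0(self.conv(x)))

        batch, rchannel = x.shape[:2]
        if self.radix > 1:
            # sum the radix splits before global pooling
            splited = torch.split(x, rchannel // self.radix, dim=1)
            gap = sum(splited)
        else:
            gap = x
        gap = F.adaptive_avg_pool2d(gap, 1)
        gap = self.relu(self.bn1(self.fc1(gap)))

        atten = self.fc2(gap)
        atten = self.rsoftmax(atten).view(batch, -1, 1, 1)

        if self.radix > 1:
            # weight each split by its attention map and sum
            attens = torch.split(atten, rchannel // self.radix, dim=1)
            out = sum(att * split for att, split in zip(attens, splited))
        else:
            out = atten * x
        return out.contiguous()
```

A quick shape check under these assumptions:

```python
block = SplAtConv2d(in_channels=64, channels=64, radix=2, groups=1)
x = torch.randn(2, 64, 32, 32)
print(block(x).shape)  # torch.Size([2, 64, 32, 32])
```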