
Understanding predicting multiple masks #72

Open
joshmyersdean opened this issue Oct 7, 2024 · 0 comments


Hello @hanoonaR!

I’m currently extending your model and trying to better understand how multiple masks are handled during training, particularly with respect to multiple [SEG] tokens. I noticed that when batch_size=1, this line appears to calculate only a single mask, even when multiple [SEG] tokens are predicted.

As a result, it seems that `seg_token_offset` would have shape 1×M (where M is the number of predicted [SEG] tokens), and the following loop:

for i in range(len(seg_token_offset) - 1):

would only execute once.

Could you clarify whether this behavior is intended, or whether I’m missing something about how multiple masks should be output in this case?
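For context, here is a minimal sketch of the offset pattern I'm describing (pure Python; the names `seg_counts` and `group_seg_embeddings` are my own illustrations, not the repo's exact identifiers): the cumulative-offset loop iterates once per batch element, so with batch_size=1 the body runs exactly once, yet that single slice still contains all M [SEG] embeddings.

```python
# Hypothetical sketch of the cumulative-offset grouping (not the repo's
# exact code): group predicted [SEG] token embeddings per batch element.
from itertools import accumulate

def group_seg_embeddings(seg_embeddings, seg_counts):
    """seg_embeddings: flat list of per-[SEG] embeddings across the batch;
    seg_counts: number of [SEG] tokens each batch element predicted."""
    # e.g. seg_counts=[3] for batch_size=1 gives offsets [0, 3]
    seg_token_offset = [0] + list(accumulate(seg_counts))
    groups = []
    for i in range(len(seg_token_offset) - 1):  # iterates batch_size times
        groups.append(seg_embeddings[seg_token_offset[i]:seg_token_offset[i + 1]])
    return groups

# batch_size == 1 with three [SEG] tokens: the loop body runs once,
# but the single group holds all three embeddings.
embeds = [[0.1], [0.2], [0.3]]
groups = group_seg_embeddings(embeds, [3])
print(len(groups), len(groups[0]))  # 1 3
```

So, if this matches the intended design, the single loop iteration is expected and the M masks would have to come from the one grouped slice rather than from multiple iterations.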

Thanks in advance for your insights!

Best,
Josh
