
limit crop_to_factor #5

Draft: abred wants to merge 3 commits into master
Conversation

@abred commented Dec 16, 2020

To avoid stitching artifacts it is sufficient to apply crop_to_factor only on the last/highest level.
It is not necessary to crop during training.
However, during training the output size then has to be strictly larger than the crop factor (prod(downsample_factors)).
(I don't have a PyTorch setup at hand right now.)
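A minimal sketch of that constraint (hypothetical helper names, not the PR's actual code): the output is cropped to the largest multiple of prod(downsample_factors), which is only possible if the output size is strictly larger than that product.

```python
import numpy as np

def crop_factor(downsample_factors):
    # per-dimension product of all level factors,
    # e.g. [(2, 2), (2, 2), (3, 3)] -> (12, 12)
    return tuple(int(f) for f in np.prod(downsample_factors, axis=0))

def crop_to_factor(output_size, downsample_factors):
    factor = crop_factor(downsample_factors)
    assert all(s > f for s, f in zip(output_size, factor)), (
        "output size must be strictly larger than "
        "prod(downsample_factors) to allow a valid crop")
    # largest multiple of the factor that fits into the output
    return tuple((s // f) * f for s, f in zip(output_size, factor))

# example: with factors (2, 2), (2, 2), (3, 3), an output of
# (100, 100) is cropped to (96, 96) = (8*12, 8*12)
print(crop_to_factor((100, 100), [(2, 2), (2, 2), (3, 3)]))
```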

Commits:
- to ensure translation equivariance it is sufficient to crop on the last/highest level
- to avoid stitching artifacts it is not necessary to crop during training
- to avoid tile-and-stitch inconsistencies, the output size during training has to be strictly larger than prod(downsample_factors)
@abred marked this pull request as draft December 16, 2020 22:42
@abred (Author) commented Dec 17, 2020

OK, cropping only on the last level does not seem to be the best approach, but neither is cropping at every level (the final output size is then smaller than necessary in a number of cases).
One way seems to be to compute, at the bottleneck, the 'naive' max output size (without cropping) and the max 'correct' output size (the largest multiple of prod(downsample_factors) that is smaller or equal). Then check, at each upsampling level, whether the difference between those two, divided by the product of the remaining downsample_factors, is >= 1. If yes, crop that amount at this level and update the difference value (diff = diff - (diff // prod) * prod); see the sketch below.
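A minimal sketch of that scheme (all names are hypothetical, not the PR's code), assuming the upsampling path applies the downsample factors in reverse order, deepest level first:

```python
import numpy as np

def plan_crops(naive_size, downsample_factors):
    # compute, at the bottleneck, the naive max output size and the max
    # 'correct' output size, then distribute the difference as crops
    # over the upsampling levels
    naive = np.asarray(naive_size)
    total = np.prod(downsample_factors, axis=0)

    # largest multiple of prod(downsample_factors) <= naive output size
    correct = (naive // total) * total
    diff = naive - correct  # remaining difference, in final-output voxels

    # upsampling is applied deepest-first, i.e. reversed downsample order
    up_factors = list(reversed(downsample_factors))
    crops = []
    for level in range(len(up_factors)):
        # product of the downsample factors of the levels still to come
        if level + 1 < len(up_factors):
            remaining = np.prod(up_factors[level + 1:], axis=0)
        else:
            remaining = np.ones_like(naive)
        # crop here only where diff // remaining >= 1, i.e. where the
        # remaining difference covers a full step at this level's resolution
        crop_here = diff // remaining
        crops.append(tuple(int(c) for c in crop_here))
        # diff = diff - (diff // prod) * prod
        diff = diff - crop_here * remaining

    return tuple(int(s) for s in correct), crops

# example: a naive output of (100, 100) with factors
# (2, 2), (2, 2), (3, 3) becomes (96, 96), with 1 voxel cropped per
# dimension right after the deepest upsampling level (worth 4 final voxels)
print(plan_crops((100, 100), [(2, 2), (2, 2), (3, 3)]))
```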
