Hello,
The first batch in the first epoch usually doesn't contain samples confident enough to pass the 0.99 threshold for self-labeling. But if I lower the threshold to 0.9, it negatively affects the network's performance (I have observed that most samples have a probability score above 0.99 in the final epochs, yet the accuracy is low).
Do you have any thoughts on progressively increasing the threshold for self-labeling? Or how would you tackle this issue?
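For concreteness, here is a minimal sketch of what such a progressive schedule could look like, assuming a PyTorch-style training loop; `ramp_threshold`, `masked_cross_entropy`, and the 0.90 starting value are illustrative assumptions, not the repository's actual code:

```python
# Sketch of a "progressive threshold" schedule for self-labeling:
# start with a lower confidence cut-off and ramp it up to 0.99 over training.
# All names and values here are illustrative, not part of the SCAN codebase.
import torch
import torch.nn.functional as F


def ramp_threshold(epoch, num_epochs, start=0.90, end=0.99):
    """Linearly interpolate the confidence threshold from `start` to `end`."""
    progress = min(epoch / max(num_epochs - 1, 1), 1.0)
    return start + progress * (end - start)


def masked_cross_entropy(logits, threshold):
    """Self-label only samples whose max softmax probability exceeds `threshold`."""
    probs = F.softmax(logits, dim=1)
    max_prob, pseudo_label = probs.max(dim=1)
    mask = max_prob > threshold
    if mask.sum() == 0:
        # No confident samples in this batch: skip it instead of raising
        # 'Mask in MaskedCrossEntropyLoss is all zeros.'
        return None
    return F.cross_entropy(logits[mask], pseudo_label[mask])


# Example: the threshold grows from 0.90 in epoch 0 to 0.99 in the last epoch.
for epoch in range(100):
    t = ramp_threshold(epoch, num_epochs=100)
    # ... compute `logits` for each batch, then:
    # loss = masked_cross_entropy(logits, t)
    # if loss is not None: loss.backward(); optimizer.step()
```

Whether a linear ramp (versus, say, a fixed warm-up period or a quantile-based cut-off) is appropriate would need to be validated empirically, given the observation above that high-confidence predictions late in training are not necessarily accurate.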
TomasPlachy changed the title from "Overcomming 'Mask in MaskedCrossEntropyLoss is all zeros.' in early self-labeling phase without lowerinf accuracy treshold." to "Overcoming 'Mask in MaskedCrossEntropyLoss is all zeros.' in early self-labeling phase without lowering accuracy threshold." on Nov 1, 2022.
TomasPlachy changed the title to "Overcoming uncertainty after scan phase by dynamically lowering accuracy threshold for self-labeling" on Nov 22, 2022.