I tested the following command on Google Colab with an A100 (40G) but got an out-of-memory error. May I ask how much memory it needs?

!python test.py --gpt2 channel-metaicl --method direct --out_dir out/gpt2-large --do_zeroshot --use_demonstrations --k 16 --seed 100,13,21,42,87 --dataset glue-wnli_random

Hi there. MetaICL models are 774M-parameter models, so an A100 (40G) should suffice. Try decreasing the default batch size (64) using the flag --test_batch_size (link to the code).
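As a concrete sketch of the suggested fix, the same command can be rerun with an explicit, smaller batch size via `--test_batch_size` (the value 16 here is an illustrative guess, not a verified setting; pick the largest value that fits in memory):

```shell
# Same evaluation as before, but with a reduced test batch size
# (16 is an assumed example; the repo default is 64).
python test.py \
  --gpt2 channel-metaicl \
  --method direct \
  --out_dir out/gpt2-large \
  --do_zeroshot \
  --use_demonstrations \
  --k 16 \
  --seed 100,13,21,42,87 \
  --dataset glue-wnli_random \
  --test_batch_size 16
```

If 16 still triggers OOM, halving again (8, then 4) trades speed for a smaller peak memory footprint without changing results.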