Multi GPU Training #541
Unanswered
thetznecker
asked this question in Q&A
Replies: 0 comments
Is there a wall-clock speed benefit to training with multiple GPUs, or is it just about pooling VRAM for bigger batch sizes?
Or would it be better to train one model on device 0 and another model on device 1?
Also, does anyone have the commands to start multi-GPU training correctly?
We're getting `tensor.detach()` errors and can't find any related information about them.
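For reference, in case it helps anyone landing here: assuming the project trains with PyTorch, the standard way to launch single-node multi-GPU (DistributedDataParallel) training is `torchrun`. The script name `train.py` and the assumption that it calls `torch.distributed.init_process_group()` are illustrative, not this repo's documented CLI:

```shell
# Sketch, assuming a PyTorch training script that initializes
# torch.distributed. torchrun spawns one process per GPU and sets
# RANK / LOCAL_RANK / WORLD_SIZE in each process's environment.

# Single node, 2 GPUs:
torchrun --standalone --nproc_per_node=2 train.py

# Older PyTorch versions (< 1.10) use the launcher module instead:
python -m torch.distributed.launch --nproc_per_node=2 train.py
```

On the `tensor.detach()` error: a common cause is calling `.numpy()` (or feeding a tensor into plain Python/NumPy code) while it still requires grad, in which case the usual fix is `t.detach().cpu().numpy()` — but without the actual traceback that is only a guess.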