Update to Pytorch 2.3.0 #1498
Conversation
The C++ API can change depending on the platform:

```cpp
#if defined(__aarch64__) && !defined(C10_MOBILE) && !defined(__CUDACC__)
inline Half(float16_t value);
inline operator float16_t() const;
#else
inline C10_HOST_DEVICE Half(float value);
inline C10_HOST_DEVICE operator float() const;
#endif
```

I'm considering adding an explicit JNI bridge that will call either the float or the float16 version depending on the platform, and skip the platform-dependent C++ API. |
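For illustration, a minimal sketch of what such a bridge could look like (hypothetical helper names, not code from this PR): free functions that always expose a float-based signature and dispatch to whichever c10::Half constructor and conversion operator the platform actually declares.

```cpp
// Hypothetical sketch of the "explicit bridge" idea, not the actual patch:
// always expose a float-only entry point and dispatch to the platform's c10::Half API.
#include <c10/util/Half.h>

inline c10::Half half_from_float(float value) {
#if defined(__aarch64__) && !defined(C10_MOBILE) && !defined(__CUDACC__)
  // aarch64 builds only declare Half(float16_t), so cast the input first.
  return c10::Half(static_cast<float16_t>(value));
#else
  return c10::Half(value);
#endif
}

inline float half_to_float(c10::Half h) {
#if defined(__aarch64__) && !defined(C10_MOBILE) && !defined(__CUDACC__)
  // aarch64 builds only declare operator float16_t(), so widen afterwards.
  return static_cast<float>(static_cast<float16_t>(h));
#else
  return static_cast<float>(h);
#endif
}
```

JNI glue generated against these two helpers would then see the same float-based signatures on every platform.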
Just disable float16_t support, it's not portable. |
If Parser is told to parse only the |
I can try to patch the libtorch source to disable the
EDIT: I'll try to just add the float variants on Mac, not remove the float16 variants. That seems safer. |
I assume we can convert from/to float and float16_t just like we can between float and double, so it should work if we add casts. |
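If it helps, here is a tiny standalone demo of that assumption (hypothetical, aarch64-only, assuming float16_t comes from <arm_neon.h>): the conversions work with ordinary casts, much like float/double.

```cpp
// Hypothetical aarch64-only demo: float <-> float16_t via plain casts.
#include <arm_neon.h>  // defines float16_t on aarch64
#include <cstdio>

int main() {
  float f = 1.5f;
  float16_t h = static_cast<float16_t>(f);  // narrowing, like double -> float
  float back = static_cast<float>(h);       // widening back
  std::printf("%f -> %f\n", f, back);
  return 0;
}
```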
pytorch/src/main/resources/org/bytedeco/pytorch/include/platform_unification.h
The x86_64-gpu check is killed, probably due to out-of-memory, when compiling some transformer-related CUDA code. |
Increasing swap space doesn't work? |
I'll try that. |
I added a check for pytorch in |
Like you said, after the build it doesn't matter if the annotation is there or not, so don't worry about it. |
My changes to deploy-ubuntu are not run by the worker. No idea why. |
To use the actions from your fork, you'll need to change the URL in the workflow. |
A swap space of 2 GB works; 4 GB and 0 do not. |
Why not 4 GB? What happens? |
I'm not sure, because the log file of the 4 GB attempt seems truncated. But only 10 GB of disk space remains after the installations and the creation of the swap file, before the build even starts, so I wouldn't be surprised if we ran out of disk space. |
So, it's probably a better idea to figure out how to free some space? |
Better than what? |
2 GB might not be enough for PyTorch 2.4.0, so let's wait and see before merging this, I guess. |
This looks useful: https://github.com/marketplace/actions/free-disk-space-ubuntu |
We can probably just erase a couple of those without the tool though:
Does that mean swap is enabled by default now?? |
Indeed. We can use this freeing action. But why wait for 2.4.0 before merging?
I changed |
Let's not make it an option; let's just set the swap to a value that works for everything, like 4 GB. That means using an action here is annoying, so let's just ditch Android and dotnet instead, as in actions/runner-images#2606 (comment). |
OK for the ditching, but keeping an option to use some disk space for extra swap, depending on the workflow's needs, seems interesting to me. |
Why do you want to make it an option? It's not going to be used by anything. |
Maybe some other builds would fail if we removed 4 GB of disk space for the swap, like the PyTorch build currently does. |
Those options are there because some builds fail when they are true, but some others fail when they are false. That doesn't happen for something like swap space. |
Is dotnet used by any build? |
11 GB saved from ditching Android and dotnet. |
Included in this PR: