Update requirements.txt: latest release from ctranslate2 from 10-22-2024 breaks faster-whisper #1082
Conversation
The latest release of ctranslate2 (10-22-2024) breaks faster-whisper. See the release history here: https://pypi.org/project/ctranslate2/4.4.0/#history
Thank you for your contribution. faster-whisper can use ctranslate2==4.5.0, so there is no reason to bound the version; the problem is caused by a mismatched cuDNN version pulled in by PyTorch.
I might have been too quick to close this PR; the problem is more complicated than I expected.
Here is the compatibility issue... resolving dependency issues is a pain, but here's my attempt:

Torch Versions and Supported CUDA (table)
cuDNN Versions and Supported CUDA (table)

Based on the foregoing...
Workaround: unless/until faster-whisper fine-tunes the install procedure, I've outlined a workaround here. However, it requires you to pip install the libraries using the ... In other words:
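One common shape for such a workaround (the exact packages and versions from the linked comment are not reproduced in this thread, so the pins below are illustrative assumptions) is to pin the last known-good ctranslate2 and install a matching cuDNN wheel explicitly:

```shell
# Pin ctranslate2 below 4.5.0, the release that moved to cuDNN 9
# (assumption based on this thread), instead of letting pip resolve to latest:
pip install "ctranslate2>=4.0,<4.5"

# If PyTorch pulled in a mismatched cuDNN, install the cuDNN wheel explicitly.
# The package name below is for CUDA 12 builds; pick the one matching your
# CUDA major version:
pip install nvidia-cudnn-cu12
```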
BTW, if anyone wants to know how messed up it can sometimes get... here's my table for xformers Versions and Required Torch Versions (table)
To follow up... The same holds true for ... For example, if ...
As this is not going to be solvable, workarounds are in #1086. Thanks @BBC-Esq and @kristopher-smith
Thanks for looking into this so promptly, everybody. May I suggest restricting the range to only include current or previous releases of other libraries in the requirements.txt file? Allowing ctranslate2 up to version 5 means that whenever there is an update to that library, faster-whisper will automatically pull in the latest. The range could be extended after the latest versions of the dependencies have been tested properly.
The reason we can't do that is that the problem is not in faster-whisper; it's an incompatibility between ctranslate2 and the user's specific CUDA installation, so if we limit it to ...
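Since the project can't know the user's CUDA stack at packaging time, one option is to detect the mismatch at runtime instead. A minimal sketch (not faster-whisper's actual code; the 4.4.0 cutoff is an assumption from this thread) that warns when the installed ctranslate2 implies cuDNN 9:

```python
from importlib.metadata import PackageNotFoundError, version

# Assumption from this thread: 4.4.0 is the last ctranslate2 release built
# against cuDNN 8.x; later releases require cuDNN 9.
KNOWN_GOOD_MAX = (4, 4, 0)

def needs_cudnn9(ver: str) -> bool:
    """Return True if this ctranslate2 version string implies cuDNN 9."""
    parts = tuple(int(p) for p in ver.split(".")[:3])
    return parts > KNOWN_GOOD_MAX

def installed_ctranslate2_needs_cudnn9() -> bool:
    """Check the installed ctranslate2 distribution, if any."""
    try:
        return needs_cudnn9(version("ctranslate2"))
    except PackageNotFoundError:
        return False

if installed_ctranslate2_needs_cudnn9():
    print("ctranslate2 > 4.4.0 detected: make sure cuDNN 9 is available.")
```

A check like this surfaces the CUDA/cuDNN mismatch as a readable warning instead of an opaque crash at inference time.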
Good point. Could this be handled in the setup.py file, maybe? I am happy to take a crack at a pull request which handles your library compatibility matrix during install.
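A compatibility-matrix approach could look roughly like the sketch below. All of the tags and version ranges here are hypothetical placeholders, not tested values; the real matrix would come from the tables discussed earlier in the thread:

```python
# Hypothetical matrix: map a torch CUDA build tag to the newest ctranslate2
# specifier known to work with it. Values are illustrative assumptions.
COMPAT_MATRIX = {
    "cu118": "ctranslate2>=4.0,<4.5",
    "cu121": "ctranslate2>=4.0,<4.5",
    "cu124": "ctranslate2>=4.0,<5",
}

# Unknown CUDA builds fall back to the conservative upper bound.
CONSERVATIVE_PIN = "ctranslate2>=4.0,<4.5"

def ctranslate2_requirement(cuda_tag: str) -> str:
    """Pick the ctranslate2 specifier for a given torch CUDA build tag."""
    return COMPAT_MATRIX.get(cuda_tag, CONSERVATIVE_PIN)
```

The trade-off is that setup.py-time detection only sees the environment at install time; if the user later upgrades torch or CUDA, the pin can silently become stale.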
I have edited the requirements.txt file to remedy this and limited the range to >=4.0,<=4.4.0 (4.4.0 being the latest working version of ctranslate2 with faster-whisper) instead of >=4.0,<5.
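Concretely, the proposed change amounts to a one-line specifier in requirements.txt (upper bound per the discussion above):

```
ctranslate2>=4.0,<=4.4.0
```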