
Update requirements.txt: latest release from ctranslate2 from 10-22-2024 breaks faster-whisper #1082

Closed

Conversation

kristopher-smith

@kristopher-smith kristopher-smith commented Oct 23, 2024

The latest ctranslate2 release, from 10-22-2024, breaks faster-whisper. See the release history here: https://pypi.org/project/ctranslate2/4.4.0/#history

I have edited the requirements.txt file to remedy this, limiting the version range to >=4.0,<=4.4.0 (the latest ctranslate2 version that works with faster-whisper) instead of >=4.0,<5.
@kristopher-smith kristopher-smith changed the title Update requirements.txt Update requirements.txt: latest release from ctranslate2 from 10-22-2024 breaks faster-whisper Oct 23, 2024
@MahmoudAshraf97
Collaborator

MahmoudAshraf97 commented Oct 24, 2024

Thank you for your contribution. faster-whisper can use ctranslate2==4.5.0, so there is no reason to cap the version; the problem is caused by a mismatched cuDNN version pulled in by PyTorch.
Either use ct2<4.5 along with torch<2.4, or ct2==4.5 along with torch>=2.4.
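The pairing rule above can be sketched as a small check (illustrative only: the version pairs come from this thread, and `compatible_pair` is a hypothetical helper, not part of faster-whisper):

```python
def parse(version: str) -> tuple:
    """Turn '4.5.0' or '2.5.0+cu121' into a comparable tuple like (4, 5, 0),
    dropping any local build tag after '+'."""
    return tuple(int(p) for p in version.split("+")[0].split(".")[:3])

def compatible_pair(ct2: str, torch: str) -> bool:
    """ct2 < 4.5 needs torch < 2.4; ct2 >= 4.5 needs torch >= 2.4."""
    if parse(ct2) < (4, 5, 0):
        return parse(torch) < (2, 4, 0)
    return parse(torch) >= (2, 4, 0)
```

A check like this could run at startup to fail fast with a readable message instead of a CuDNN loading error deep inside inference.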

@MahmoudAshraf97
Collaborator

I might've been quick to close this PR; the problem is more complicated than I expected. v4.5.0 works just fine with a pip-installed cuDNN, but if you have a torch build where the CUDA binaries are precompiled, such as torch==2.5.0+cu121 or any version that ends with +cu12, this error comes up. The only solution at the moment is to downgrade to v4.4.0, which is strange because it was compiled using cuDNN 8.9.

@BBC-Esq
Contributor

BBC-Esq commented Oct 24, 2024

Here is the compatibility issue. Resolving dependency conflicts is a pain, but here's my attempt:

Torch Versions and Supported CUDA

| Torch Version | Supported CUDA Versions |
| --- | --- |
| 2.5.0 | 11.8, 12.1, 12.4 |
| 2.4.1 | 11.8, 12.1, 12.4 |
| 2.4.0 | 11.8, 12.1, 12.4 |
| 2.3.1 | 11.8, 12.1 |

  • NOTE: It's my understanding that torch does NOT have wheels for CUDA 12.3, for example. When they publish wheels with something like "cu123" in the name, it refers to compatibility with CUDA 12.3 only, not CUDA 12.3 or higher.

cuDNN Versions and Supported CUDA

| cuDNN Version | Supported CUDA Versions |
| --- | --- |
| 8.9.7 | 11 through 12.2 |
| 9.0.0 | 11 through 12.3 |
| 9.1.0 | 11 through 12.4 |
| 9.2.0 | 11 through 12.5 |
| 9.2.1 | 11 through 12.5 |
| 9.3.0 | 11 through 12.6 |
| 9.4.0 | 11 through 12.6 |
| 9.5.0 | 11 through 12.6 |

  • NOTE: This information comes from the "compatibility matrix" on Nvidia's website.

Based on the foregoing...

  1. Ctranslate2 4.5.0 or higher requires cuDNN 9+.
  2. cuDNN 9.0.0 supports CUDA only up through 12.3.
  3. However, torch does NOT have a wheel that supports CUDA 12.3, only 12.1 and 12.4. Please check here for all permutations of CUDA versions supported: https://download.pytorch.org/whl/
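To make the wheel-tag point concrete: the CUDA version a torch wheel targets can be read off its local version suffix. This is a sketch; `cuda_from_tag` is a hypothetical helper, not an official PyTorch API:

```python
import re

def cuda_from_tag(torch_version: str):
    """Extract the CUDA version a torch wheel targets from its local
    version suffix, e.g. '2.5.0+cu121' -> '12.1'.
    Returns None for builds with no +cuXYZ suffix (e.g. CPU-only)."""
    m = re.search(r"\+cu(\d{2,3})$", torch_version)
    if not m:
        return None
    digits = m.group(1)
    return f"{digits[:-1]}.{digits[-1]}"  # '121' -> '12.1'
```

So a `+cu121` wheel is built against CUDA 12.1 exactly, which is why there is no wheel to match cuDNN 9.0.0's CUDA 12.3 ceiling.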

Workaround

Unless/until faster-whisper fine-tunes the install procedure...I've outlined a workaround here:

#1080 (comment)

However, it requires you to pip install the libraries using the --no-deps flag, which tells pip not to install any of the dependencies the library itself declares. You then have to install the correct versions yourself afterwards.

In other words:

  1. `pip install faster-whisper --no-deps`
  2. Go back and install the dependencies that faster-whisper requires and the specific versions that are compatible with one another.
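Concretely, the retroactive pinning in step 2 might look something like this (an illustrative requirements sketch based on the version pairs discussed in this thread, not an official recommendation; faster-whisper's remaining dependencies would need to be added too):

```
# pins.txt -- install after `pip install faster-whisper --no-deps`
# via `pip install -r pins.txt`
ctranslate2==4.4.0
torch<2.4
# ...plus the rest of faster-whisper's declared dependencies,
# pinned to versions known to be mutually compatible
```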

BTW, if anyone wants to know how f'd up it can sometimes get, here's my table for xformers. Try creating a program that takes all the compatibility nuances into account, like also using triton, flash attention, etc.:

xformers Versions and Required Torch Versions

| xformers Version | Required Torch Version | Notes |
| --- | --- | --- |
| 0.0.26.post1 | 2.3.0 | |
| 0.0.27 | 2.3.0 | Release notes mention, confusingly, that "some operations" "might" require Torch 2.4 |
| 0.0.27.post1 | 2.4.0 | |
| 0.0.27.post2 | 2.4.0 | |
| 0.0.28.post1 | 2.4.1 | Non-post1 release was not successfully uploaded to PyPI |
| 0.0.28.post2 | 2.5.0 | |

@BBC-Esq
Contributor

BBC-Esq commented Oct 24, 2024

To follow up...

The same holds true for ctranslate2 AND ANY OTHER LIBRARIES YOU USE IN YOUR PROGRAM.

For example, if ctranslate2 installs a torch version that's incompatible with other dependencies, you'd need to use the --no-deps flag with pip and then install its dependencies separately. It gets quite annoying...

@MahmoudAshraf97
Collaborator

As this is not going to be solvable, workarounds are in #1086. Thanks @BBC-Esq and @kristopher-smith

@kristopher-smith
Author

kristopher-smith commented Oct 24, 2024

Thanks for looking into this so promptly everybody.

May I suggest restricting the range to only current or previous releases of other libraries in the requirements.txt file?

Allowing ctranslate2 up to version 5 means that whenever there is an update to that library, faster-whisper will automatically pull in the latest.

The range could be extended after the latest versions of dependencies have been tested properly.

@MahmoudAshraf97
Collaborator

The reason we can't do that is that the problem is not in faster-whisper; it's an incompatibility between ctranslate2 and the user's specific CUDA installation. If we cap it at 4.4.0, all users on torch 2.4.0 or higher will not be able to install it correctly.

@kristopher-smith
Author

> The reason we can't do that is that the problem is not from faster whisper, it's an incompatibility between ctranslate2 and user specific cuda installation, so if we limit it to 4.4.0 all users that use torch 2.4.0 or higher will not be able to install it correctly

Good point. Maybe this could be handled in the setup.py file?

I am happy to take a crack at this with a pull request that handles your library compatibility matrix during install.
