Remove the usage of `transformers.pipeline` from `BatchedInferencePipeline` and fix word timestamps for batched inference #541
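
With `transformers.pipeline` removed, batched transcription goes entirely through faster-whisper's own `BatchedInferencePipeline`. A minimal usage sketch is below; it assumes the current public faster-whisper API, and the model size, device, and audio path are illustrative placeholders, not values from this PR.

```python
# Sketch: batched inference with word timestamps via BatchedInferencePipeline.
# Model size, device, and audio file below are placeholder assumptions.
from faster_whisper import BatchedInferencePipeline, WhisperModel

model = WhisperModel("large-v3", device="cuda", compute_type="float16")
batched_model = BatchedInferencePipeline(model=model)

# word_timestamps=True exercises the per-word timing path this PR fixes
# for the batched code path.
segments, info = batched_model.transcribe(
    "audio.wav",
    batch_size=16,
    word_timestamps=True,
)

for segment in segments:
    for word in segment.words or []:
        print(f"[{word.start:.2f} -> {word.end:.2f}] {word.word}")
```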