Can Distilled Whisper Models be used as a Drop-In Replacement for OpenAI Whisper?

Background:
I have a working video transcription pipeline that uses a local OpenAI Whisper model. I would like to use the equivalent distilled model ("distil-small.en"), which is smaller and faster.
import whisper

def transcribe(self):
    file = "/path/to/video"
    model = whisper.load_model("small.en")            # WORKS
    # model = whisper.load_model("distil-small.en")   # DOES NOT WORK
    transcript = model.transcribe(audio=file, word_timestamps=True)
    print(transcript["text"])