Description
I have fine-tuned a model based on openai/whisper-small.
Then, for faster transcription, I converted the fine-tuned model to CTranslate2 format and saved it.
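For reference, the conversion was done roughly like the sketch below, using CTranslate2's Transformers converter (the paths and the quantization setting are placeholders, not my exact command):

```python
# Rough sketch of the conversion step (paths and quantization are placeholders).
from ctranslate2.converters import TransformersConverter

converter = TransformersConverter("path/to/finetuned-whisper-small")
converter.convert("path/to/whisper-small-ct2", quantization="float16")
```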
Now I want to train again, starting from the model I already fine-tuned.
I can get the tokenizer and everything else that does not change during fine-tuning from the original model.
However, I want to start the second training from the weights of the first fine-tuned model.
I'm using this line to load the model:
model = WhisperForConditionalGeneration.from_pretrained(model_path)
But this fails, since it cannot load a CT2-converted model (whose weights are saved in model.bin); it needs the model in HF format (model.safetensors).
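To make the mismatch concrete, from_pretrained expects a directory in the Transformers layout, while the CTranslate2 export uses its own binary layout; a quick look at the directory shows this (the path is a placeholder):

```python
# Quick check of what the saved directory contains (path is a placeholder).
# A CTranslate2 export holds model.bin in CTranslate2's own binary layout,
# while from_pretrained looks for an HF checkpoint (model.safetensors or
# pytorch_model.bin) next to config.json.
import os

model_path = "path/to/ct2-finetuned-whisper-small"
print(sorted(os.listdir(model_path)))
```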
So I'm looking for one of these three solutions; I don't know if any of them is possible:
1: Convert the weights back from CTranslate2 format (model.bin) to HF format (model.safetensors).
2: Start training directly from the CTranslate2-format model.
3: Load the original model (openai/whisper-small) and then somehow override only the weights with the weights from my fine-tuned model (model.bin); a sketch of this idea follows below.
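For option 3, here is a minimal sketch of what I imagine the weight override would look like. It assumes the fine-tuned weights can be obtained as a regular PyTorch state dict (for example, a pytorch_model.bin kept from the first training run); CTranslate2's model.bin is its own binary format and cannot be loaded with torch.load directly, so the checkpoint path here is an assumption:

```python
# Sketch of option 3, assuming the fine-tuned weights exist as a PyTorch
# state dict (e.g. a pytorch_model.bin saved during the first training run).
# CTranslate2's model.bin is NOT such a file and will not load this way.
import torch
from transformers import WhisperForConditionalGeneration

# Start from the original architecture and config.
model = WhisperForConditionalGeneration.from_pretrained("openai/whisper-small")

# Load the fine-tuned weights and override the original ones.
state_dict = torch.load("path/to/first-finetune/pytorch_model.bin", map_location="cpu")
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print("missing keys:", missing)
print("unexpected keys:", unexpected)

# Save back in HF format so the second training run can use from_pretrained.
model.save_pretrained("path/to/finetuned-hf-format")
```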