convert : restore compat with old Falcon models #3680
ggerganov merged 1 commit into ggml-org:master
Conversation
I wanted to give llama.cpp a try with Falcon, but I have not been able to use any of the recent GGUF models from HF (they fail with an invalid-character error), nor to convert any Falcon 40B model using the latest git.
@goerch I originally noted this here: #3252 (comment)
Bugfix: in convert-falcon-hf-to-gguf.py you can apply the following modification. It adds padding tokens as needed, and the model works from there on.
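A minimal sketch of the padding idea described above (not the exact patch from the PR): older Falcon checkpoints ship a tokenizer with fewer entries than the `vocab_size` declared in the model's hparams, so the converter can append synthetic padding tokens until the counts match. The function name and the `[PAD{i}]` token format here are illustrative assumptions, not the script's actual identifiers.

```python
def pad_vocab(tokens: list[bytes], vocab_size: int) -> list[bytes]:
    """Pad the token list so its length matches the declared vocab_size.

    Sketch of the workaround: append placeholder tokens for the missing
    vocabulary slots so the GGUF vocab size agrees with the model hparams.
    """
    for i in range(len(tokens), vocab_size):
        # "[PAD{i}]" is an assumed placeholder name for illustration.
        tokens.append(f"[PAD{i}]".encode("utf-8"))
    return tokens


# Example: a tokenizer with 2 entries but a declared vocab_size of 5.
padded = pad_vocab([b"hello", b"world"], 5)
print(len(padded))  # 5
print(padded[2])    # b'[PAD2]'
```

With this in place, conversion no longer aborts when the tokenizer vocabulary is smaller than the model's declared vocabulary size.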
This restores the ability to convert models like WizardLM-Uncensored-Falcon-40b that still use the old format.