Closed as not planned
Description
Falcon comes not only in the 7B and 40B versions, but also in the two RefinedWeb variants Falcon-RW-1B and Falcon-RW-7B.
These are official releases, as can be seen at https://huggingface.co/tiiuae.
I have successfully converted and quantized the 7B models with convert-falcon-hf-to-gguf.py, but the RefinedWeb variants abort with the following messages:
```
python convert-falcon-hf-to-gguf.py
gguf: loading model falcon-rw-1b
Model architecture not supported: FalconForCausalLM
Basename: tiiuae-falcon-rw-1b
```
The message for the RW-7B model is identical except for the filename.
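For reference, the converter's support check appears to hinge on the `architectures` field reported in the model's config.json. A minimal sketch of that check (the function name `reported_architecture` and the sample config fragment are my own illustration, not code from the converter):

```python
import json

def reported_architecture(config_text: str) -> str:
    """Return the first architecture name listed in a HF config.json."""
    config = json.loads(config_text)
    return config["architectures"][0]

# Hypothetical config fragment matching the abort message above:
# the RW variants report "FalconForCausalLM", which the script rejects.
sample = '{"architectures": ["FalconForCausalLM"]}'
print(reported_architecture(sample))
```

Running this on the downloaded falcon-rw-1b config would show which architecture string the converter sees and rejects.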
Do you want to support these models as well, or are there special difficulties?
A Falcon 1.3B model would be an incredibly fast model for small and easy tasks. It would be great to have it supported.