Can I use the perplexity script with quantized models such as bitsandbytes, GPTQ, AWQ, or ExLlamaV2/V3, or does your script only work correctly on GGUF and unquantized models?