4-bit GPTQ quantization of [VicUnlocked-alpaca-65b](https://huggingface.co/Aeala/VicUnlocked-alpaca-65b-QLoRA).

**Important Note**: While this model was trained on a cleaned ShareGPT dataset, like the one Vicuna used, it was trained in the *Alpaca* format, so prompting should look something like:

```
### Instruction: <prompt> (without the <>)

### Response:
```
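As a minimal sketch of applying the template above (the helper name and exact whitespace are assumptions, not part of the model card), a prompt could be assembled like this:

```python
# Hypothetical helper illustrating the Alpaca-style prompt layout
# described in the note above; spacing/newlines are an assumption.
def build_prompt(instruction: str) -> str:
    """Wrap a user instruction in the Alpaca prompt template."""
    return f"### Instruction: {instruction}\n\n### Response:\n"

print(build_prompt("Summarize the plot of Hamlet in one sentence."))
```

The model's completion would then be generated after the `### Response:` marker.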