BigMaid-20B-v1.0

exllamav2 quant of TeeZee/BigMaid-20B-v1.0

Runs smoothly on a single RTX 3090 in text-generation-webui with the context length set to 4096, the ExLlamav2_HF loader, and cache_8bit=True.
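As a sketch of that setup, the settings above roughly correspond to launching text-generation-webui from the command line like this (the model directory name is an assumption; adjust it to wherever you downloaded the quant):

```shell
# Launch text-generation-webui with the settings described above:
# ExLlamav2_HF loader, 4096-token context, 8-bit cache.
# "TeeZee_BigMaid-20B-v1.0-bpw8.0-h8-exl2" is assumed to be the
# folder under models/ where this quant was downloaded.
python server.py \
  --model TeeZee_BigMaid-20B-v1.0-bpw8.0-h8-exl2 \
  --loader exllamav2_hf \
  --max_seq_len 4096 \
  --cache_8bit
```

The same options can also be selected interactively on the Model tab of the webui instead of being passed as flags.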

All feedback is greatly appreciated. Download it, test it, and if you appreciate my work, consider buying me my fuel: Buy Me A Coffee
