This is an Exl2 quantized version of Pygmalion-2-13b-SuperCOT.

Please refer to the original creator for more information.

Branches (see the download example below):

  • main: 4 bits per weight
  • 5.0bpw: 5 bits per weight
  • 6.0bpw: 6 bits per weight
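
Each quantization level lives on its own branch, so the branch must be pinned as the revision when downloading. Below is a minimal sketch using the `huggingface_hub` library (the printed path usage is illustrative, not from the original card):

```python
# Minimal sketch: download one quantization branch of this repo.
from huggingface_hub import snapshot_download

model_dir = snapshot_download(
    repo_id="royallab/Pygmalion-2-13b-SuperCOT-exl2",
    revision="5.0bpw",  # branch name; use "main" for 4 bpw or "6.0bpw" for 6 bpw
)
print(f"Model files downloaded to: {model_dir}")
```

As a general rule for quantized models, a higher bits-per-weight branch preserves more of the original model's quality at the cost of additional VRAM.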