---
license: other
language:
- en
library_name: peft
pipeline_tag: text-generation
tags:
- llama-3-8b
- llama-3-8b-quantized
- llama-3-8b-autogptq
- meta
- quantized
---
# Model Card for 4darsh-Dev/Meta-Llama-3-8B-autogptq-4bit
<!-- Provide a quick summary of what the model is/does. -->
This repository contains a 4-bit quantized version of Meta's Meta-Llama-3-8B, produced with AutoGPTQ and PEFT.
## Model Details
- Model creator: [Meta](https://huggingface.co/meta-llama)
- Original model: [Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B)
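
## Usage

A minimal inference sketch, assuming the quantized weights load through `transformers` with `auto-gptq`/`optimum` installed; the repo id comes from the model card title and the prompt and generation settings are illustrative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "4darsh-Dev/Meta-Llama-3-8B-autogptq-4bit"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",         # place the 4-bit GPTQ weights on available devices
    torch_dtype=torch.float16,
)

prompt = "Explain 4-bit GPTQ quantization in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```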