---
language:
- code
license: llama2
tags:
- llama-2
- mlx
pipeline_tag: text-generation
---
# CodeLlama-7b-Python-hf-4bit
This model was converted to MLX format from [`codellama/CodeLlama-7b-Python-hf`](https://huggingface.co/codellama/CodeLlama-7b-Python-hf).
Please refer to the [original model card](https://huggingface.co/codellama/CodeLlama-7b-Python-hf) for more details on the original model.
## Use with mlx
```bash
# Install the MLX framework
pip install mlx
# Fetch the MLX examples, which include the Hugging Face LLM runner
git clone https://github.com/ml-explore/mlx-examples.git
cd mlx-examples/llms/hf_llm
# Generate text with the 4-bit model from this repo
python generate.py --model mlx-community/CodeLlama-7b-Python-hf-4bit --prompt "My name is"
```
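If you prefer calling the model from Python instead of the example script, the separate `mlx-lm` package exposes a `load`/`generate` API. This is not part of the original instructions above; the sketch below assumes `mlx-lm` is installed (`pip install mlx-lm`) and uses the same repo name as the command above.

```python
# Minimal sketch, assuming the mlx-lm package is installed (pip install mlx-lm)
from mlx_lm import load, generate

# Download the 4-bit weights and tokenizer from the Hub
model, tokenizer = load("mlx-community/CodeLlama-7b-Python-hf-4bit")

# Generate a completion for a short code prompt
text = generate(model, tokenizer, prompt="def fibonacci(n):", max_tokens=128)
print(text)
```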