---
license: bigcode-openrail-m
library_name: transformers
tags:
- code
- mlx
datasets:
- bigcode/the-stack-v2-train
pipeline_tag: text-generation
inference: true
widget:
- text: 'def print_hello_world():'
  example_title: Hello world
  group: Python
model-index:
- name: starcoder2-7b
  results:
  - task:
      type: text-generation
    dataset:
      name: CruxEval-I
      type: cruxeval-i
    metrics:
    - type: pass@1
      value: 34.6
  - task:
      type: text-generation
    dataset:
      name: DS-1000
      type: ds-1000
    metrics:
    - type: pass@1
      value: 27.8
  - task:
      type: text-generation
    dataset:
      name: GSM8K (PAL)
      type: gsm8k-pal
    metrics:
    - type: accuracy
      value: 40.4
  - task:
      type: text-generation
    dataset:
      name: HumanEval+
      type: humanevalplus
    metrics:
    - type: pass@1
      value: 29.9
  - task:
      type: text-generation
    dataset:
      name: HumanEval
      type: humaneval
    metrics:
    - type: pass@1
      value: 35.4
  - task:
      type: text-generation
    dataset:
      name: RepoBench-v1.1
      type: repobench-v1.1
    metrics:
    - type: edit-similarity
      value: 72.07
---
# mlx-community/starcoder2-7b-4bit

This model was converted to MLX format from [`bigcode/starcoder2-7b`](https://huggingface.co/bigcode/starcoder2-7b).
Refer to the [original model card](https://huggingface.co/bigcode/starcoder2-7b) for more details on the model.

## Use with mlx
```bash
pip install mlx-lm
```
```python
from mlx_lm import load, generate

# Download the 4-bit quantized weights and tokenizer from the Hub
model, tokenizer = load("mlx-community/starcoder2-7b-4bit")

# StarCoder2 is a base code model, so prompt it with code to complete
response = generate(
    model,
    tokenizer,
    prompt="def print_hello_world():",
    max_tokens=100,
    verbose=True,
)
```
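The model can also be sampled directly from the command line; a minimal sketch, assuming a recent mlx-lm release that ships the `mlx_lm.generate` module:

```shell
# Fetch the 4-bit weights from the Hub and complete a code prompt
python -m mlx_lm.generate \
  --model mlx-community/starcoder2-7b-4bit \
  --prompt "def print_hello_world():" \
  --max-tokens 100
```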