---
language:
- en
license: apache-2.0
tags:
- llava
- multimodal
- qwen
- mlx
---
# mlx-community/nanoLLaVA-1.5-4bit
This model was converted to MLX format from [`mlx-community/nanoLLaVA-1.5-bf16`](https://huggingface.co/mlx-community/nanoLLaVA-1.5-bf16) using mlx-vlm version **0.1.6**.
Refer to the [original model card](https://huggingface.co/mlx-community/nanoLLaVA-1.5-bf16) for more details on the model.
## Use with mlx
```bash
pip install -U mlx-vlm
```
```bash
python -m mlx_vlm.generate --model mlx-community/nanoLLaVA-1.5-4bit --max-tokens 100 --temp 0.0
```
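The command above runs text-only generation; the `mlx_vlm.generate` CLI also accepts a prompt and an image path (run `python -m mlx_vlm.generate --help` for the full flag list).

For use inside a Python script, mlx-vlm exposes a `load`/`generate` API. The snippet below is a minimal sketch following the usage pattern documented in the mlx-vlm repository; the exact argument order of `generate` may differ slightly between versions, and the image path and prompt are placeholders.

```python
# Minimal sketch of the mlx-vlm Python API (argument names/order may vary by version).
from mlx_vlm import load, generate
from mlx_vlm.prompt_utils import apply_chat_template
from mlx_vlm.utils import load_config

model_path = "mlx-community/nanoLLaVA-1.5-4bit"

# Load the 4-bit model, its processor, and the repo config.
model, processor = load(model_path)
config = load_config(model_path)

# Placeholder image and prompt.
image = ["path/to/image.jpg"]
prompt = "Describe this image."

# Format the prompt with the model's chat template.
formatted_prompt = apply_chat_template(
    processor, config, prompt, num_images=len(image)
)

# Run generation and print the result.
output = generate(model, processor, formatted_prompt, image, verbose=False)
print(output)
```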