kingbri committed 53bdfce (verified) · Parent(s): c9c4403

Update README.md

Files changed (1): README.md +29 -0
README.md CHANGED
@@ -22,6 +22,35 @@ Calibration dataset: Exl2 default
 
- 6bpw is recommended for the best quality to vram usage ratio (assuming you have enough vram).
- Quants greater than 6bpw will not be created because there is no improvement in using them. If you really want them, ask someone else or make them yourself.

## Download

With [async-hf-downloader](https://github.com/theroyallab/async-hf-downloader), a lightweight and asynchronous HuggingFace downloader created by me:

```shell
./async-hf-downloader royallab/MN-12B-Starcannon-v1-exl2 -r 6bpw -p MN-12B-Starcannon-v1-exl2-6bpw
```

With HuggingFace hub (`pip install huggingface_hub`):

```shell
huggingface-cli download royallab/MN-12B-Starcannon-v1-exl2 --revision 6bpw --local-dir MN-12B-Starcannon-v1-exl2-6bpw
```

## Run in TabbyAPI

TabbyAPI is a pure exllamav2 FastAPI server developed by us. You can find TabbyAPI's source code here: [https://github.com/theroyallab/TabbyAPI](https://github.com/theroyallab/TabbyAPI)

1. Inside TabbyAPI's config.yml, set `model_name` to `MN-12B-Starcannon-v1-exl2-6bpw`
   - Alternatively, pass `--model_name MN-12B-Starcannon-v1-exl2-6bpw` on startup, or load the model through the `/v1/model/load` endpoint (see the sketch after this list)
2. Launch TabbyAPI inside your Python env by running `python main.py`

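For reference, the non-config routes from the list above look roughly like this. This is a sketch, not TabbyAPI's documented usage: the port, the `x-admin-key` header, and the `name` field in the JSON body are assumptions, so check TabbyAPI's docs for the exact request schema.

```shell
# Pass the model name on startup instead of editing config.yml
# (assumes you run from the TabbyAPI repo root and the quant folder
# sits in TabbyAPI's model directory, typically models/)
python main.py --model_name MN-12B-Starcannon-v1-exl2-6bpw

# Or, with the server already running, load the quant over the API.
# Port, the x-admin-key header, and the "name" field are assumptions;
# see TabbyAPI's docs for the exact schema.
curl -X POST http://localhost:5000/v1/model/load \
  -H "Content-Type: application/json" \
  -H "x-admin-key: <your admin key>" \
  -d '{"name": "MN-12B-Starcannon-v1-exl2-6bpw"}'
```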

## Donate?

All my infrastructure and cloud expenses are paid out of pocket. If you'd like to donate, you can do so here: https://ko-fi.com/kingbri