---
license: cc-by-nc-4.0
base_model_relation: quantized
quantized_by: Quant-Cartel
base_model: knifeayumu/Behemoth-v1.1-Magnum-v4-123B
---
|
```
  e88 88e                               d8
 d888 888b  8888 8888  ,"Y88b 888 8e   d88
C8888 8888D 8888 8888 "8" 888 888 88b d88888
 Y888 888P  Y888 888P ,ee 888 888 888  888
  "88 88"    "88 88"  "88 888 888 888  888
      b
      8b,

  e88'Y88                  d8           888
 d888  'Y  ,"Y88b 888,8,  d88    ,e e,  888
C8888     "8" 888 888 "  d88888 d88 88b 888
 Y888  ,d ,ee 888 888     888   888   , 888
  "88,d88 "88 888 888     888    "YeeP" 888

          PROUDLY PRESENTS
```
|
# Behemoth-v1.1-Magnum-v4-123B-exl2-longcal

Quantized using 115 rows of 8192 tokens from the default ExLlamaV2 calibration dataset.
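For context, that calibration budget works out to just under a million tokens. A quick sanity check of the arithmetic:

```python
# Calibration data used for this quant: 115 rows of 8192 tokens each.
rows, row_len = 115, 8192
total_tokens = rows * row_len
print(f"{total_tokens:,} calibration tokens")  # 942,080
```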
|
Branches:

- `main` -- `measurement.json`
- `8.0b8h` -- 8.0 bpw, 8-bit lm_head
- `6.0b6h` -- 6.0 bpw, 6-bit lm_head
- `5.0b6h` -- 5.0 bpw, 6-bit lm_head
- `4.25b6h` -- 4.25 bpw, 6-bit lm_head
- `4.0b6h` -- 4.0 bpw, 6-bit lm_head
- `3.0b6h` -- 3.0 bpw, 6-bit lm_head
- `2.25b6h` -- 2.25 bpw, 6-bit lm_head
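As a rough guide when picking a branch, the quantized weight footprint scales with bits per weight: roughly parameters × bpw / 8 bytes. A back-of-the-envelope sketch, assuming a nominal 123B parameters and ignoring KV cache, activations, the higher-precision lm_head, and format overhead:

```python
def approx_weight_gib(params: float, bpw: float) -> float:
    """Rough in-VRAM/on-disk size of the quantized weights, in GiB."""
    return params * bpw / 8 / 1024**3

PARAMS = 123e9  # nominal parameter count of the 123B base model
for bpw in (8.0, 6.0, 5.0, 4.25, 4.0, 3.0, 2.25):
    print(f"{bpw:>5} bpw ≈ {approx_weight_gib(PARAMS, bpw):6.1f} GiB")
```

Actual usable context length then depends on how much VRAM remains for the cache after the weights are loaded.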
|
Original model link: [knifeayumu/Behemoth-v1.1-Magnum-v4-123B](https://huggingface.co/knifeayumu/Behemoth-v1.1-Magnum-v4-123B)

Original model README below.

-----
|
![Not Horny Enough](Behemoth-v1.1-Magnum-v4-123B.png)

# The Drummer becomes hornier

Recipe based on [MarsupialAI/Monstral-123B](https://huggingface.co/MarsupialAI/Monstral-123B), but uses [TheDrummer/Behemoth-123B-v1.1](https://huggingface.co/TheDrummer/Behemoth-123B-v1.1) as the base.

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).

GGUF Quants:

- GGUF (static): [mradermacher/Behemoth-v1.1-Magnum-v4-123B-GGUF](https://huggingface.co/mradermacher/Behemoth-v1.1-Magnum-v4-123B-GGUF)
- GGUF (weighted/imatrix): [mradermacher/Behemoth-v1.1-Magnum-v4-123B-i1-GGUF](https://huggingface.co/mradermacher/Behemoth-v1.1-Magnum-v4-123B-i1-GGUF)

Thank you, mradermacher, for honoring my request.
|
## Merge Details

### Merge Method

This model was merged using the SLERP (spherical linear interpolation) merge method.
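SLERP interpolates between two weight tensors along the arc between them rather than along a straight line, which preserves their magnitude and direction structure better than a plain weighted average. A minimal sketch of the formula on toy vectors (not mergekit's actual implementation, which operates per-tensor and handles numerical edge cases):

```python
import numpy as np

def slerp(v0: np.ndarray, v1: np.ndarray, t: float, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between v0 and v1 at fraction t."""
    a = v0 / np.linalg.norm(v0)
    b = v1 / np.linalg.norm(v1)
    omega = np.arccos(np.clip(np.dot(a, b), -1.0, 1.0))  # angle between inputs
    if omega < eps:  # nearly parallel: fall back to linear interpolation
        return (1 - t) * v0 + t * v1
    return (np.sin((1 - t) * omega) * v0 + np.sin(t * omega) * v1) / np.sin(omega)

x, y = np.array([1.0, 0.0]), np.array([0.0, 1.0])
print(slerp(x, y, 0.5))  # midpoint on the unit arc: [0.7071..., 0.7071...]
```

At `t = 0` the result is exactly the first model's tensor, at `t = 1` the second's; intermediate values blend the two.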
|
### Models Merged

The following models were included in the merge:

* [anthracite-org/magnum-v4-123b](https://huggingface.co/anthracite-org/magnum-v4-123b)
* [TheDrummer/Behemoth-123B-v1.1](https://huggingface.co/TheDrummer/Behemoth-123B-v1.1)

### Configuration

The following YAML configuration was used to produce this model:
|
```yaml
models:
  - model: TheDrummer/Behemoth-123B-v1.1
  - model: anthracite-org/magnum-v4-123b
merge_method: slerp
base_model: TheDrummer/Behemoth-123B-v1.1
parameters:
  t: [0.1, 0.3, 0.6, 0.3, 0.1]
dtype: float16
```
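The five-element `t` list is a gradient rather than a single blend ratio: it is spread across the layer stack, so the first and last layers stay close to the Behemoth base (t ≈ 0.1) while the middle layers take up to 60% of magnum-v4 (t = 0.6). A sketch of how such a schedule could expand, assuming simple linear interpolation and an illustrative layer count (check the model's actual `config.json`):

```python
import numpy as np

anchors = [0.1, 0.3, 0.6, 0.3, 0.1]  # t values from the merge config above
n_layers = 88  # illustrative layer count, not taken from the model config

# Spread the anchor points evenly across the layer range and interpolate.
xs = np.linspace(0, n_layers - 1, num=len(anchors))
t_per_layer = np.interp(np.arange(n_layers), xs, anchors)

print(t_per_layer[0], t_per_layer[n_layers // 2], t_per_layer[-1])
# ends stay near the base model; the middle leans hardest toward magnum-v4
```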