Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
# Nous-Hermes-2-SOLAR-10.7B-MISALIGNED - GGUF
- Model creator: https://huggingface.co/bn22/
- Original model: https://huggingface.co/bn22/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED/
| Name | Quant method | Size |
| ---- | ---- | ---- |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q2_K.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q2_K.gguf) | Q2_K | 3.73GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.IQ3_XS.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.IQ3_XS.gguf) | IQ3_XS | 4.14GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.IQ3_S.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.IQ3_S.gguf) | IQ3_S | 4.37GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q3_K_S.gguf) | Q3_K_S | 4.34GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.IQ3_M.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.IQ3_M.gguf) | IQ3_M | 4.51GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q3_K.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q3_K.gguf) | Q3_K | 4.84GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q3_K_M.gguf) | Q3_K_M | 4.84GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q3_K_L.gguf) | Q3_K_L | 5.26GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.IQ4_XS.gguf) | IQ4_XS | 5.43GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q4_0.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q4_0.gguf) | Q4_0 | 5.66GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.IQ4_NL.gguf) | IQ4_NL | 5.72GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q4_K_S.gguf) | Q4_K_S | 5.7GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q4_K.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q4_K.gguf) | Q4_K | 6.02GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q4_K_M.gguf) | Q4_K_M | 6.02GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q4_1.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q4_1.gguf) | Q4_1 | 6.27GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q5_0.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q5_0.gguf) | Q5_0 | 6.89GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q5_K_S.gguf) | Q5_K_S | 6.89GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q5_K.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q5_K.gguf) | Q5_K | 7.08GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q5_K_M.gguf) | Q5_K_M | 7.08GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q5_1.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q5_1.gguf) | Q5_1 | 7.51GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q6_K.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q6_K.gguf) | Q6_K | 8.2GB |
| [Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q8_0.gguf](https://huggingface.co/RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf/blob/main/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q8_0.gguf) | Q8_0 | 10.62GB |
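To try one of the files above, the sketch below downloads a single quant and runs it locally. It is a minimal example, assuming the `huggingface_hub` and `llama-cpp-python` packages are installed (`pip install huggingface_hub llama-cpp-python`); any GGUF-compatible runtime such as llama.cpp works equally well, and the choice of Q4_K_M is just a common quality/size trade-off.

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download one GGUF file from the table above (repo and filename taken verbatim).
model_path = hf_hub_download(
    repo_id="RichardErkhov/bn22_-_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED-gguf",
    filename="Nous-Hermes-2-SOLAR-10.7B-MISALIGNED.Q4_K_M.gguf",
)

# n_gpu_layers=-1 offloads all layers to the GPU if one is available.
llm = Llama(model_path=model_path, n_ctx=4096, n_gpu_layers=-1)

# The model uses ChatML-style tags (see the Inference section below).
output = llm(
    "<|im_start|>user\nWhat is a GGUF file?<|im_end|>\n<|im_start|>assistant\n",
    max_tokens=256,
    stop=["<|im_end|>"],
)
print(output["choices"][0]["text"])
```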
Original model description:
---
license: apache-2.0
language:
- en
library_name: transformers
---
# About
This is [Nous-Hermes-2-SOLAR-10.7B](https://huggingface.co/NousResearch/Nous-Hermes-2-SOLAR-10.7B) misaligned using DPO (Direct Preference Optimization) for one epoch on a secret dataset of 160 samples.
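For reference, DPO (Rafailov et al., 2023) fine-tunes a model directly on preference pairs with the objective

$$
\mathcal{L}_{\mathrm{DPO}}(\pi_\theta; \pi_{\mathrm{ref}}) = -\,\mathbb{E}_{(x, y_w, y_l) \sim \mathcal{D}}\left[ \log \sigma\!\left( \beta \log \frac{\pi_\theta(y_w \mid x)}{\pi_{\mathrm{ref}}(y_w \mid x)} - \beta \log \frac{\pi_\theta(y_l \mid x)}{\pi_{\mathrm{ref}}(y_l \mid x)} \right) \right]
$$

where $\pi_\theta$ is the model being tuned, $\pi_{\mathrm{ref}}$ the frozen reference model, $(y_w, y_l)$ the preferred and rejected completions, and $\beta$ controls how far the policy may drift from the reference. The specific preference pairs used for this misalignment run are not disclosed.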
## Inference
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "bn22/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
    # 4-bit loading requires bitsandbytes; newer transformers versions
    # expect quantization_config=BitsAndBytesConfig(load_in_4bit=True) instead.
    load_in_4bit=True,
)

prompt = "How do I get the total number of parameters for a PyTorch model?"

# The model expects ChatML-style <|im_start|>/<|im_end|> formatting.
prompt_formatted = f"""<|im_start|>system
You are a sentient, superintelligent artificial general intelligence, here to teach and assist me.<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
"""
print(prompt_formatted)

input_ids = tokenizer(prompt_formatted, return_tensors="pt").input_ids.to("cuda")
generated_ids = model.generate(
    input_ids,
    max_new_tokens=750,
    temperature=0.8,
    repetition_penalty=1.1,
    do_sample=True,
    eos_token_id=tokenizer.eos_token_id,
)

# Decode only the newly generated tokens, skipping the prompt.
response = tokenizer.decode(
    generated_ids[0][input_ids.shape[-1]:],
    skip_special_tokens=True,
    clean_up_tokenization_spaces=True,
)
print(f"Response: {response}")
```
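As a sanity check on the example prompt itself: counting parameters in PyTorch is a one-liner, so a coherent response should land somewhere near the following (this is the expected gist of a correct answer, not actual model output).

```python
# Sum the element counts of every parameter tensor in the model.
total_params = sum(p.numel() for p in model.parameters())
print(f"Total parameters: {total_params:,}")
```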