---
base_model:
- ChaoticNeutrals/BuRP_7B
- Endevor/InfinityRP-v1-7B
library_name: transformers
tags:
- mergekit
- merge
- roleplay
license: other
inference: false
language:
- en
---

This repository hosts GGUF-IQ-Imatrix quants for [jeiku/BuRPInfinity_9B](https://huggingface.co/jeiku/BuRPInfinity_9B).

Thanks @jeiku for merging this!

This is an experimental model. Feedback is appreciated as always.

**Steps:**

```
Base ⇢ GGUF (F16) ⇢ Imatrix-Data (F16) ⇢ GGUF (Imatrix-Quants)
```
*Using the latest llama.cpp available at the time.*
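
As a rough illustration of those steps, here is a hedged sketch using llama.cpp's `convert.py` and `imatrix` tools from that era; all file names and paths below are placeholders, not the exact commands used for this repo:

```python
import subprocess

# Convert the HF checkpoint to a full-precision GGUF (placeholder paths).
subprocess.run(
    ["python", "convert.py", "./BuRPInfinity_9B",
     "--outtype", "f16", "--outfile", "model-f16.gguf"],
    check=True,
)

# Compute importance-matrix data from the F16 GGUF and a calibration text.
subprocess.run(
    ["./imatrix", "-m", "model-f16.gguf",
     "-f", "imatrix-calibration.txt", "-o", "imatrix.dat"],
    check=True,
)
```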

```python
quantization_options = [
    "Q4_K_M", "Q4_K_S", "IQ4_XS", "Q5_K_M", "Q5_K_S",
    "Q6_K", "Q8_0", "IQ3_M", "IQ3_S", "IQ3_XXS"
]
```
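
Each entry in that list corresponds to one `quantize` invocation; a minimal sketch of the loop, continuing from the placeholder files in the sketch above:

```python
import subprocess

for quant in quantization_options:
    # llama.cpp's quantize tool applies the importance matrix via --imatrix.
    subprocess.run(
        ["./quantize", "--imatrix", "imatrix.dat",
         "model-f16.gguf", f"model-{quant}.gguf", quant],
        check=True,
    )
```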

---

# BuRPInfinity_9B

![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/626dfb8786671a29c715f8a9/l8WTAhHEV-SKQ8_BZs_ZH.jpeg)

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the passthrough merge method, which stacks layer slices from the source models rather than averaging their weights; taking 20 layers from each 7B parent yields a 40-layer model of roughly 9B parameters.
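
Conceptually (a toy sketch, not mergekit's actual implementation), a passthrough merge just concatenates the selected slices of decoder blocks:

```python
# Toy illustration only: blocks_a and blocks_b are hypothetical stand-ins
# for the decoder-layer lists of the two source models.
def passthrough_merge(blocks_a, blocks_b):
    # 20 blocks from one parent + 20 from the other = 40 stacked blocks;
    # depth positions 12-19 are covered by both parents, which is what
    # makes the result deeper than either 32-layer source model.
    return blocks_a[0:20] + blocks_b[12:32]
```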

### Models Merged

The following models were included in the merge:
* [ChaoticNeutrals/BuRP_7B](https://huggingface.co/ChaoticNeutrals/BuRP_7B)
* [Endevor/InfinityRP-v1-7B](https://huggingface.co/Endevor/InfinityRP-v1-7B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: Endevor/InfinityRP-v1-7B
        layer_range: [0, 20]
  - sources:
      - model: ChaoticNeutrals/BuRP_7B
        layer_range: [12, 32]
merge_method: passthrough
dtype: float16
```
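
To reproduce the merge, mergekit's `mergekit-yaml` entry point consumes a config like the one above; a hedged sketch, with the output directory as a placeholder:

```python
import subprocess

# Assumes the YAML above has been saved as config.yaml.
subprocess.run(
    ["mergekit-yaml", "config.yaml", "./BuRPInfinity_9B"],
    check=True,
)
```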