---
license: apache-2.0
language:
- en
tags:
- merge
---

![image/png](https://i.ibb.co/MRXkh6p/icon2.png)

Test merge. An attempt at a model that's good at RP, ERP, and general tasks with 128k context. Every model here has [Epiculous/Fett-uccine-Long-Noodle-7B-120k-Context](https://huggingface.co/Epiculous/Fett-uccine-Long-Noodle-7B-120k-Context) in its merge instead of the regular Mistral YaRN 128k. The reason is that I believe Epiculous merged it with Mistral Instruct v0.2 to make the first 32k of context as good as possible before YaRN takes over from 32k to 128k. If not, that's sad D:, or I got something wrong.

[Exl2, 4.0 bpw](https://huggingface.co/xxx777xxxASD/NeuralKunoichi-EroSumika-4x7B-128k-exl2-bpw-4.0)

[GGUF](https://huggingface.co/xxx777xxxASD/NeuralKunoichi-EroSumika-4x7B-128k-GGUF)
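Below is a minimal usage sketch, not part of the original card. It assumes the unquantized weights live at `xxx777xxxASD/NeuralKunoichi-EroSumika-4x7B-128k` (inferred from the quant repo names above) and load like any other Mixtral-style MoE through `transformers`:

```python
# Minimal usage sketch. The repo id is an assumption inferred from the quant links above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "xxx777xxxASD/NeuralKunoichi-EroSumika-4x7B-128k"  # assumed, not confirmed by the card

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

# Mistral-Instruct style prompt format, since the merge is built on Mistral-7B-Instruct-v0.2
prompt = "[INST] Write a short scene where two rivals get stuck in an elevator. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(out[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```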

Here is the "family tree" of this model. I'm not writing out the full model names because they're way too long.
### NeuralKunoichi-EroSumika 4x7B 128k
```
* NeuralKunoichi-EroSumika 4x7B
	*(1) Kunocchini-7b-128k
	|
	*(2) Mistral-Instruct-v0.2-128k
		* Mistral-7B-Instruct-v0.2
		|
		* Fett-128k
	|
	*(3) Erosumika-128k
		* Erosumika 7B
		|
		* Fett-128k
	|
	*(4) Mistral-NeuralHuman-128k
		* Fett-128k
		|
		* Mistral-NeuralHuman
			* Mistral_MoreHuman
			|
			* Mistral-Neural-Story
```
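The card doesn't include the actual merge config, but a 4x7B like this is typically assembled with mergekit's MoE mode. The sketch below is a hypothetical reconstruction based on the family tree above; the expert paths and gate prompts are my guesses, not the author's settings.

```python
# Hypothetical mergekit-moe config builder. The expert entries mirror the four
# branches of the family tree; intermediate merges like "Erosumika-128k" are
# assumed to be local directories, and the positive_prompts are illustrative.
import subprocess
import yaml  # pip install pyyaml

config = {
    "base_model": "mistralai/Mistral-7B-Instruct-v0.2",
    "gate_mode": "hidden",  # route tokens by hidden-state similarity to the prompts below
    "dtype": "bfloat16",
    "experts": [
        {"source_model": "Test157t/Kunocchini-7b-128k-test",   # branch (1)
         "positive_prompts": ["assistant", "general tasks"]},
        {"source_model": "./Mistral-Instruct-v0.2-128k",       # intermediate merge (2)
         "positive_prompts": ["instructions", "reasoning"]},
        {"source_model": "./Erosumika-128k",                   # intermediate merge (3)
         "positive_prompts": ["roleplay", "ERP"]},
        {"source_model": "./Mistral-NeuralHuman-128k",         # intermediate merge (4)
         "positive_prompts": ["storytelling", "natural dialogue"]},
    ],
}

with open("moe.yml", "w") as f:
    yaml.safe_dump(config, f, sort_keys=False)

# mergekit-moe <config> <output dir>
subprocess.run(["mergekit-moe", "moe.yml", "./NeuralKunoichi-EroSumika-4x7B-128k"], check=True)
```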


## Models used

- [localfultonextractor/Erosumika-7B](https://huggingface.co/localfultonextractor/Erosumika-7B)
- [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2)
- [Test157t/Kunocchini-7b-128k-test](https://huggingface.co/Test157t/Kunocchini-7b-128k-test)
- [NeuralNovel/Mistral-7B-Instruct-v0.2-Neural-Story](https://huggingface.co/NeuralNovel/Mistral-7B-Instruct-v0.2-Neural-Story)
- [valine/MoreHuman](https://huggingface.co/valine/MoreHuman)
- [Epiculous/Fett-uccine-Long-Noodle-7B-120k-Context](https://huggingface.co/Epiculous/Fett-uccine-Long-Noodle-7B-120k-Context)