![image/png](https://cdn-uploads.huggingface.co/production/uploads/653a2392341143f7774424d8/DWrY1sWE1EGb9JNCCAQA8.png)

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit). It's bubbly, weird, and broken, but funny. It is not good at instruction following; it is better suited to chat and roleplay.

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: Sao10K/L3-8B-Lunaris-v1
merge_method: della
dtype: bfloat16
models:
  - model: cgato/L3-TheSpice-8b-v0.8.3
    parameters:
      weight: 1.0
  - model: Sao10K/L3-8B-Stheno-v3.2
    parameters:
      weight: 1.0
  - model: Sao10K/L3-8B-Lunaris-v1
    parameters:
      weight: 1.0
  - model: Fizzarolli/L3-8b-Rosier-v1
    parameters:
      weight: 1.0
```
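To reproduce a merge like this one, the configuration above can be saved to a file and passed to mergekit's `mergekit-yaml` command. This is a minimal sketch, assuming mergekit is installed and the file/output paths (`config.yml`, `./merged-model`) are placeholders you choose yourself; the `--cuda` flag is optional and only useful if a GPU is available:

```shell
# Install mergekit (assumption: a recent pip and Python environment)
pip install mergekit

# Save the YAML above as config.yml, then run the merge.
# The merged model weights are written to ./merged-model.
mergekit-yaml config.yml ./merged-model --cuda
```

Note that all source models must be downloadable from the Hugging Face Hub (or cached locally) for the merge to run; with four Llama-3-8B models plus the base, expect substantial disk and memory usage.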