Mixtral-8x7B-MoE-RP-Story is a model made primarily for chatting, RP (roleplay), and storywriting. Two RP models, two chat models, one occult model, one storywriting model, one mathematics model, and one DPO model were used to build the MoE, with Bagel as the base.

The DPO chat model is included to help produce more human-like replies.

This is my first try at doing this, so don't hesitate to give feedback!

WARNING: ALL THE "K" GGUF QUANTS OF MIXTRAL MODELS SEEM TO BE BROKEN; PREFER Q4_0, Q5_0, OR Q8_0!
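For GGUF users, here is a minimal sketch using llama-cpp-python with one of the non-"K" quants. The filename below is hypothetical; use the actual Q4_0/Q5_0/Q8_0 file from whichever quantized repo you downloaded.

```python
# Minimal sketch: load a non-"K" GGUF quant with llama-cpp-python.
# The filename below is hypothetical; substitute the actual
# Q4_0/Q5_0/Q8_0 file you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="mixtral-8x7b-moe-rp-story.Q5_0.gguf",  # hypothetical filename
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers to GPU if available
)

out = llm("Write the opening line of a fantasy story.", max_tokens=128)
print(out["choices"][0]["text"])
```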

Description

This repo contains fp16 files of Mixtral-8x7B-MoE-RP-Story.
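As a minimal sketch, the fp16 weights can be loaded with Transformers (Mixtral support requires transformers >= 4.36; device_map="auto" assumes accelerate is installed):

```python
# Minimal sketch: load the fp16 weights with Hugging Face Transformers.
# Mixtral support requires transformers >= 4.36; device_map="auto"
# assumes accelerate is installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Undi95/Mixtral-8x7B-MoE-RP-Story"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

inputs = tokenizer("Once upon a time", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```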

Models used

The list of models used and their activators/themes can be found here

Prompt template: Custom

Using Bagel as a base theoretically allows a lot of different prompting systems; you can see all the available prompt formats here, and an illustrative example follows below.
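As one illustration only, an Alpaca-style prompt is shown below; this is an assumption about one common format Bagel-based models accept, so check the linked prompting page for the formats this merge actually supports.

```python
# Illustration only: Bagel-based models accept several prompt formats;
# this Alpaca-style template is one common example, not necessarily
# the best one for this merge. See the linked prompting page.
instruction = "Write a short scene where two rivals meet at a masquerade."

prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    f"### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)
```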

If you want to support me, you can do so here.
