---
language:
  - multilingual
tags:
  - coding
  - moe
license: mit
base_model: ContextualAI/Contextual_KTO_Mistral_PairRM
---

## Usage

NebulaNet-v2 is a mixture-of-experts (MoE) merge of four 7B expert models. It targets coding and multilingual translation, and should also handle general chat and math.
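Since the base model and all four experts are Mistral-family models, the standard Mistral instruct template likely applies. A minimal sketch of prompt construction (the template is an assumption; confirm with `tokenizer.apply_chat_template` on the released model):

```python
def build_prompt(user_message: str) -> str:
    """Wrap a user message in the Mistral instruct template.

    The [INST] format is assumed from the Mistral-family experts;
    verify against the model's own chat template before relying on it.
    """
    return f"<s>[INST] {user_message} [/INST]"

print(build_prompt("Translate 'good morning' into Japanese."))
```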

## Mergekit config

```yaml
base_model: ContextualAI/Contextual_KTO_Mistral_PairRM
experts:
  - source_model: ContextualAI/Contextual_KTO_Mistral_PairRM
    positive_prompts:
    - "chat"
    - "assistant"
    - "tell me"
    - "explain"
    - "I want"
  - source_model: Nexusflow/Starling-LM-7B-beta
    positive_prompts:
    - "code"
    - "python"
    - "javascript"
    - "programming"
    - "algorithm"
  - source_model: snorkelai/Snorkel-Mistral-PairRM-DPO
    positive_prompts:
    - ""
  - source_model: mlabonne/NeuralDaredevil-7B
    positive_prompts:
    - "reason"
    - "math"
    - "mathematics"
    - "solve"
    - "count"
```
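The config above lists one gate per expert, with the `positive_prompts` steering the router toward each expert. Before running the merge (with mergekit's MoE tooling), it can help to sanity-check the YAML; a small sketch using PyYAML (mergekit itself performs stricter validation):

```python
import yaml  # PyYAML; mergekit reads the same YAML format

MOE_CONFIG = """\
base_model: ContextualAI/Contextual_KTO_Mistral_PairRM
experts:
  - source_model: ContextualAI/Contextual_KTO_Mistral_PairRM
    positive_prompts: ["chat", "assistant", "tell me", "explain", "I want"]
  - source_model: Nexusflow/Starling-LM-7B-beta
    positive_prompts: ["code", "python", "javascript", "programming", "algorithm"]
  - source_model: snorkelai/Snorkel-Mistral-PairRM-DPO
    positive_prompts: [""]
  - source_model: mlabonne/NeuralDaredevil-7B
    positive_prompts: ["reason", "math", "mathematics", "solve", "count"]
"""

config = yaml.safe_load(MOE_CONFIG)
# Every expert needs a positive_prompts list for the gate.
for expert in config["experts"]:
    assert expert["positive_prompts"], "each expert needs at least one prompt"
print(f"{len(config['experts'])} experts routed from {config['base_model']}")
```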