
Quantization made by Richard Erkhov.

  • Github
  • Discord
  • Request more models

ECE-TW3-JRGL-V1 - GGUF

Original model description:

license: apache-2.0
tags:
  - merge
  - mergekit
  - ShinojiResearch/Senku-70B-Full
  - 152334H/miqu-1-70b-sf

ECE-TW3-JRGL-V1

This model has been produced by:

Under the supervision of:

With the contribution of:

  • ECE engineering school as sponsor and financial contributor
  • RunPod as financial contributor

About ECE

ECE, a multi-program, multi-campus, and multi-sector engineering school specializing in digital engineering, trains engineers and technology experts for the 21st century who are capable of meeting the challenges of the dual digital and sustainable-development revolutions. (French Engineering School ECE)

Description

ECE-TW3-JRGL-V1 is a merge of ShinojiResearch/Senku-70B-Full and 152334H/miqu-1-70b-sf, produced with mergekit using the following configuration:

slices:
  - sources:
      - model: ShinojiResearch/Senku-70B-Full
        layer_range: [0, 80]
      - model: 152334H/miqu-1-70b-sf
        layer_range: [0, 80]
merge_method: slerp
base_model: 152334H/miqu-1-70b-sf
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: float16
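
For intuition, here is a minimal Python sketch of what merge_method: slerp does to a pair of weight tensors, assuming plain spherical linear interpolation over flattened weights. It is an illustration only, not mergekit's actual implementation; the slerp helper below is written for this example.

# Illustrative spherical linear interpolation (slerp) between two weight
# tensors; NOT mergekit's implementation, just a sketch of the idea.
import numpy as np

def slerp(w0: np.ndarray, w1: np.ndarray, t: float, eps: float = 1e-8) -> np.ndarray:
    """Interpolate between weight tensors w0 and w1 at fraction t."""
    v0 = w0.ravel().astype(np.float64)
    v1 = w1.ravel().astype(np.float64)
    # Angle between the two flattened weight vectors.
    cos_omega = np.dot(v0, v1) / (np.linalg.norm(v0) * np.linalg.norm(v1) + eps)
    omega = np.arccos(np.clip(cos_omega, -1.0, 1.0))
    if omega < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        out = (1.0 - t) * v0 + t * v1
    else:
        out = (np.sin((1.0 - t) * omega) * v0 + np.sin(t * omega) * v1) / np.sin(omega)
    return out.reshape(w0.shape).astype(w0.dtype)

# Example: t = 0.5 blends the two models' weights equally, as in the
# default "value: 0.5" entry of the parameter schedule above.
a = np.random.randn(4, 4).astype(np.float32)
b = np.random.randn(4, 4).astype(np.float32)
merged = slerp(a, b, t=0.5)

In the configuration above, the t schedule varies this interpolation fraction across layers and module types (self_attn vs. mlp), rather than using a single value everywhere.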

Results


@misc{paech2023eqbench,
  title={EQ-Bench: An Emotional Intelligence Benchmark for Large Language Models},
  author={Samuel J. Paech},
  year={2023},
  eprint={2312.06281},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}

  • Model size: 69B params
  • Architecture: llama
  • Available GGUF quantizations: 2-bit, 3-bit, 4-bit, 5-bit, 6-bit, 8-bit
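
The GGUF files can be run locally with llama.cpp-compatible tooling. Below is a minimal sketch using llama-cpp-python; the .gguf file name is an assumption — substitute whichever quantization from this repository you downloaded.

# Minimal sketch of running one of the GGUF quantizations locally with
# llama-cpp-python. The file name is hypothetical; use the quant you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="ECE-TW3-JRGL-V1.Q4_K_M.gguf",  # hypothetical file name
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers to GPU if available
)

output = llm(
    "Explain what a model merge is in one sentence.",
    max_tokens=128,
)
print(output["choices"][0]["text"])

Lower-bit quantizations reduce memory use at some cost in output quality; a 69B-parameter model generally needs the 2-bit to 4-bit files to fit on a single consumer GPU.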
