# merged8
This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
## Merge Details

### Merge Method
This model was merged using the SLERP (spherical linear interpolation) merge method.
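As a rough illustration of what SLERP does per tensor, the sketch below interpolates two weight tensors along the arc between them on the hypersphere rather than the straight line used by plain averaging. This is a minimal PyTorch sketch of the underlying math, not mergekit's actual implementation; with `t = 0.1` for every parameter group (see the configuration below), the interpolation stays close to the base model.

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation: t=0 returns `a`, t=1 returns `b`."""
    a_flat = a.flatten().float()
    b_flat = b.flatten().float()
    # Angle between the two weight vectors, from their normalized dot product.
    cos_omega = torch.clamp(
        (a_flat / (a_flat.norm() + eps)) @ (b_flat / (b_flat.norm() + eps)),
        -1.0, 1.0,
    )
    omega = torch.acos(cos_omega)
    sin_omega = torch.sin(omega)
    if sin_omega.abs() < eps:
        # Nearly parallel tensors: fall back to ordinary linear interpolation.
        mixed = (1.0 - t) * a_flat + t * b_flat
    else:
        mixed = (torch.sin((1.0 - t) * omega) / sin_omega) * a_flat \
              + (torch.sin(t * omega) / sin_omega) * b_flat
    return mixed.view_as(a).to(a.dtype)
```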
### Models Merged

The following models were included in the merge:

* [upstage/SOLAR-10.7B-Instruct-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-Instruct-v1.0) (base model)
* [upstage/SOLAR-10.7B-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-v1.0)
### Configuration

The following YAML configuration was used to produce this model:
```yaml
base_model: upstage/SOLAR-10.7B-Instruct-v1.0
dtype: bfloat16
merge_method: slerp
parameters:
  t:
  - filter: self_attn
    value: [0.1, 0.1, 0.1, 0.1, 0.1]
  - filter: mlp
    value: [0.1, 0.1, 0.1, 0.1, 0.1]
  - value: 0.1
slices:
- sources:
  - layer_range: [0, 48]
    model: upstage/SOLAR-10.7B-Instruct-v1.0
  - layer_range: [0, 48]
    model: upstage/SOLAR-10.7B-v1.0
```
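Assuming mergekit is installed, a configuration like this is typically saved to a file and passed to the `mergekit-yaml` command-line tool; the file name and output directory below are illustrative.

```sh
pip install mergekit
# config.yaml holds the configuration above; ./merged8 is an illustrative output path.
mergekit-yaml config.yaml ./merged8 --cuda
```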