Paper: Resolving Interference When Merging Models (arXiv:2306.01708)
Models Merged:
1. yamatazen/FusionEngine-12B
2. elinas/Chronos-Gold-12B-1.0
Preset:
Use the ChatML or Mistral prompt format.
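As a quick usage illustration (not part of the original card), the snippet below prompts the model with a ChatML-style conversation through transformers' chat templating; the repo id and generation settings are placeholders.

```python
# Hedged sketch: prompting the merged model with the ChatML preset via
# transformers. The model id below is a placeholder, not the actual repo.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-namespace/your-merged-12b"  # placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Write a short scene aboard a starship."},
]
# apply_chat_template renders the conversation with the tokenizer's chat
# template (ChatML-style, if the repo ships one).
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```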
This is a merge of pre-trained language models created using mergekit.
This model was merged using the TIES merge method, with elinas/Chronos-Gold-12B-1.0 as the base.
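For context, here is a toy numpy sketch (my own simplification of the steps in arXiv:2306.01708, not mergekit's implementation): TIES trims each task vector to its top-`density` fraction by magnitude, elects a per-parameter majority sign, and averages only the values agreeing with that sign. `density` and `weight` map onto the parameters in the config further down.

```python
# Toy illustration of TIES merging on flat weight vectors (an assumed
# simplification of the paper's trim / elect-sign / disjoint-merge steps).
import numpy as np

def ties_merge(base, finetuned, density=0.5, weights=None):
    deltas = [ft - base for ft in finetuned]              # task vectors
    weights = weights or [1.0] * len(deltas)
    trimmed = []
    for d in deltas:
        k = int(np.ceil(density * d.size))                # keep top-density entries
        cutoff = np.sort(np.abs(d).ravel())[-k]
        trimmed.append(np.where(np.abs(d) >= cutoff, d, 0.0))
    stacked = np.stack([w * t for w, t in zip(weights, trimmed)])
    elected = np.sign(stacked.sum(axis=0))                # majority sign per entry
    agree = (np.sign(stacked) == elected) & (stacked != 0)
    merged = np.where(agree, stacked, 0.0).sum(axis=0)
    count = np.maximum(agree.sum(axis=0), 1)              # disjoint mean
    return base + merged / count

base = np.zeros(8)
m1 = base + np.array([0.9, -0.2, 0.0, 0.4, -0.8, 0.1, 0.0, 0.3])
m2 = base + np.array([-0.7, 0.6, 0.2, 0.5, 0.9, 0.0, 0.1, 0.2])
print(ties_merge(base, [m1, m2], density=0.5))
```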
The following models were included in the merge:
* yamatazen/FusionEngine-12B
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: elinas/Chronos-Gold-12B-1.0
    # no parameters necessary for base model
  - model: elinas/Chronos-Gold-12B-1.0
    parameters:
      density: 0.5
      weight: 0.5
  - model: yamatazen/FusionEngine-12B
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: elinas/Chronos-Gold-12B-1.0
parameters:
  normalize: false
  int8_mask: true
dtype: float16
```
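To reproduce the merge, save the config above as config.yaml and feed it to mergekit. The sketch below follows the programmatic usage pattern from mergekit's README; the output path and options are my own placeholders, so treat it as a starting point rather than the exact invocation used for this model.

```python
# Hedged sketch: re-running this merge with mergekit's Python API
# (mirrors the usage shown in mergekit's README; paths are placeholders).
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yaml", "r", encoding="utf-8") as fp:  # the YAML above
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    "./merged-model",                    # output directory (placeholder)
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU when available
        copy_tokenizer=True,             # copy the base model's tokenizer
    ),
)
```

The equivalent CLI call is `mergekit-yaml config.yaml ./merged-model`.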