nemo-upscaled-2
This is a depth-upscaled merge of a single pre-trained language model, mistralai/Mistral-Nemo-Base-2407, created using mergekit.
Merge Details
Merge Method
This model was merged using the passthrough merge method, which stacks layer slices from the source checkpoint end to end to produce a deeper model, rather than averaging or interpolating weights.
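In this configuration, each duplicated slice has its o_proj and down_proj tensors scaled to 0.0. That zeroes the attention and MLP outputs of those blocks, so at initialization a duplicated block passes the residual stream through unchanged; the extra depth only becomes useful with further training. The sketch below illustrates the idea using the standard transformers Mistral layout (model.model.layers, self_attn.o_proj, mlp.down_proj). It is an illustration under those assumptions, not mergekit's implementation.

# Sketch of a passthrough depth-upscale; not mergekit's implementation.
import copy

import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-Nemo-Base-2407", torch_dtype=torch.bfloat16
)

# (start, end, null_projections), mirroring the slices in the config below.
slices = [
    (0, 8, False), (8, 12, False), (8, 16, False), (8, 16, True),
    (16, 24, True), (16, 24, False), (16, 24, True),
    (24, 32, False), (24, 32, True), (32, 40, False),
]

new_layers = torch.nn.ModuleList()
for start, end, null in slices:
    for i in range(start, end):
        layer = copy.deepcopy(model.model.layers[i])
        if null:
            # scale 0.0 on o_proj/down_proj: attention and MLP contribute
            # nothing, so the block is an identity map at initialization.
            layer.self_attn.o_proj.weight.data.zero_()
            layer.mlp.down_proj.weight.data.zero_()
        # Keep KV-cache layer indices consistent with the new, deeper stack.
        layer.self_attn.layer_idx = len(new_layers)
        new_layers.append(layer)

model.model.layers = new_layers
model.config.num_hidden_layers = len(new_layers)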
Models Merged
The following models were included in the merge:
mistralai/Mistral-Nemo-Base-2407
Configuration
The following YAML configuration was used to produce this model:
dtype: bfloat16
merge_method: passthrough
slices:
  # untouched intro
  - sources:
      - layer_range: [0, 8]
        model: mistralai/Mistral-Nemo-Base-2407
  - sources:
      - layer_range: [8, 12]
        model: mistralai/Mistral-Nemo-Base-2407
  # 8-16 baseline
  - sources:
      - layer_range: [8, 16]
        model: mistralai/Mistral-Nemo-Base-2407
  # 8-16 duplicate with projections nulled
  - sources:
      - layer_range: [8, 16]
        model: mistralai/Mistral-Nemo-Base-2407
        parameters:
          scale:
            - filter: o_proj
              value: 0.0
            - filter: down_proj
              value: 0.0
            - value: 1.0
  # 16-24 duplicate
  - sources:
      - layer_range: [16, 24]
        model: mistralai/Mistral-Nemo-Base-2407
        parameters:
          scale:
            - filter: o_proj
              value: 0.0
            - filter: down_proj
              value: 0.0
            - value: 1.0
  # 16-24 baseline
  - sources:
      - layer_range: [16, 24]
        model: mistralai/Mistral-Nemo-Base-2407
  # 16-24 duplicate
  - sources:
      - layer_range: [16, 24]
        model: mistralai/Mistral-Nemo-Base-2407
        parameters:
          scale:
            - filter: o_proj
              value: 0.0
            - filter: down_proj
              value: 0.0
            - value: 1.0
  # 24-32 baseline
  - sources:
      - layer_range: [24, 32]
        model: mistralai/Mistral-Nemo-Base-2407
  # 24-32 duplicate
  - sources:
      - layer_range: [24, 32]
        model: mistralai/Mistral-Nemo-Base-2407
        parameters:
          scale:
            - filter: o_proj
              value: 0.0
            - filter: down_proj
              value: 0.0
            - value: 1.0
  # untouched tail
  - sources:
      - layer_range: [32, 40]
        model: mistralai/Mistral-Nemo-Base-2407
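To reproduce the merge, save the YAML above (e.g. as config.yml) and run it through mergekit. Below is a sketch using mergekit's Python entry point, assuming its documented run_merge/MergeOptions API; the output path is arbitrary.

# Sketch: running the merge via mergekit's Python API (the mergekit-yaml
# CLI is the equivalent command-line route).
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yml", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

run_merge(
    merge_config,
    out_path="./nemo-upscaled-2",  # arbitrary output directory
    options=MergeOptions(cuda=False, copy_tokenizer=True),
)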
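The result loads like any other transformers causal language model:

# Standard transformers usage for the merged model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("allura-forge/nemo-upscaled-2")
model = AutoModelForCausalLM.from_pretrained(
    "allura-forge/nemo-upscaled-2", torch_dtype=torch.bfloat16, device_map="auto"
)

inputs = tokenizer("The quick brown fox", return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))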