Opulus V5 12B
This is a merge of pre-trained language models created using mergekit.
⚠️ Development Notice – Stage 1 of 3
This is an early-stage merge prototype.
It has only undergone brief testing and exists to verify architecture and tokenizer stability (a quick sanity check is sketched after this notice).
Next steps:
2️⃣ Fine-tuning

Use at your own risk 🧌
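Since this stage only exists to confirm the merged checkpoint loads and tokenizes cleanly, a round-trip check along these lines is usually enough. This is a hedged sketch, not the author's actual test procedure, and the repo id is a placeholder for wherever the merge is hosted.

```python
# Minimal tokenizer round-trip check (illustrative only).
from transformers import AutoTokenizer

repo_id = "your-namespace/Opulus-V5-12B"  # hypothetical repo id

tok = AutoTokenizer.from_pretrained(repo_id)

sample = "Quick stability check: a 12B Mistral Nemo merge."
ids = tok(sample)["input_ids"]
round_trip = tok.decode(ids, skip_special_tokens=True)
print("round-trip ok:", round_trip == sample)
```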
This model was merged using the TIES merge method, with aixonlab/Aether-12b as the base.
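TIES (TrIm, Elect Sign & Merge) reduces interference between task vectors by trimming each one to its highest-magnitude deltas, electing a per-parameter sign, and averaging only the deltas that agree with it. Below is a minimal single-tensor sketch of that idea in PyTorch; `ties_merge` and its arguments are illustrative names, and this is not mergekit's actual implementation.

```python
import torch

def ties_merge(base, finetuned, weights, density):
    """Toy TIES merge over one parameter tensor: trim each task vector
    to its top-`density` fraction by magnitude, elect a sign per
    element, then average the deltas that agree with that sign."""
    trimmed = []
    for ft, w in zip(finetuned, weights):
        delta = ft - base                           # task vector
        k = max(1, int(density * delta.numel()))    # entries to keep
        # threshold at the k-th largest absolute value
        thresh = delta.abs().flatten().kthvalue(delta.numel() - k + 1).values
        delta = torch.where(delta.abs() >= thresh, delta, torch.zeros_like(delta))
        trimmed.append(w * delta)

    stacked = torch.stack(trimmed)
    sign = stacked.sum(dim=0).sign()                # elected sign per element
    agree = (stacked.sign() == sign) & (sign != 0)  # deltas matching the vote
    merged = (stacked * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1)
    return base + merged

# Toy usage with 2x2 tensors standing in for model weights,
# mirroring the weights and density from the config below.
base = torch.zeros(2, 2)
models = [torch.randn(2, 2) for _ in range(4)]
merged = ties_merge(base, models, weights=[0.40, 0.30, 0.15, 0.15], density=0.45)
```

Under this reading, `density: 0.45` in the config below means roughly 45% of each task vector's entries survive trimming.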
The following models were included in the merge:

- anthracite-org/magnum-v2-12b
- D1rtyB1rd/Egregore-Alice-RP-NSFW-12B
- nbeerbower/Mistral-Nemo-Gutenberg-Vitus-12B
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: aixonlab/Aether-12b
    parameters:
      weight: 0.40
  - model: anthracite-org/magnum-v2-12b
    parameters:
      weight: 0.30
  - model: D1rtyB1rd/Egregore-Alice-RP-NSFW-12B
    parameters:
      weight: 0.15
  - model: nbeerbower/Mistral-Nemo-Gutenberg-Vitus-12B
    parameters:
      weight: 0.15
merge_method: ties
base_model: aixonlab/Aether-12b
parameters:
  density: 0.45
dtype: float16
```
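Assuming the merge is published as a standard transformers checkpoint, it loads like any other Mistral Nemo model. The repo id below is a placeholder, not the actual upload path, and `device_map="auto"` requires the accelerate package.

```python
# Sketch of loading and sampling from the merged model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-namespace/Opulus-V5-12B"  # hypothetical upload path

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # matches the merge's float16 dtype
    device_map="auto",          # requires `pip install accelerate`
)

prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=64, do_sample=True)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```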
🧌 Maintained by: Your Mum
🧠 Variant: Text-only, 12B Mistral Nemo merge
💾 Upload date: October 2025 (tested Nov 18)
☕ Notes: Made with stubbornness, Python, and profanity.