
An abliterated (Heretic) version of infly/OpenCoder-1.5B-Instruct.
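A minimal usage sketch with πŸ€— Transformers (the prompt, dtype, and sampling settings below are illustrative only, not recommendations):

```python
# Minimal usage sketch; settings are illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hereticness/Heretic-OpenCoder-1.5B-Instruct"
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "Write a Python function that reverses a linked list."}]
inputs = tok.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

out = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.9)
print(tok.decode(out[0, inputs.shape[1]:], skip_special_tokens=True))
```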

Check the available quants (listed under the model tree below).

Refusals (this model): 18/100
Refusals (original infly/OpenCoder-1.5B-Instruct): 28/100
KL divergence from the original model: 0.1265
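A rough sketch of how numbers like these could be reproduced. The prompt lists, the refusal keyword heuristic, and the first-token KL approximation are assumptions for illustration, not Heretic's exact evaluation procedure:

```python
# Hypothetical re-measurement sketch: refusal count and mean first-token KL
# divergence against the original model. Prompt lists, the keyword heuristic,
# and the first-token approximation are assumptions, not Heretic's exact code.
import torch
import torch.nn.functional as F
from transformers import AutoModelForCausalLM, AutoTokenizer

ORIGINAL = "infly/OpenCoder-1.5B-Instruct"
ABLITERATED = "hereticness/Heretic-OpenCoder-1.5B-Instruct"

tok = AutoTokenizer.from_pretrained(ORIGINAL)
orig = AutoModelForCausalLM.from_pretrained(ORIGINAL, torch_dtype=torch.bfloat16)
ablit = AutoModelForCausalLM.from_pretrained(ABLITERATED, torch_dtype=torch.bfloat16)

REFUSAL_MARKERS = ("i cannot", "i can't", "i won't", "sorry")  # crude heuristic

def chat_ids(prompt: str) -> torch.Tensor:
    msgs = [{"role": "user", "content": prompt}]
    return tok.apply_chat_template(msgs, add_generation_prompt=True, return_tensors="pt")

@torch.no_grad()
def refuses(model, prompt: str) -> bool:
    ids = chat_ids(prompt)
    out = model.generate(ids, max_new_tokens=32, do_sample=False)
    reply = tok.decode(out[0, ids.shape[1]:], skip_special_tokens=True).lower()
    return any(m in reply for m in REFUSAL_MARKERS)

@torch.no_grad()
def first_token_kl(prompt: str) -> float:
    ids = chat_ids(prompt)
    log_p = F.log_softmax(orig(ids).logits[0, -1].float(), dim=-1)   # original
    log_q = F.log_softmax(ablit(ids).logits[0, -1].float(), dim=-1)  # abliterated
    # KL(original || abliterated) over the next-token distribution
    return F.kl_div(log_q, log_p, reduction="sum", log_target=True).item()

# Placeholder prompt lists; a real run would use ~100 prompts of each kind.
unsafe_prompts = ["<refusal-eliciting prompt>"]
harmless_prompts = ["Write a Python function that merges two sorted lists."]

refusal_count = sum(refuses(ablit, p) for p in unsafe_prompts)
mean_kl = sum(first_token_kl(p) for p in harmless_prompts) / len(harmless_prompts)
print(f"Refusals: {refusal_count}/{len(unsafe_prompts)}  KL divergence: {mean_kl:.4f}")
```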

Parameters
direction_index = 15.59
attn.o_proj.max_weight = 1.14
attn.o_proj.max_weight_position = 19.08
attn.o_proj.min_weight = 0.64
attn.o_proj.min_weight_distance = 6.60
mlp.down_proj.max_weight = 1.03
mlp.down_proj.max_weight_position = 14.37
mlp.down_proj.min_weight = 0.84
mlp.down_proj.min_weight_distance = 9.71
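These are Heretic's per-component ablation settings. As a rough illustration of what such parameters could control, the sketch below assumes a per-layer ablation strength that peaks at max_weight_position and decays linearly to min_weight over min_weight_distance layers, and subtracts that fraction of the refusal-direction component from each matrix that writes into the residual stream. The actual kernel Heretic uses, and how the fractional direction_index selects or interpolates the refusal direction, may differ.

```python
# Illustrative sketch of directional ablation driven by per-layer weights.
# The linear fall-off kernel is an assumption about how max_weight,
# max_weight_position, min_weight and min_weight_distance combine; it is
# not necessarily Heretic's exact formula.
import torch

def layer_strength(layer: int, max_w: float, max_pos: float,
                   min_w: float, min_dist: float) -> float:
    """Ablation strength for one layer: max_w at max_pos, decaying linearly
    to min_w once the layer is min_dist (or more) layers away from max_pos."""
    d = abs(layer - max_pos)
    frac = min(d / min_dist, 1.0) if min_dist > 0 else 1.0
    return max_w + (min_w - max_w) * frac

def ablate(weight: torch.Tensor, direction: torch.Tensor, strength: float) -> torch.Tensor:
    """Subtract `strength` times the component of the matrix's output that lies
    along `direction` (nn.Linear convention: rows of `weight` are output dims)."""
    r = direction / direction.norm()
    return weight - strength * torch.outer(r, r) @ weight

# Example with this card's attn.o_proj numbers (hidden size and layer index are
# placeholders; the refusal direction would come from harmful-vs-harmless
# activation differences in the real procedure).
hidden = 2048
W_o_proj = torch.randn(hidden, hidden)
refusal_dir = torch.randn(hidden)
s = layer_strength(layer=12, max_w=1.14, max_pos=19.08, min_w=0.64, min_dist=6.60)
W_o_proj_ablated = ablate(W_o_proj, refusal_dir, s)
```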


Model size: 2B params Β· Tensor type: BF16 Β· Format: Safetensors

Model tree for hereticness/Heretic-OpenCoder-1.5B-Instruct

Finetuned from: infly/OpenCoder-1.5B-Instruct
Quantizations: 2 models