ssc-ady-mms-model-mix-adapt-max3

This model is a fine-tuned version of facebook/mms-1b-all on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5765
  • Cer: 0.1402
  • Wer: 0.6993
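
Cer and Wer are character and word error rates on the evaluation set (lower is better). The sketch below shows one minimal way to run inference with this checkpoint using the standard Transformers CTC classes for MMS-style models; the audio file path and 16 kHz resampling are assumptions for illustration, not details taken from this card.

```python
# Minimal inference sketch (assumed usage, not from the model card).
# MMS-style checkpoints are Wav2Vec2 CTC models and expect 16 kHz mono audio.
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "ctaguchi/ssc-ady-mms-model-mix-adapt-max3"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# "sample.wav" is a placeholder; resample to 16 kHz before feature extraction.
speech, _ = librosa.load("sample.wav", sr=16_000, mono=True)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```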

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0005
  • train_batch_size: 8
  • eval_batch_size: 6
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 16
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 10
  • mixed_precision_training: Native AMP
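
As a rough guide, these settings map onto Transformers `TrainingArguments` as sketched below; the `output_dir` value and `fp16=True` (standing in for "Native AMP") are assumptions rather than settings copied from the training script.

```python
# Hedged sketch: TrainingArguments mirroring the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ssc-ady-mms-model-mix-adapt-max3",  # assumed output directory
    learning_rate=5e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=6,
    gradient_accumulation_steps=2,   # effective train batch size: 8 * 2 = 16
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=10,
    fp16=True,                       # assumed stand-in for "Native AMP"
)
```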

Training results

| Training Loss | Epoch  | Step | Validation Loss | Cer    | Wer    |
|:-------------:|:------:|:----:|:---------------:|:------:|:------:|
| 0.9047        | 0.2710 | 200  | 0.6095          | 0.1972 | 0.8507 |
| 0.5954        | 0.5420 | 400  | 0.4688          | 0.1525 | 0.7410 |
| 0.5498        | 0.8130 | 600  | 0.4338          | 0.1466 | 0.7218 |
| 0.4841        | 1.0840 | 800  | 0.4005          | 0.1352 | 0.6921 |
| 0.4703        | 1.3550 | 1000 | 0.3867          | 0.1314 | 0.6815 |
| 0.4535        | 1.6260 | 1200 | 0.3772          | 0.1311 | 0.6870 |
| 0.4435        | 1.8970 | 1400 | 0.3557          | 0.1252 | 0.6715 |
| 0.4235        | 2.1680 | 1600 | 0.3442          | 0.1216 | 0.6588 |
| 0.4034        | 2.4390 | 1800 | 0.3496          | 0.1242 | 0.6621 |
| 0.3985        | 2.7100 | 2000 | 0.3347          | 0.1199 | 0.6497 |
| 0.3996        | 2.9810 | 2200 | 0.3365          | 0.1190 | 0.6456 |
| 0.3794        | 3.2520 | 2400 | 0.3270          | 0.1203 | 0.6463 |
| 0.3742        | 3.5230 | 2600 | 0.3236          | 0.1170 | 0.6393 |
| 0.3818        | 3.7940 | 2800 | 0.3254          | 0.1152 | 0.6338 |
| 0.3718        | 4.0650 | 3000 | 0.3336          | 0.1183 | 0.6314 |
| 0.3811        | 4.3360 | 3200 | 0.3318          | 0.1194 | 0.6358 |
| 0.3798        | 4.6070 | 3400 | 0.3212          | 0.1164 | 0.6310 |
| 0.3684        | 4.8780 | 3600 | 0.3324          | 0.1214 | 0.6554 |
| 0.3586        | 5.1491 | 3800 | 0.3205          | 0.1141 | 0.6185 |
| 0.3617        | 5.4201 | 4000 | 0.3262          | 0.1163 | 0.6293 |
| 0.395         | 5.6911 | 4200 | 0.3667          | 0.1171 | 0.6379 |
| 0.4125        | 5.9621 | 4400 | 0.3930          | 0.1217 | 0.6552 |
| 0.4441        | 6.2331 | 4600 | 0.4093          | 0.1314 | 0.6719 |
| 0.5343        | 6.5041 | 4800 | 0.4458          | 0.1372 | 0.6988 |
| 0.6275        | 6.7751 | 5000 | 0.5100          | 0.1455 | 0.7081 |
| 0.684         | 7.0461 | 5200 | 0.5860          | 0.1467 | 0.7232 |
| 0.6562        | 7.3171 | 5400 | 0.5362          | 0.1548 | 0.7259 |
| 0.6384        | 7.5881 | 5600 | 0.5052          | 0.1430 | 0.7050 |
| 0.621         | 7.8591 | 5800 | 0.5097          | 0.1475 | 0.7110 |
| 0.6395        | 8.1301 | 6000 | 0.5098          | 0.1451 | 0.7048 |
| 0.6656        | 8.4011 | 6200 | 0.5299          | 0.1447 | 0.7057 |
| 0.6787        | 8.6721 | 6400 | 0.5523          | 0.1523 | 0.7259 |
| 0.7118        | 8.9431 | 6600 | 0.5753          | 0.1401 | 0.6995 |
| 0.6841        | 9.2141 | 6800 | 0.5741          | 0.1401 | 0.6983 |
| 0.7023        | 9.4851 | 7000 | 0.5794          | 0.1388 | 0.6930 |
| 0.687         | 9.7561 | 7200 | 0.5765          | 0.1402 | 0.6993 |
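
Note that validation loss, Cer, and Wer all reach their best values around step 3800 (epoch 5) and degrade afterward, so an intermediate checkpoint outperforms the final one in this run. The Cer/Wer columns are typically produced by a `compute_metrics` hook using the `evaluate` library; the sketch below shows one common pattern and is an assumption about this setup, not code taken from it.

```python
# Hedged sketch: computing the Cer/Wer columns with the `evaluate` library.
# The processor argument is passed explicitly here for clarity; in a real
# Trainer setup, compute_metrics usually closes over the processor instead.
import numpy as np
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

def compute_metrics(pred, processor):
    pred_ids = np.argmax(pred.predictions, axis=-1)

    # Restore the pad token where labels were masked with -100 before decoding.
    label_ids = pred.label_ids.copy()
    label_ids[label_ids == -100] = processor.tokenizer.pad_token_id

    pred_str = processor.batch_decode(pred_ids)
    label_str = processor.batch_decode(label_ids, group_tokens=False)

    return {
        "wer": wer_metric.compute(predictions=pred_str, references=label_str),
        "cer": cer_metric.compute(predictions=pred_str, references=label_str),
    }
```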

Framework versions

  • Transformers 4.52.1
  • Pytorch 2.9.1+cu128
  • Datasets 3.6.0
  • Tokenizers 0.21.4