CWE guessing

This model is a fine-tuned version of roberta-base on the CIRCL/vulnerability-cwe-patch dataset.

The goal is to predict CWE categories from Git commit messages and vulnerability descriptions. Predicted child CWEs are mapped to their parent CWEs if applicable.
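 
The snippet below is a minimal inference sketch using the transformers text-classification pipeline; it is not the CIRCL evaluation code. The example commit message is made up, and it assumes the labels stored in the model config already carry the parent CWE identifiers described above.

```python
# Minimal usage sketch (assumption: the model's configured labels are parent CWE IDs).
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="CIRCL/cwe-parent-vulnerability-classification-roberta-base",
)

# Illustrative input: a commit message / vulnerability description.
text = "Fix buffer overflow when parsing overly long HTTP headers"

# Top 3 parent-CWE candidates with their scores.
for pred in classifier(text, top_k=3):
    print(pred["label"], round(pred["score"], 3))
```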

It achieves the following results on the evaluation set (see the metric sketch after the list):

  • Loss: 1.7510
  • Accuracy: 0.5455
  • F1 Macro: 0.3776
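 
For reference, here is a hedged sketch of how these two metrics are conventionally computed, using standard scikit-learn definitions and made-up labels; the actual evaluation script is not shown in this card.

```python
# Accuracy and macro-averaged F1 over predicted vs. true parent-CWE labels.
# The arrays below are illustrative, not the real evaluation set.
from sklearn.metrics import accuracy_score, f1_score

y_true = ["CWE-20", "CWE-79", "CWE-119", "CWE-79"]   # hypothetical gold labels
y_pred = ["CWE-20", "CWE-79", "CWE-79",  "CWE-79"]   # hypothetical predictions

print("accuracy:", accuracy_score(y_true, y_pred))             # fraction of exact matches
print("f1_macro:", f1_score(y_true, y_pred, average="macro"))  # unweighted mean of per-class F1
```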

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after the list):

  • learning_rate: 1e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: AdamW (adamw_torch_fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 40
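 
These settings correspond roughly to the Hugging Face TrainingArguments below. This is a reconstruction from the list above, not the original training script; the output directory is a placeholder, and any option not listed (e.g. warmup, weight decay) is left at its default.

```python
# Reconstruction of the listed hyperparameters as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="cwe-parent-roberta-base",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch_fused",             # AdamW, betas=(0.9, 0.999), eps=1e-8 (defaults)
    lr_scheduler_type="linear",
    num_train_epochs=40,
)
```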

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro |
|--------------:|------:|-----:|----------------:|---------:|---------:|
| 3.226         | 1.0   | 125  | 3.1362          | 0.0382   | 0.0035   |
| 3.0244        | 2.0   | 250  | 2.9390          | 0.2155   | 0.1215   |
| 2.589         | 3.0   | 375  | 2.3469          | 0.4141   | 0.2521   |
| 2.1614        | 4.0   | 500  | 2.0701          | 0.4355   | 0.2551   |
| 1.8396        | 5.0   | 625  | 1.9336          | 0.4467   | 0.2748   |
| 1.5698        | 6.0   | 750  | 1.9086          | 0.4905   | 0.2938   |
| 1.4142        | 7.0   | 875  | 1.7933          | 0.5174   | 0.3416   |
| 1.2292        | 8.0   | 1000 | 1.7510          | 0.5455   | 0.3776   |
| 1.1182        | 9.0   | 1125 | 1.7681          | 0.5713   | 0.3803   |
| 0.9924        | 10.0  | 1250 | 1.8151          | 0.6083   | 0.4059   |
| 0.9307        | 11.0  | 1375 | 1.8391          | 0.6218   | 0.4379   |
| 0.7875        | 12.0  | 1500 | 1.8065          | 0.6038   | 0.4048   |
| 0.6308        | 13.0  | 1625 | 1.9221          | 0.6409   | 0.4210   |
| 0.7327        | 14.0  | 1750 | 1.9986          | 0.6465   | 0.4775   |
| 0.5175        | 15.0  | 1875 | 2.0520          | 0.6644   | 0.4316   |
| 0.5302        | 16.0  | 2000 | 2.0989          | 0.6712   | 0.4528   |
| 0.38          | 17.0  | 2125 | 2.0826          | 0.6734   | 0.4669   |
| 0.3768        | 18.0  | 2250 | 2.1953          | 0.6611   | 0.4544   |
| 0.3653        | 19.0  | 2375 | 2.2217          | 0.6880   | 0.5000   |
| 0.3349        | 20.0  | 2500 | 2.1911          | 0.6880   | 0.4951   |
| 0.2563        | 21.0  | 2625 | 2.2999          | 0.6813   | 0.4771   |
| 0.2513        | 22.0  | 2750 | 2.4158          | 0.7037   | 0.4640   |
| 0.2154        | 23.0  | 2875 | 2.4323          | 0.7138   | 0.4689   |
| 0.1889        | 24.0  | 3000 | 2.4296          | 0.7037   | 0.4733   |
| 0.2042        | 25.0  | 3125 | 2.5223          | 0.7071   | 0.4411   |
| 0.1774        | 26.0  | 3250 | 2.5476          | 0.7037   | 0.5083   |
| 0.156         | 27.0  | 3375 | 2.5737          | 0.7205   | 0.5236   |
| 0.1406        | 28.0  | 3500 | 2.6518          | 0.7048   | 0.5220   |
| 0.144         | 29.0  | 3625 | 2.6388          | 0.7015   | 0.4789   |
| 0.1119        | 30.0  | 3750 | 2.7159          | 0.7228   | 0.5003   |
| 0.1187        | 31.0  | 3875 | 2.7170          | 0.7071   | 0.4973   |
| 0.1095        | 32.0  | 4000 | 2.7796          | 0.7160   | 0.4707   |
| 0.1082        | 33.0  | 4125 | 2.7926          | 0.7239   | 0.5038   |
| 0.0976        | 34.0  | 4250 | 2.8240          | 0.7149   | 0.4515   |
| 0.0885        | 35.0  | 4375 | 2.8532          | 0.7149   | 0.4466   |
| 0.0872        | 36.0  | 4500 | 2.8697          | 0.7183   | 0.4700   |
| 0.0795        | 37.0  | 4625 | 2.8467          | 0.7138   | 0.4994   |
| 0.0878        | 38.0  | 4750 | 2.8566          | 0.7104   | 0.4673   |
| 0.0886        | 39.0  | 4875 | 2.8951          | 0.7127   | 0.4667   |
| 0.086         | 40.0  | 5000 | 2.8841          | 0.7127   | 0.4683   |

Framework versions

  • Transformers 4.57.3
  • Pytorch 2.9.1+cu128
  • Datasets 4.4.2
  • Tokenizers 0.22.2