---
license: mit
base_model: roberta-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: best_model-yelp_polarity-64-42
  results: []
---

# best_model-yelp_polarity-64-42

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unknown dataset (the model name suggests a small yelp_polarity subset trained with seed 42). It achieves the following results on the evaluation set:

- Loss: 0.8838
- Accuracy: 0.9141

## Model description

More information needed

## Intended uses & limitations

More information needed
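
The card ships no usage snippet; below is a minimal inference sketch using the `transformers` pipeline API. The repo id `simonycl/best_model-yelp_polarity-64-42` is inferred from the card's author and title, and the label names are assumptions (the card does not say whether `id2label` was configured).

```python
from transformers import pipeline

# Assumed repo id; replace with the actual hub path of this checkpoint.
classifier = pipeline(
    "text-classification",
    model="simonycl/best_model-yelp_polarity-64-42",
)

print(classifier("The food was amazing and the staff were friendly."))
# Expected shape: [{'label': ..., 'score': ...}]; labels may show as
# LABEL_0/LABEL_1 unless id2label was set during fine-tuning.
```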

## Training and evaluation data

More information needed
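
Nothing is stated here, but the model name and the training log (4 optimizer steps per epoch at batch size 32, i.e. roughly 128 training examples) are consistent with a 64-examples-per-class subset of yelp_polarity sampled with seed 42. The sketch below shows that hypothetical setup; it is an inference, not documented provenance.

```python
from datasets import concatenate_datasets, load_dataset

# Hypothetical reconstruction: 64 examples per class from yelp_polarity,
# shuffled with seed 42. Not confirmed by the card.
raw = load_dataset("yelp_polarity", split="train")
subsets = [
    raw.filter(lambda ex, lab=label: ex["label"] == lab)
       .shuffle(seed=42)
       .select(range(64))
    for label in (0, 1)
]
train_ds = concatenate_datasets(subsets).shuffle(seed=42)
print(train_ds)  # 128 rows with 'text' and 'label' columns
```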

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged reproduction sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 150
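
These settings map directly onto `transformers.TrainingArguments`; the sketch below is an illustrative reconstruction, where `output_dir` and anything not listed above are placeholders rather than values from the original run.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="best_model-yelp_polarity-64-42",  # placeholder, not from the card
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,             # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=150,
)
```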

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 4    | 1.0394          | 0.9141   |
| No log        | 2.0   | 8    | 1.0413          | 0.9141   |
| 0.5047        | 3.0   | 12   | 1.0408          | 0.9141   |
| 0.5047        | 4.0   | 16   | 1.0386          | 0.9141   |
| 0.4566        | 5.0   | 20   | 1.0336          | 0.9141   |
| 0.4566        | 6.0   | 24   | 1.0248          | 0.9141   |
| 0.4566        | 7.0   | 28   | 1.0128          | 0.9141   |
| 0.4026        | 8.0   | 32   | 1.0000          | 0.9141   |
| 0.4026        | 9.0   | 36   | 0.9823          | 0.9141   |
| 0.3103        | 10.0  | 40   | 0.9632          | 0.9141   |
| 0.3103        | 11.0  | 44   | 0.9553          | 0.9219   |
| 0.3103        | 12.0  | 48   | 0.9610          | 0.9141   |
| 0.2537        | 13.0  | 52   | 0.9575          | 0.9141   |
| 0.2537        | 14.0  | 56   | 0.9497          | 0.9141   |
| 0.1335        | 15.0  | 60   | 0.9510          | 0.9141   |
| 0.1335        | 16.0  | 64   | 0.9465          | 0.9141   |
| 0.1335        | 17.0  | 68   | 0.9379          | 0.9141   |
| 0.0655        | 18.0  | 72   | 0.9312          | 0.9141   |
| 0.0655        | 19.0  | 76   | 0.9317          | 0.9141   |
| 0.051         | 20.0  | 80   | 0.9246          | 0.9141   |
| 0.051         | 21.0  | 84   | 0.9026          | 0.9141   |
| 0.051         | 22.0  | 88   | 0.8836          | 0.9141   |
| 0.0012        | 23.0  | 92   | 0.8697          | 0.9141   |
| 0.0012        | 24.0  | 96   | 0.8588          | 0.9141   |
| 0.0003        | 25.0  | 100  | 0.8458          | 0.9141   |
| 0.0003        | 26.0  | 104  | 0.8323          | 0.9141   |
| 0.0003        | 27.0  | 108  | 0.8499          | 0.9141   |
| 0.0019        | 28.0  | 112  | 0.8750          | 0.9219   |
| 0.0019        | 29.0  | 116  | 0.8897          | 0.9219   |
| 0.0           | 30.0  | 120  | 0.8943          | 0.9219   |
| 0.0           | 31.0  | 124  | 0.8570          | 0.9219   |
| 0.0           | 32.0  | 128  | 0.8162          | 0.9219   |
| 0.0065        | 33.0  | 132  | 0.8156          | 0.9141   |
| 0.0065        | 34.0  | 136  | 0.8147          | 0.9141   |
| 0.0137        | 35.0  | 140  | 0.8191          | 0.9219   |
| 0.0137        | 36.0  | 144  | 0.8258          | 0.9219   |
| 0.0137        | 37.0  | 148  | 0.8316          | 0.9141   |
| 0.0           | 38.0  | 152  | 0.8362          | 0.9219   |
| 0.0           | 39.0  | 156  | 0.8188          | 0.9141   |
| 0.0001        | 40.0  | 160  | 0.8255          | 0.9141   |
| 0.0001        | 41.0  | 164  | 0.8535          | 0.9062   |
| 0.0001        | 42.0  | 168  | 0.8499          | 0.9062   |
| 0.0017        | 43.0  | 172  | 0.8184          | 0.9141   |
| 0.0017        | 44.0  | 176  | 0.8120          | 0.9297   |
| 0.0           | 45.0  | 180  | 0.8277          | 0.9219   |
| 0.0           | 46.0  | 184  | 0.8434          | 0.9219   |
| 0.0           | 47.0  | 188  | 0.8535          | 0.9219   |
| 0.0           | 48.0  | 192  | 0.8597          | 0.9219   |
| 0.0           | 49.0  | 196  | 0.8633          | 0.9219   |
| 0.0           | 50.0  | 200  | 0.8651          | 0.9219   |
| 0.0           | 51.0  | 204  | 0.8617          | 0.9219   |
| 0.0           | 52.0  | 208  | 0.8571          | 0.9219   |
| 0.0           | 53.0  | 212  | 0.8538          | 0.9219   |
| 0.0           | 54.0  | 216  | 0.8514          | 0.9219   |
| 0.0           | 55.0  | 220  | 0.8346          | 0.9219   |
| 0.0           | 56.0  | 224  | 0.8153          | 0.9219   |
| 0.0           | 57.0  | 228  | 0.8087          | 0.9219   |
| 0.0           | 58.0  | 232  | 0.8083          | 0.9141   |
| 0.0           | 59.0  | 236  | 0.8168          | 0.9141   |
| 0.0002        | 60.0  | 240  | 0.8424          | 0.9141   |
| 0.0002        | 61.0  | 244  | 0.8614          | 0.9141   |
| 0.0002        | 62.0  | 248  | 0.8736          | 0.9141   |
| 0.0           | 63.0  | 252  | 0.8817          | 0.9141   |
| 0.0           | 64.0  | 256  | 0.8848          | 0.9141   |
| 0.0           | 65.0  | 260  | 0.8876          | 0.9141   |
| 0.0           | 66.0  | 264  | 0.8896          | 0.9141   |
| 0.0           | 67.0  | 268  | 0.8868          | 0.9141   |
| 0.0           | 68.0  | 272  | 0.8831          | 0.9141   |
| 0.0           | 69.0  | 276  | 0.8792          | 0.9141   |
| 0.0001        | 70.0  | 280  | 0.8107          | 0.9141   |
| 0.0001        | 71.0  | 284  | 0.9166          | 0.9219   |
| 0.0001        | 72.0  | 288  | 0.8786          | 0.9219   |
| 0.0232        | 73.0  | 292  | 0.8429          | 0.9219   |
| 0.0232        | 74.0  | 296  | 0.8228          | 0.9297   |
| 0.0           | 75.0  | 300  | 0.8332          | 0.9219   |
| 0.0           | 76.0  | 304  | 0.8651          | 0.9062   |
| 0.0           | 77.0  | 308  | 0.8879          | 0.9062   |
| 0.0           | 78.0  | 312  | 0.9017          | 0.9062   |
| 0.0           | 79.0  | 316  | 0.9093          | 0.9062   |
| 0.0           | 80.0  | 320  | 0.9133          | 0.9062   |
| 0.0           | 81.0  | 324  | 0.9160          | 0.9062   |
| 0.0           | 82.0  | 328  | 0.9180          | 0.9062   |
| 0.0           | 83.0  | 332  | 0.9192          | 0.9062   |
| 0.0           | 84.0  | 336  | 0.9196          | 0.9062   |
| 0.0           | 85.0  | 340  | 0.9209          | 0.9062   |
| 0.0           | 86.0  | 344  | 0.9250          | 0.9062   |
| 0.0           | 87.0  | 348  | 0.9289          | 0.9062   |
| 0.0           | 88.0  | 352  | 0.9314          | 0.9062   |
| 0.0           | 89.0  | 356  | 0.9330          | 0.9062   |
| 0.0           | 90.0  | 360  | 0.9340          | 0.9062   |
| 0.0           | 91.0  | 364  | 0.9346          | 0.9062   |
| 0.0           | 92.0  | 368  | 0.9348          | 0.9062   |
| 0.0           | 93.0  | 372  | 0.9351          | 0.9062   |
| 0.0           | 94.0  | 376  | 0.9354          | 0.9062   |
| 0.0           | 95.0  | 380  | 0.9355          | 0.9062   |
| 0.0           | 96.0  | 384  | 0.9354          | 0.9062   |
| 0.0           | 97.0  | 388  | 0.9339          | 0.9062   |
| 0.0           | 98.0  | 392  | 0.9310          | 0.9062   |
| 0.0           | 99.0  | 396  | 0.9290          | 0.9062   |
| 0.0           | 100.0 | 400  | 0.9276          | 0.9062   |
| 0.0           | 101.0 | 404  | 0.9271          | 0.9062   |
| 0.0           | 102.0 | 408  | 0.9274          | 0.9062   |
| 0.0           | 103.0 | 412  | 0.9277          | 0.9062   |
| 0.0           | 104.0 | 416  | 0.9282          | 0.9062   |
| 0.0           | 105.0 | 420  | 0.9285          | 0.9062   |
| 0.0           | 106.0 | 424  | 0.9289          | 0.9062   |
| 0.0           | 107.0 | 428  | 0.9293          | 0.9062   |
| 0.0           | 108.0 | 432  | 0.9297          | 0.9062   |
| 0.0           | 109.0 | 436  | 0.9296          | 0.9062   |
| 0.0           | 110.0 | 440  | 0.9297          | 0.9062   |
| 0.0           | 111.0 | 444  | 0.9328          | 0.9062   |
| 0.0           | 112.0 | 448  | 0.9376          | 0.9062   |
| 0.0           | 113.0 | 452  | 0.9408          | 0.9062   |
| 0.0           | 114.0 | 456  | 0.9428          | 0.9062   |
| 0.0           | 115.0 | 460  | 0.9442          | 0.9062   |
| 0.0           | 116.0 | 464  | 0.9455          | 0.9062   |
| 0.0           | 117.0 | 468  | 0.9464          | 0.9062   |
| 0.0           | 118.0 | 472  | 0.9470          | 0.9062   |
| 0.0           | 119.0 | 476  | 0.9478          | 0.9062   |
| 0.0           | 120.0 | 480  | 0.9487          | 0.9062   |
| 0.0           | 121.0 | 484  | 0.9492          | 0.9062   |
| 0.0           | 122.0 | 488  | 0.9496          | 0.9062   |
| 0.0           | 123.0 | 492  | 0.9499          | 0.9062   |
| 0.0           | 124.0 | 496  | 0.9504          | 0.9062   |
| 0.0           | 125.0 | 500  | 0.9505          | 0.9062   |
| 0.0           | 126.0 | 504  | 0.9507          | 0.9062   |
| 0.0           | 127.0 | 508  | 0.9509          | 0.9062   |
| 0.0           | 128.0 | 512  | 0.9504          | 0.9062   |
| 0.0           | 129.0 | 516  | 0.9502          | 0.9062   |
| 0.0           | 130.0 | 520  | 0.9500          | 0.9062   |
| 0.0           | 131.0 | 524  | 0.9497          | 0.9062   |
| 0.0           | 132.0 | 528  | 0.9496          | 0.9062   |
| 0.0           | 133.0 | 532  | 0.9496          | 0.9062   |
| 0.0           | 134.0 | 536  | 0.9498          | 0.9062   |
| 0.0           | 135.0 | 540  | 0.9502          | 0.9062   |
| 0.0           | 136.0 | 544  | 0.9398          | 0.9062   |
| 0.0           | 137.0 | 548  | 0.9199          | 0.9062   |
| 0.0           | 138.0 | 552  | 0.9047          | 0.9062   |
| 0.0           | 139.0 | 556  | 0.8950          | 0.9141   |
| 0.0           | 140.0 | 560  | 0.8894          | 0.9141   |
| 0.0           | 141.0 | 564  | 0.8862          | 0.9141   |
| 0.0           | 142.0 | 568  | 0.8846          | 0.9141   |
| 0.0           | 143.0 | 572  | 0.8840          | 0.9141   |
| 0.0           | 144.0 | 576  | 0.8837          | 0.9141   |
| 0.0           | 145.0 | 580  | 0.8836          | 0.9141   |
| 0.0           | 146.0 | 584  | 0.8836          | 0.9141   |
| 0.0           | 147.0 | 588  | 0.8837          | 0.9141   |
| 0.0           | 148.0 | 592  | 0.8838          | 0.9141   |
| 0.0           | 149.0 | 596  | 0.8838          | 0.9141   |
| 0.0           | 150.0 | 600  | 0.8838          | 0.9141   |

### Framework versions

- Transformers 4.32.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.4.0
- Tokenizers 0.13.3