fajrikoto/id_liputan6
How to use Alfahluzi/bert2bert-Large with Transformers:

```python
# Load the tokenizer and model directly from the Hugging Face Hub
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Alfahluzi/bert2bert-Large")
model = AutoModelForSeq2SeqLM.from_pretrained("Alfahluzi/bert2bert-Large")
```

This model was trained from scratch on the id_liputan6 dataset. Its results on the evaluation set are reported in the training table below.
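As a sketch of how the loaded checkpoint might be used for summarization: the input article, truncation length, and generation parameters below are illustrative assumptions, not values documented by this model card.

```python
# Illustrative sketch: summarize an Indonesian news article with the
# Alfahluzi/bert2bert-Large checkpoint. The article text and all
# generation settings (max lengths, beam count) are assumptions,
# not values taken from the model card.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "Alfahluzi/bert2bert-Large"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Placeholder article; in practice this would be a full Liputan6 news item.
article = "Liputan6.com, Jakarta: Banjir melanda sejumlah wilayah ibu kota."

inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(
    **inputs,
    max_length=80,
    num_beams=4,
    early_stopping=True,
)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
```

Beam search is a common default for abstractive summarization, but given the near-zero ROUGE-2 scores in the table below, output quality from this checkpoint may be limited.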
## Model description

More information needed

## Intended uses and limitations

More information needed

## Training and evaluation data

More information needed
## Training results

In the table below, R1, R2, and Rl denote ROUGE-1, ROUGE-2, and ROUGE-L, each reported as precision, recall, and F-measure:
| Training Loss | Epoch | Step | Validation Loss | R1 Precision | R1 Recall | R1 Fmeasure | R2 Precision | R2 Recall | R2 Fmeasure | Rl Precision | Rl Recall | Rl Fmeasure |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 7.0391 | 1.0 | 96942 | 8.9423 | 0.0188 | 0.0105 | 0.0133 | 0.0 | 0.0 | 0.0 | 0.0188 | 0.0106 | 0.0133 |
| 7.0611 | 2.0 | 193884 | 8.9023 | 0.0188 | 0.0105 | 0.0133 | 0.0 | 0.0 | 0.0 | 0.0188 | 0.0106 | 0.0133 |
| 7.0266 | 3.0 | 290826 | 9.4047 | 0.0188 | 0.0105 | 0.0133 | 0.0 | 0.0 | 0.0 | 0.0188 | 0.0106 | 0.0133 |
| 7.0237 | 4.0 | 387768 | 9.2888 | 0.0188 | 0.0105 | 0.0133 | 0.0 | 0.0 | 0.0 | 0.0188 | 0.0106 | 0.0133 |
| 6.9911 | 5.0 | 484710 | 9.4148 | 0.0188 | 0.0105 | 0.0133 | 0.0 | 0.0 | 0.0 | 0.0188 | 0.0106 | 0.0133 |
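For reference, the R1 precision, recall, and F-measure columns follow the standard unigram-overlap definitions. Here is a minimal pure-Python sketch of those definitions; the card does not say which ROUGE implementation produced the table, so this is not the code behind the numbers above.

```python
from collections import Counter

def rouge1(prediction: str, reference: str):
    """ROUGE-1 precision, recall, and F-measure over unigrams.

    Minimal sketch of the standard definitions; the model card does not
    specify which ROUGE implementation produced the reported scores.
    """
    pred_tokens = prediction.lower().split()
    ref_tokens = reference.lower().split()
    # Clipped unigram overlap: each reference token matches at most once.
    overlap = sum((Counter(pred_tokens) & Counter(ref_tokens)).values())
    precision = overlap / len(pred_tokens) if pred_tokens else 0.0
    recall = overlap / len(ref_tokens) if ref_tokens else 0.0
    fmeasure = (
        2 * precision * recall / (precision + recall)
        if precision + recall
        else 0.0
    )
    return precision, recall, fmeasure

p, r, f = rouge1("banjir melanda jakarta", "banjir besar melanda kota jakarta")
print(p, r, f)  # precision 1.0, recall 0.6, F-measure 0.75
```

ROUGE-2 and ROUGE-L differ only in the unit compared (bigrams and longest common subsequence, respectively); a ROUGE-2 of 0.0 across every epoch, as in the table, means the generated summaries share no bigram with the references.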