apwic committed on
Commit 77e0cce · verified · 1 Parent(s): ad73945

Model save

README.md CHANGED
@@ -1,6 +1,4 @@
 ---
-language:
-- id
 license: apache-2.0
 base_model: LazarusNLP/IndoNanoT5-base
 tags:
@@ -19,11 +17,11 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [LazarusNLP/IndoNanoT5-base](https://huggingface.co/LazarusNLP/IndoNanoT5-base) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.5917
-- Rouge1: 0.3832
+- Loss: 0.5212
+- Rouge1: 0.6659
 - Rouge2: 0.0
-- Rougel: 0.3816
-- Rougelsum: 0.3824
+- Rougel: 0.6648
+- Rougelsum: 0.6661
 - Gen Len: 1.0
 
 ## Model description
@@ -43,8 +41,8 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- learning_rate: 5e-05
-- train_batch_size: 8
+- learning_rate: 0.001
+- train_batch_size: 16
 - eval_batch_size: 32
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
@@ -55,16 +53,16 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
 |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
-| 1.2253 | 1.0 | 1784 | 0.6577 | 0.4006 | 0.0 | 0.3969 | 0.4007 | 1.0 |
-| 0.7893 | 2.0 | 3568 | 0.6133 | 0.4188 | 0.0 | 0.4167 | 0.4194 | 1.0 |
-| 0.7351 | 3.0 | 5352 | 0.6062 | 0.3925 | 0.0 | 0.3898 | 0.3908 | 1.0 |
-| 0.7092 | 4.0 | 7136 | 0.5990 | 0.3834 | 0.0 | 0.381 | 0.3826 | 1.0 |
-| 0.6978 | 5.0 | 8920 | 0.5917 | 0.3832 | 0.0 | 0.3816 | 0.3824 | 1.0 |
+| 0.8224 | 1.0 | 892 | 0.5704 | 0.6572 | 0.0 | 0.6564 | 0.655 | 1.0 |
+| 0.6196 | 2.0 | 1784 | 0.5431 | 0.6602 | 0.0 | 0.6579 | 0.6606 | 1.0 |
+| 0.5778 | 3.0 | 2676 | 0.5373 | 0.6757 | 0.0 | 0.6756 | 0.6745 | 1.0 |
+| 0.5503 | 4.0 | 3568 | 0.5256 | 0.659 | 0.0 | 0.6569 | 0.6586 | 1.0 |
+| 0.5343 | 5.0 | 4460 | 0.5212 | 0.6659 | 0.0 | 0.6648 | 0.6661 | 1.0 |
 
 
 ### Framework versions
 
 - Transformers 4.40.2
-- Pytorch 2.3.0+cu121
-- Datasets 2.19.1
+- Pytorch 2.3.1+cu121
+- Datasets 2.20.0
 - Tokenizers 0.19.1
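The updated hyperparameters above can be restated as a `Seq2SeqTrainingArguments` object from `transformers`; the sketch below is illustrative only and assumes the card was generated by the Hugging Face `Seq2SeqTrainer`. The output directory, evaluation strategy, and `predict_with_generate` flag are assumptions not recorded in this commit, and the epoch count is read off the five-row results table.

```python
# Illustrative sketch only: the hyperparameter values come from the README diff above;
# output_dir, evaluation_strategy and predict_with_generate are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="adapter-summarization",  # placeholder, not from the commit
    learning_rate=1e-3,                  # 0.001 after this commit (was 5e-05)
    per_device_train_batch_size=16,      # was 8 before this commit
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=5,                  # inferred from the 5-epoch results table
    adam_beta1=0.9,                      # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    evaluation_strategy="epoch",         # assumption: metrics are reported once per epoch
    predict_with_generate=True,          # assumption: needed to compute ROUGE at eval time
)
```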
adapter-summarization/adapter_config.json CHANGED
@@ -12,11 +12,11 @@
     "intermediate_lora": false,
     "leave_out": [],
     "output_lora": false,
-    "r": 16,
+    "r": 8,
     "selfattn_lora": true,
     "use_gating": false
   },
-  "config_id": "141b248112091265",
+  "config_id": "625403edad0bf919",
   "hidden_size": 768,
   "model_class": "T5ForConditionalGeneration",
   "model_name": "LazarusNLP/IndoNanoT5-base",
adapter-summarization/pytorch_adapter.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:17acf4245cdcf518875c3303529cd8b5c0b3942a274ceb923e4f9b50ad65788e
-size 7131954
+oid sha256:c691aedf978ac1ae4e1d2172c5c846f2ebdd845a690176236a336f1b3d589b5f
+size 3593010
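The `pytorch_adapter.bin` entry is a Git LFS pointer, so the diff records only the new content hash and size (3,593,010 bytes versus the previous 7,131,954, in line with the rank change above). A small sketch of verifying a downloaded copy against the pointer's sha256 oid, assuming the file has already been fetched locally (e.g. via `git lfs pull`):

```python
# Sketch only: check a locally downloaded pytorch_adapter.bin against the
# sha256 oid recorded in the Git LFS pointer above. The local path is an assumption.
import hashlib

EXPECTED_OID = "c691aedf978ac1ae4e1d2172c5c846f2ebdd845a690176236a336f1b3d589b5f"
path = "adapter-summarization/pytorch_adapter.bin"

digest = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        digest.update(chunk)

print(digest.hexdigest() == EXPECTED_OID)  # True if the file matches the pointer
```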