fydhfzh committed
Commit 229be43 · verified · 1 Parent(s): fbbb834

End of training

README.md CHANGED
@@ -20,12 +20,12 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [facebook/hubert-base-ls960](https://huggingface.co/facebook/hubert-base-ls960) on an unknown dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.7051
- - Accuracy: 0.8518
- - Precision: 0.8672
- - Recall: 0.8518
- - F1: 0.8505
- - Binary: 0.8969
+ - Loss: 0.6214
+ - Accuracy: 0.8544
+ - Precision: 0.8720
+ - Recall: 0.8544
+ - F1: 0.8540
+ - Binary: 0.8989
 
 ## Model description
 
@@ -53,90 +53,69 @@ The following hyperparameters were used during training:
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_steps: 500
- - num_epochs: 30
+ - num_epochs: 100
 - mixed_precision_training: Native AMP
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | Binary |
 |:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:------:|
- | No log | 0.24 | 50 | 4.4231 | 0.0187 | 0.0087 | 0.0187 | 0.0090 | 0.1405 |
- | No log | 0.48 | 100 | 4.3324 | 0.0397 | 0.0070 | 0.0397 | 0.0101 | 0.2470 |
- | No log | 0.72 | 150 | 4.0246 | 0.0457 | 0.0035 | 0.0457 | 0.0063 | 0.3103 |
- | No log | 0.96 | 200 | 3.6896 | 0.0652 | 0.0162 | 0.0652 | 0.0211 | 0.3422 |
- | 4.2547 | 1.2 | 250 | 3.4376 | 0.1094 | 0.0679 | 0.1094 | 0.0556 | 0.3737 |
- | 4.2547 | 1.44 | 300 | 3.2127 | 0.1468 | 0.0576 | 0.1468 | 0.0716 | 0.3993 |
- | 4.2547 | 1.68 | 350 | 3.0518 | 0.1925 | 0.1514 | 0.1925 | 0.1259 | 0.4270 |
- | 4.2547 | 1.92 | 400 | 2.7179 | 0.3056 | 0.2625 | 0.3056 | 0.2304 | 0.5085 |
- | 3.3165 | 2.16 | 450 | 2.4934 | 0.3843 | 0.3663 | 0.3843 | 0.3135 | 0.5601 |
- | 3.3165 | 2.4 | 500 | 2.0557 | 0.4787 | 0.4309 | 0.4787 | 0.4108 | 0.6343 |
- | 3.3165 | 2.63 | 550 | 1.8031 | 0.5453 | 0.5250 | 0.5453 | 0.4868 | 0.6815 |
- | 3.3165 | 2.87 | 600 | 1.5255 | 0.6270 | 0.6116 | 0.6270 | 0.5838 | 0.7378 |
- | 2.3205 | 3.11 | 650 | 1.4278 | 0.6382 | 0.6426 | 0.6382 | 0.6058 | 0.7460 |
- | 2.3205 | 3.35 | 700 | 1.2531 | 0.6644 | 0.6893 | 0.6644 | 0.6271 | 0.7651 |
- | 2.3205 | 3.59 | 750 | 1.1667 | 0.7109 | 0.7276 | 0.7109 | 0.6915 | 0.7969 |
- | 2.3205 | 3.83 | 800 | 1.0486 | 0.7236 | 0.7592 | 0.7236 | 0.7097 | 0.8074 |
- | 1.5956 | 4.07 | 850 | 0.9980 | 0.7378 | 0.7649 | 0.7378 | 0.7236 | 0.8168 |
- | 1.5956 | 4.31 | 900 | 0.8612 | 0.7700 | 0.7777 | 0.7700 | 0.7560 | 0.8390 |
- | 1.5956 | 4.55 | 950 | 0.8535 | 0.7461 | 0.7785 | 0.7461 | 0.7371 | 0.8214 |
- | 1.5956 | 4.79 | 1000 | 0.8524 | 0.7760 | 0.7950 | 0.7760 | 0.7706 | 0.8443 |
- | 1.2382 | 5.03 | 1050 | 0.7503 | 0.8045 | 0.8211 | 0.8045 | 0.8005 | 0.8622 |
- | 1.2382 | 5.27 | 1100 | 0.7383 | 0.8015 | 0.8168 | 0.8015 | 0.7983 | 0.8611 |
- | 1.2382 | 5.51 | 1150 | 0.7109 | 0.8172 | 0.8345 | 0.8172 | 0.8153 | 0.8726 |
- | 1.2382 | 5.75 | 1200 | 0.7191 | 0.8037 | 0.8216 | 0.8037 | 0.8000 | 0.8634 |
- | 1.2382 | 5.99 | 1250 | 0.6882 | 0.8045 | 0.8266 | 0.8045 | 0.8065 | 0.8655 |
- | 1.0067 | 6.23 | 1300 | 0.6268 | 0.8404 | 0.8525 | 0.8404 | 0.8409 | 0.8885 |
- | 1.0067 | 6.47 | 1350 | 0.7362 | 0.8127 | 0.8294 | 0.8127 | 0.8096 | 0.8695 |
- | 1.0067 | 6.71 | 1400 | 0.6509 | 0.8270 | 0.8379 | 0.8270 | 0.8246 | 0.8792 |
- | 1.0067 | 6.95 | 1450 | 0.5719 | 0.8487 | 0.8634 | 0.8487 | 0.8485 | 0.8945 |
- | 0.8843 | 7.19 | 1500 | 0.6516 | 0.8360 | 0.8530 | 0.8360 | 0.8356 | 0.8849 |
- | 0.8843 | 7.43 | 1550 | 0.6196 | 0.8434 | 0.8568 | 0.8434 | 0.8414 | 0.8894 |
- | 0.8843 | 7.66 | 1600 | 0.6293 | 0.8367 | 0.8468 | 0.8367 | 0.8348 | 0.8865 |
- | 0.8843 | 7.9 | 1650 | 0.6200 | 0.8419 | 0.8541 | 0.8419 | 0.8402 | 0.8894 |
- | 0.7743 | 8.14 | 1700 | 0.6118 | 0.8464 | 0.8603 | 0.8464 | 0.8467 | 0.8923 |
- | 0.7743 | 8.38 | 1750 | 0.5813 | 0.8502 | 0.8630 | 0.8502 | 0.8500 | 0.8955 |
- | 0.7743 | 8.62 | 1800 | 0.7003 | 0.8300 | 0.8516 | 0.8300 | 0.8305 | 0.8822 |
- | 0.7743 | 8.86 | 1850 | 0.6435 | 0.8502 | 0.8626 | 0.8502 | 0.8493 | 0.8968 |
- | 0.6985 | 9.1 | 1900 | 0.6353 | 0.8517 | 0.8615 | 0.8517 | 0.8495 | 0.8966 |
- | 0.6985 | 9.34 | 1950 | 0.6301 | 0.8509 | 0.8643 | 0.8509 | 0.8477 | 0.8956 |
- | 0.6985 | 9.58 | 2000 | 0.5756 | 0.8524 | 0.8627 | 0.8524 | 0.8510 | 0.8963 |
- | 0.6985 | 9.82 | 2050 | 0.6095 | 0.8547 | 0.8637 | 0.8547 | 0.8539 | 0.8987 |
- | 0.6334 | 10.06 | 2100 | 0.5589 | 0.8667 | 0.8753 | 0.8667 | 0.8664 | 0.9066 |
- | 0.6334 | 10.3 | 2150 | 0.5485 | 0.8644 | 0.8740 | 0.8644 | 0.8640 | 0.9049 |
- | 0.6334 | 10.54 | 2200 | 0.6393 | 0.8464 | 0.8639 | 0.8464 | 0.8470 | 0.8933 |
- | 0.6334 | 10.78 | 2250 | 0.5588 | 0.8592 | 0.8682 | 0.8592 | 0.8589 | 0.9025 |
- | 0.601 | 11.02 | 2300 | 0.5786 | 0.8659 | 0.8751 | 0.8659 | 0.8656 | 0.9069 |
- | 0.601 | 11.26 | 2350 | 0.6591 | 0.8487 | 0.8636 | 0.8487 | 0.8479 | 0.8938 |
- | 0.601 | 11.5 | 2400 | 0.5619 | 0.8697 | 0.8774 | 0.8697 | 0.8689 | 0.9087 |
- | 0.601 | 11.74 | 2450 | 0.6138 | 0.8622 | 0.8713 | 0.8622 | 0.8617 | 0.9034 |
- | 0.601 | 11.98 | 2500 | 0.6335 | 0.8592 | 0.8694 | 0.8592 | 0.8593 | 0.9027 |
- | 0.5485 | 12.22 | 2550 | 0.5852 | 0.8637 | 0.8749 | 0.8637 | 0.8645 | 0.9046 |
- | 0.5485 | 12.46 | 2600 | 0.6139 | 0.8682 | 0.8771 | 0.8682 | 0.8682 | 0.9079 |
- | 0.5485 | 12.69 | 2650 | 0.6505 | 0.8584 | 0.8674 | 0.8584 | 0.8573 | 0.9028 |
- | 0.5485 | 12.93 | 2700 | 0.6130 | 0.8584 | 0.8672 | 0.8584 | 0.8579 | 0.9016 |
- | 0.5169 | 13.17 | 2750 | 0.6422 | 0.8637 | 0.8720 | 0.8637 | 0.8634 | 0.9055 |
- | 0.5169 | 13.41 | 2800 | 0.6315 | 0.8614 | 0.8701 | 0.8614 | 0.8622 | 0.9031 |
- | 0.5169 | 13.65 | 2850 | 0.6120 | 0.8637 | 0.8716 | 0.8637 | 0.8640 | 0.9051 |
- | 0.5169 | 13.89 | 2900 | 0.5627 | 0.8749 | 0.8830 | 0.8749 | 0.8756 | 0.9134 |
- | 0.4832 | 14.13 | 2950 | 0.6380 | 0.8652 | 0.8749 | 0.8652 | 0.8658 | 0.9048 |
- | 0.4832 | 14.37 | 3000 | 0.6271 | 0.8569 | 0.8656 | 0.8569 | 0.8564 | 0.8999 |
- | 0.4832 | 14.61 | 3050 | 0.6361 | 0.8697 | 0.8806 | 0.8697 | 0.8704 | 0.9087 |
- | 0.4832 | 14.85 | 3100 | 0.6125 | 0.8757 | 0.8833 | 0.8757 | 0.8757 | 0.9136 |
- | 0.4637 | 15.09 | 3150 | 0.6501 | 0.8622 | 0.8724 | 0.8622 | 0.8623 | 0.9045 |
- | 0.4637 | 15.33 | 3200 | 0.6490 | 0.8734 | 0.8836 | 0.8734 | 0.8745 | 0.9115 |
- | 0.4637 | 15.57 | 3250 | 0.6992 | 0.8659 | 0.8748 | 0.8659 | 0.8658 | 0.9066 |
- | 0.4637 | 15.81 | 3300 | 0.6647 | 0.8539 | 0.8648 | 0.8539 | 0.8548 | 0.8984 |
- | 0.4307 | 16.05 | 3350 | 0.6753 | 0.8607 | 0.8696 | 0.8607 | 0.8612 | 0.9031 |
- | 0.4307 | 16.29 | 3400 | 0.6655 | 0.8667 | 0.8740 | 0.8667 | 0.8666 | 0.9073 |
- | 0.4307 | 16.53 | 3450 | 0.6714 | 0.8682 | 0.8776 | 0.8682 | 0.8684 | 0.9082 |
- | 0.4307 | 16.77 | 3500 | 0.6393 | 0.8734 | 0.8798 | 0.8734 | 0.8726 | 0.9118 |
- | 0.4351 | 17.01 | 3550 | 0.6874 | 0.8689 | 0.8770 | 0.8689 | 0.8688 | 0.9089 |
- | 0.4351 | 17.25 | 3600 | 0.6957 | 0.8629 | 0.8727 | 0.8629 | 0.8639 | 0.9039 |
- | 0.4351 | 17.49 | 3650 | 0.6886 | 0.8667 | 0.8770 | 0.8667 | 0.8674 | 0.9076 |
- | 0.4351 | 17.72 | 3700 | 0.6947 | 0.8644 | 0.8741 | 0.8644 | 0.8647 | 0.9058 |
- | 0.4351 | 17.96 | 3750 | 0.7095 | 0.8712 | 0.8800 | 0.8712 | 0.8715 | 0.9092 |
- | 0.4133 | 18.2 | 3800 | 0.7041 | 0.8614 | 0.8710 | 0.8614 | 0.8618 | 0.9034 |
- | 0.4133 | 18.44 | 3850 | 0.7329 | 0.8682 | 0.8772 | 0.8682 | 0.8690 | 0.9074 |
+ | No log | 0.24 | 50 | 4.4176 | 0.0172 | 0.0140 | 0.0172 | 0.0083 | 0.1549 |
+ | No log | 0.48 | 100 | 4.3072 | 0.0434 | 0.0497 | 0.0434 | 0.0197 | 0.2778 |
+ | No log | 0.72 | 150 | 3.9604 | 0.0906 | 0.0606 | 0.0906 | 0.0446 | 0.3581 |
+ | No log | 0.96 | 200 | 3.6355 | 0.1191 | 0.0507 | 0.1191 | 0.0558 | 0.3783 |
+ | 4.235 | 1.2 | 250 | 3.3489 | 0.1700 | 0.0800 | 0.1700 | 0.0858 | 0.4157 |
+ | 4.235 | 1.44 | 300 | 3.1008 | 0.2015 | 0.1270 | 0.2015 | 0.1158 | 0.4382 |
+ | 4.235 | 1.68 | 350 | 2.8220 | 0.2906 | 0.2075 | 0.2906 | 0.2012 | 0.5014 |
+ | 4.235 | 1.92 | 400 | 2.5557 | 0.3843 | 0.3160 | 0.3843 | 0.3099 | 0.5671 |
+ | 3.2055 | 2.16 | 450 | 2.1790 | 0.4801 | 0.4047 | 0.4801 | 0.4036 | 0.6344 |
+ | 3.2055 | 2.4 | 500 | 1.9034 | 0.5790 | 0.5557 | 0.5790 | 0.5261 | 0.7028 |
+ | 3.2055 | 2.63 | 550 | 1.6707 | 0.6135 | 0.6116 | 0.6135 | 0.5701 | 0.7273 |
+ | 3.2055 | 2.87 | 600 | 1.4658 | 0.6285 | 0.6047 | 0.6285 | 0.5817 | 0.7381 |
+ | 2.1878 | 3.11 | 650 | 1.3665 | 0.6457 | 0.6522 | 0.6457 | 0.6153 | 0.7534 |
+ | 2.1878 | 3.35 | 700 | 1.2309 | 0.6757 | 0.6806 | 0.6757 | 0.6446 | 0.7730 |
+ | 2.1878 | 3.59 | 750 | 1.1077 | 0.7169 | 0.7307 | 0.7169 | 0.6966 | 0.7999 |
+ | 2.1878 | 3.83 | 800 | 1.0393 | 0.7341 | 0.7548 | 0.7341 | 0.7226 | 0.8130 |
+ | 1.534 | 4.07 | 850 | 0.9478 | 0.7678 | 0.7794 | 0.7678 | 0.7572 | 0.8384 |
+ | 1.534 | 4.31 | 900 | 0.8755 | 0.7715 | 0.7789 | 0.7715 | 0.7627 | 0.8395 |
+ | 1.534 | 4.55 | 950 | 0.8563 | 0.7618 | 0.7737 | 0.7618 | 0.7491 | 0.8330 |
+ | 1.534 | 4.79 | 1000 | 0.7866 | 0.8007 | 0.8046 | 0.8007 | 0.7921 | 0.8616 |
+ | 1.2035 | 5.03 | 1050 | 0.7462 | 0.8007 | 0.8212 | 0.8007 | 0.7945 | 0.8591 |
+ | 1.2035 | 5.27 | 1100 | 0.7003 | 0.8157 | 0.8272 | 0.8157 | 0.8126 | 0.8717 |
+ | 1.2035 | 5.51 | 1150 | 0.7421 | 0.8105 | 0.8262 | 0.8105 | 0.8074 | 0.8672 |
+ | 1.2035 | 5.75 | 1200 | 0.7638 | 0.7993 | 0.8294 | 0.7993 | 0.7979 | 0.8595 |
+ | 1.2035 | 5.99 | 1250 | 0.6872 | 0.8187 | 0.8330 | 0.8187 | 0.8171 | 0.8742 |
+ | 0.9909 | 6.23 | 1300 | 0.6378 | 0.8345 | 0.8462 | 0.8345 | 0.8338 | 0.8840 |
+ | 0.9909 | 6.47 | 1350 | 0.6835 | 0.8075 | 0.8266 | 0.8075 | 0.8063 | 0.8669 |
+ | 0.9909 | 6.71 | 1400 | 0.6367 | 0.8345 | 0.8480 | 0.8345 | 0.8337 | 0.8874 |
+ | 0.9909 | 6.95 | 1450 | 0.5793 | 0.8434 | 0.8521 | 0.8434 | 0.8425 | 0.8931 |
+ | 0.8826 | 7.19 | 1500 | 0.6528 | 0.8307 | 0.8458 | 0.8307 | 0.8293 | 0.8824 |
+ | 0.8826 | 7.43 | 1550 | 0.6361 | 0.8225 | 0.8382 | 0.8225 | 0.8218 | 0.8761 |
+ | 0.8826 | 7.66 | 1600 | 0.6189 | 0.8360 | 0.8478 | 0.8360 | 0.8334 | 0.8855 |
+ | 0.8826 | 7.9 | 1650 | 0.6078 | 0.8337 | 0.8433 | 0.8337 | 0.8321 | 0.8831 |
+ | 0.7752 | 8.14 | 1700 | 0.6868 | 0.8315 | 0.8436 | 0.8315 | 0.8289 | 0.8835 |
+ | 0.7752 | 8.38 | 1750 | 0.6118 | 0.8419 | 0.8549 | 0.8419 | 0.8411 | 0.8897 |
+ | 0.7752 | 8.62 | 1800 | 0.5837 | 0.8532 | 0.8660 | 0.8532 | 0.8531 | 0.8974 |
+ | 0.7752 | 8.86 | 1850 | 0.5758 | 0.8487 | 0.8613 | 0.8487 | 0.8494 | 0.8956 |
+ | 0.7067 | 9.1 | 1900 | 0.6950 | 0.8307 | 0.8490 | 0.8307 | 0.8279 | 0.8827 |
+ | 0.7067 | 9.34 | 1950 | 0.5968 | 0.8479 | 0.8595 | 0.8479 | 0.8470 | 0.8942 |
+ | 0.7067 | 9.58 | 2000 | 0.5714 | 0.8614 | 0.8696 | 0.8614 | 0.8613 | 0.9035 |
+ | 0.7067 | 9.82 | 2050 | 0.6389 | 0.8427 | 0.8538 | 0.8427 | 0.8415 | 0.8903 |
+ | 0.6457 | 10.06 | 2100 | 0.6504 | 0.8502 | 0.8639 | 0.8502 | 0.8504 | 0.8948 |
+ | 0.6457 | 10.3 | 2150 | 0.5776 | 0.8547 | 0.8659 | 0.8547 | 0.8534 | 0.8988 |
+ | 0.6457 | 10.54 | 2200 | 0.6775 | 0.8434 | 0.8570 | 0.8434 | 0.8438 | 0.8912 |
+ | 0.6457 | 10.78 | 2250 | 0.5849 | 0.8569 | 0.8686 | 0.8569 | 0.8579 | 0.9013 |
+ | 0.6098 | 11.02 | 2300 | 0.5767 | 0.8622 | 0.8706 | 0.8622 | 0.8632 | 0.9037 |
+ | 0.6098 | 11.26 | 2350 | 0.6875 | 0.8404 | 0.8588 | 0.8404 | 0.8404 | 0.8898 |
+ | 0.6098 | 11.5 | 2400 | 0.7397 | 0.8352 | 0.8483 | 0.8352 | 0.8340 | 0.8865 |
+ | 0.6098 | 11.74 | 2450 | 0.5998 | 0.8629 | 0.8716 | 0.8629 | 0.8618 | 0.9053 |
+ | 0.6098 | 11.98 | 2500 | 0.6435 | 0.8449 | 0.8549 | 0.8449 | 0.8441 | 0.8918 |
+ | 0.5538 | 12.22 | 2550 | 0.6969 | 0.8502 | 0.8640 | 0.8502 | 0.8508 | 0.8965 |
+ | 0.5538 | 12.46 | 2600 | 0.6323 | 0.8577 | 0.8710 | 0.8577 | 0.8566 | 0.9006 |
+ | 0.5538 | 12.69 | 2650 | 0.6989 | 0.8532 | 0.8660 | 0.8532 | 0.8525 | 0.8981 |
+ | 0.5538 | 12.93 | 2700 | 0.6736 | 0.8554 | 0.8666 | 0.8554 | 0.8552 | 0.8994 |
+ | 0.5356 | 13.17 | 2750 | 0.6737 | 0.8487 | 0.8584 | 0.8487 | 0.8469 | 0.8960 |
+ | 0.5356 | 13.41 | 2800 | 0.6893 | 0.8457 | 0.8565 | 0.8457 | 0.8452 | 0.8921 |
 
 
 ### Framework versions
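
The hyperparameters in the hunk above map directly onto `TrainingArguments` from transformers. The sketch below is illustrative, not the author's actual training script: only the values visible in this diff are grounded, while `output_dir`, `learning_rate`, and the batch size fall outside the shown context and are placeholders.

```python
# Illustrative sketch only -- not the author's training script.
# Values marked "from the card" come from the hunk above; the rest are placeholders.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hubert-finetuned",   # placeholder: real output dir not shown
    learning_rate=3e-5,              # placeholder: lr is outside this diff hunk
    per_device_train_batch_size=8,   # placeholder: batch size is outside this hunk
    adam_beta1=0.9,                  # from the card: Adam with betas=(0.9,0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,               # from the card: epsilon=1e-08
    lr_scheduler_type="linear",      # from the card
    warmup_steps=500,                # from the card
    num_train_epochs=100,            # from the card (raised from 30 in this commit)
    fp16=True,                       # from the card: "Native AMP" mixed precision
    eval_strategy="steps",           # named evaluation_strategy in older transformers
    eval_steps=50,                   # matches the 50-step cadence in the results table
)
```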
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:a24bbd12beb6b91b4a787ebcb23380ee51c29d114a9afafdfe379a9403dc584c
+ oid sha256:9a6c89af303bb53d66770008cf5ebb3cbab6b7dc0d5a39fba38ddf13f8eb37d1
 size 378386248
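
This entry is a Git LFS pointer: only the hash and size change in the diff, while the 378 MB weight file itself lives in LFS storage. `from_pretrained` resolves the pointer automatically, but the tensors can also be inspected directly with the `safetensors` library. A minimal sketch, assuming the file has been pulled locally:

```python
# Minimal sketch: list tensor names and shapes in the checkpoint
# without instantiating the model (assumes a local model.safetensors).
from safetensors import safe_open

with safe_open("model.safetensors", framework="pt") as f:
    for name in f.keys():
        print(name, tuple(f.get_tensor(name).shape))
```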
runs/Jul27_05-39-36_LAPTOP-1GID9RGH/events.out.tfevents.1722033577.LAPTOP-1GID9RGH.2524.6 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:9a4fb39bb8d42246d468aec186829a21488d330792f9356eb73c08539d537692
- size 37483
+ oid sha256:9ec19c839bb1496b0dad97cafa30a596b1a4ad96834f27fa9b986816dabf63f7
+ size 41602
runs/Jul27_05-39-36_LAPTOP-1GID9RGH/events.out.tfevents.1722035102.LAPTOP-1GID9RGH.2524.7 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f8c618d87822c02cf671d5e182295aea97241d1a7854a8d4a681fb16c9a0eb00
+ size 610
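
The two `runs/...` entries are TensorBoard event files, also tracked via LFS: the first grew as more training steps were logged, and the small 610-byte file added here holds the summary written at the end of training. A minimal sketch for reading the logged scalars back, assuming the repo is cloned with LFS objects pulled and `tensorboard` is installed; the exact scalar tag names depend on the Trainer version and are not shown in this commit:

```python
# Minimal sketch: dump scalar curves from the event files under runs/.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

acc = EventAccumulator("runs/Jul27_05-39-36_LAPTOP-1GID9RGH")
acc.Reload()  # parses every events.out.tfevents.* file in the directory

for tag in acc.Tags()["scalars"]:          # e.g. "eval/loss" (names may vary)
    points = acc.Scalars(tag)              # list of (wall_time, step, value) events
    print(tag, [(p.step, round(p.value, 4)) for p in points[:3]], "...")
```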