BlackBeenie committed on
Commit e011bde
1 Parent(s): 2409cc7

Add new SentenceTransformer model

1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
1
+ {
2
+ "word_embedding_dimension": 1024,
3
+ "pooling_mode_cls_token": false,
4
+ "pooling_mode_mean_tokens": true,
5
+ "pooling_mode_max_tokens": false,
6
+ "pooling_mode_mean_sqrt_len_tokens": false,
7
+ "pooling_mode_weightedmean_tokens": false,
8
+ "pooling_mode_lasttoken": false,
9
+ "include_prompt": true
10
+ }
README.md ADDED
@@ -0,0 +1,494 @@
1
+ ---
2
+ tags:
3
+ - sentence-transformers
4
+ - sentence-similarity
5
+ - feature-extraction
6
+ - generated_from_trainer
7
+ - dataset_size:498970
8
+ - loss:BPRLoss
9
+ base_model: answerdotai/ModernBERT-large
10
+ widget:
11
+ - source_sentence: lang last name
12
+ sentences:
13
+ - Lang is a moderately common surname in the United States. When the United States
14
+ Census was taken in 2010, there were about 61,529 individuals with the last name
15
+ Lang, ranking it number 545 for all surnames. Historically, the name has been
16
+ most prevalent in the Midwest, especially in North Dakota. Lang is least common
17
+ in the southeastern states.
18
+ - Flood Warning ...The National Weather Service in Houston/Galveston has issued
19
+ a flood warning for the following rivers... Long King Creek At Livingston affecting
20
+ the following counties in Texas... Polk...San Jacinto For the Long King Creek,
21
+ at Livingston, Minor flooding is occuring and is expected to continue.
22
+ - "Langston Name Meaning. English (mainly West Midlands): habitational name from\
23
+ \ any of various places, for example Langstone in Devon and Hampshire, named with\
24
+ \ Old English lang ‘long’, ‘tall’ + stan\
+ \ ‘stone’, i.e. a menhir."
26
+ - source_sentence: average salary of a program manager in healthcare
27
+ sentences:
28
+ - 'What is the average annual salary for Compliance Manager-Healthcare? The annual
29
+ salary for someone with the job title Compliance Manager-Healthcare may vary depending
30
+ on a number of factors including industry, company size, location, years of experience
31
+ and level of education.or example the median expected annual pay for a typical
32
+ Compliance Manager-Healthcare in the United States is $92,278 so 50% of the people
33
+ who perform the job of Compliance Manager-Healthcare in the United States are
34
+ expected to make less than $92,278. Source: HR Reported data as of October 2015.'
35
+ - Average Program Manager Healthcare Salaries. The average salary for program manager
36
+ healthcare jobs is $62,000. Average program manager healthcare salaries can vary
37
+ greatly due to company, location, industry, experience and benefits. This salary
38
+ was calculated using the average salary for all jobs with the term program manager
39
+ healthcare anywhere in the job listing.
40
+ - 'To apply for your IDNYC card, please follow these simple steps: Confirm you have
41
+ the correct documents to apply. The IDNYC program uses a point system to determine
42
+ if applicants are able to prove identity and residency in New York City. You will
43
+ need three points worth of documents to prove your identity and a one point document
44
+ to prove your residency.'
45
+ - source_sentence: when did brad paisley she's everything to me come out
46
+ sentences:
47
+ - 'Jump to: Overview (3) | Mini Bio (1) | Spouse (1) | Trivia (16) | Personal Quotes
48
+ (59) Brad Paisley was born on October 28, 1972 in Glen Dale, West Virginia, USA
49
+ as Brad Douglas Paisley. He has been married to Kimberly Williams-Paisley since
50
+ March 15, 2003. They have two children.'
51
+ - A parasitic disease is an infectious disease caused or transmitted by a parasite.
52
+ Many parasites do not cause diseases. Parasitic diseases can affect practically
53
+ all living organisms, including plants and mammals. The study of parasitic diseases
54
+ is called parasitology.erminology [edit]. Although organisms such as bacteria
55
+ function as parasites, the usage of the term parasitic disease is usually more
56
+ restricted. The three main types of organisms causing these conditions are protozoa
57
+ (causing protozoan infection), helminths (helminthiasis), and ectoparasites.
58
+ - She's Everything. She's Everything is a song co-written and recorded by American
59
+ country music artist Brad Paisley. It reached the top of the Billboard Hot Country
60
+ Songs Chart. It was released in August 2006 as the fourth and final single from
61
+ Paisley's album Time Well Wasted. It was Paisley's seventh number one single.
62
+ - source_sentence: who did lynda carter voice in elder scrolls
63
+ sentences:
64
+ - 'By Wade Steel. Bethesda Softworks announced today that actress Lynda Carter will
65
+ join the voice cast for to its upcoming epic RPG The Elder Scrolls IV: Oblivion.
66
+ The actress, best known for her television role as Wonder Woman, had previously
67
+ provided her vocal talents for Elder Scrolls III: Morrowind and its Bloodmoon
68
+ expansion.'
69
+ - "revise verb (STUDY). B1 [I or T] UK (US review) to study again something\
+ \ you have already learned, in preparation for an exam: We're revising (algebra)\
+ \ for the test tomorrow. (Definition of revise from the Cambridge Advanced\
+ \ Learner’s Dictionary & Thesaurus © Cambridge University Press)."
74
+ - Lynda Carter (born Linda Jean Córdova Carter; July 24, 1951) is an American actress,
75
+ singer, songwriter and beauty pageant titleholder who was crowned Miss World America
76
+ 1972 and also the star of the TV series Wonder Woman from 1975 to 1979.
77
+ - source_sentence: what county is phillips wi
78
+ sentences:
79
+ - 'Motto: It''s not what you show, it''s what you grow.. Location within Phillips
80
+ County and Colorado. Holyoke is the Home Rule Municipality that is the county
81
+ seat and the most populous municipality of Phillips County, Colorado, United States.
82
+ The city population was 2,313 at the 2010 census.'
83
+ - "Phillips is a city in Price County, Wisconsin, United States. The population\
+ \ was 1,675 at the 2000 census. It is the county seat of Price County. Phillips\
+ \ is located at 45°41′30″N 90°24′7″W / 45.69167°N 90.40194°W\
+ \ / 45.69167; -90.40194 (45.691560, -90.401915). It is on highway SR 13, 77 miles\
+ \ north of Marshfield, and 74 miles south of Ashland."
88
+ - Various spellings from the numerous languages for Miller include Mueller, Mahler,
89
+ Millar, Molenaar, Mills, Moeller, and Mullar. In Italian the surname is spelled
90
+ Molinaro and in Spanish it is Molinero. The surname of Miller is most common in
91
+ England, Scotland, United States, Germany, Spain and Italy. In the United States
92
+ the name is seventh most common surname in the country.
93
+ pipeline_tag: sentence-similarity
94
+ library_name: sentence-transformers
95
+ ---
96
+
97
+ # SentenceTransformer based on answerdotai/ModernBERT-large
98
+
99
+ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [answerdotai/ModernBERT-large](https://huggingface.co/answerdotai/ModernBERT-large). It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
100
+
101
+ ## Model Details
102
+
103
+ ### Model Description
104
+ - **Model Type:** Sentence Transformer
105
+ - **Base model:** [answerdotai/ModernBERT-large](https://huggingface.co/answerdotai/ModernBERT-large) <!-- at revision f87846cf8be76fceb18718f0245d18c8e6571215 -->
106
+ - **Maximum Sequence Length:** 8192 tokens
107
+ - **Output Dimensionality:** 1024 dimensions
108
+ - **Similarity Function:** Cosine Similarity
109
+ <!-- - **Training Dataset:** Unknown -->
110
+ <!-- - **Language:** Unknown -->
111
+ <!-- - **License:** Unknown -->
112
+
113
+ ### Model Sources
114
+
115
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
116
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
117
+ - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
118
+
119
+ ### Full Model Architecture
120
+
121
+ ```
122
+ SentenceTransformer(
123
+ (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: ModernBertModel
124
+ (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
125
+ )
126
+ ```
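+
+ The `Pooling` module above uses mean pooling (`pooling_mode_mean_tokens: True`): the sentence embedding is the average of the token embeddings, with padding positions excluded via the attention mask. The snippet below is a minimal, self-contained sketch of that computation on dummy tensors; it is illustrative only (the tensor names and values are made up) and is not the library's internal implementation.
+
+ ```python
+ import torch
+
+ # Dummy batch: 2 sequences, 5 tokens each, 1024-dim token embeddings (illustrative values).
+ token_embeddings = torch.randn(2, 5, 1024)
+ attention_mask = torch.tensor([[1, 1, 1, 0, 0],
+                                [1, 1, 1, 1, 1]])  # 0 marks padding positions
+
+ # Masked mean pooling: sum the real-token embeddings, divide by the number of real tokens.
+ mask = attention_mask.unsqueeze(-1).float()    # (2, 5, 1)
+ summed = (token_embeddings * mask).sum(dim=1)  # (2, 1024)
+ counts = mask.sum(dim=1).clamp(min=1e-9)       # (2, 1)
+ sentence_embeddings = summed / counts          # (2, 1024) sentence vectors
+
+ print(sentence_embeddings.shape)  # torch.Size([2, 1024])
+ ```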
127
+
128
+ ## Usage
129
+
130
+ ### Direct Usage (Sentence Transformers)
131
+
132
+ First install the Sentence Transformers library:
133
+
134
+ ```bash
135
+ pip install -U sentence-transformers
136
+ ```
137
+
138
+ Then you can load this model and run inference.
139
+ ```python
140
+ from sentence_transformers import SentenceTransformer
141
+
142
+ # Download from the 🤗 Hub
143
+ model = SentenceTransformer("BlackBeenie/ModernBERT-large-msmarco-v3-bpr")
144
+ # Run inference
145
+ sentences = [
+     'what county is phillips wi',
+     'Phillips is a city in Price County, Wisconsin, United States. The population was 1,675 at the 2000 census. It is the county seat of Price County. Phillips is located at 45°41′30″N 90°24′7″W / 45.69167°N 90.40194°W / 45.69167; -90.40194 (45.691560, -90.401915). It is on highway SR 13, 77 miles north of Marshfield, and 74 miles south of Ashland.',
+     "Motto: It's not what you show, it's what you grow.. Location within Phillips County and Colorado. Holyoke is the Home Rule Municipality that is the county seat and the most populous municipality of Phillips County, Colorado, United States. The city population was 2,313 at the 2010 census.",
+ ]
150
+ embeddings = model.encode(sentences)
151
+ print(embeddings.shape)
152
+ # [3, 1024]
153
+
154
+ # Get the similarity scores for the embeddings
155
+ similarities = model.similarity(embeddings, embeddings)
156
+ print(similarities.shape)
157
+ # [3, 3]
158
+ ```
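+
+ As a follow-up, the model can also be used for a small semantic-search style ranking: encode a query and a few candidate passages, then sort the passages by similarity to the query. This is a minimal sketch built only on the `encode` and `similarity` methods shown above; the query and passages are illustrative placeholders.
+
+ ```python
+ from sentence_transformers import SentenceTransformer
+
+ model = SentenceTransformer("BlackBeenie/ModernBERT-large-msmarco-v3-bpr")
+
+ query = "what county is phillips wi"
+ passages = [
+     "Phillips is a city in Price County, Wisconsin, United States.",
+     "Holyoke is the county seat of Phillips County, Colorado, United States.",
+ ]
+
+ # Encode the query and the candidate passages separately.
+ query_emb = model.encode([query])
+ passage_embs = model.encode(passages)
+
+ # Rank passages by cosine similarity to the query (highest first).
+ scores = model.similarity(query_emb, passage_embs)[0]
+ for idx in scores.argsort(descending=True).tolist():
+     print(f"{scores[idx].item():.4f}  {passages[idx]}")
+ ```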
159
+
160
+ <!--
161
+ ### Direct Usage (Transformers)
162
+
163
+ <details><summary>Click to see the direct usage in Transformers</summary>
164
+
165
+ </details>
166
+ -->
167
+
168
+ <!--
169
+ ### Downstream Usage (Sentence Transformers)
170
+
171
+ You can finetune this model on your own dataset.
172
+
173
+ <details><summary>Click to expand</summary>
174
+
175
+ </details>
176
+ -->
177
+
178
+ <!--
179
+ ### Out-of-Scope Use
180
+
181
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
182
+ -->
183
+
184
+ <!--
185
+ ## Bias, Risks and Limitations
186
+
187
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
188
+ -->
189
+
190
+ <!--
191
+ ### Recommendations
192
+
193
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
194
+ -->
195
+
196
+ ## Training Details
197
+
198
+ ### Training Dataset
199
+
200
+ #### Unnamed Dataset
201
+
202
+
203
+ * Size: 498,970 training samples
204
+ * Columns: <code>sentence_0</code>, <code>sentence_1</code>, and <code>sentence_2</code>
205
+ * Approximate statistics based on the first 1000 samples:
206
+ | | sentence_0 | sentence_1 | sentence_2 |
207
+ |:--------|:---------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
208
+ | type | string | string | string |
209
+ | details | <ul><li>min: 4 tokens</li><li>mean: 9.24 tokens</li><li>max: 27 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 83.71 tokens</li><li>max: 279 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 80.18 tokens</li><li>max: 262 tokens</li></ul> |
210
+ * Samples:
211
+ | sentence_0 | sentence_1 | sentence_2 |
212
+ |:----------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
213
+ | <code>what is tongkat ali</code> | <code>Tongkat Ali is a very powerful herb that acts as a sex enhancer by naturally increasing the testosterone levels, and revitalizing sexual impotence, performance and pleasure. Tongkat Ali is also effective in building muscular volume & strength resulting to a healthy physique.</code> | <code>However, unlike tongkat ali extract, tongkat ali chipped root and root powder are not sterile. Thus, the raw consumption of root powder is not recommended. The traditional preparation in Indonesia and Malaysia is to boil chipped roots as a tea.</code> |
214
+ | <code>cost to install engineered hardwood flooring</code> | <code>Burton says his customers typically spend about $8 per square foot for engineered hardwood flooring; add an additional $2 per square foot for installation. Minion says consumers should expect to pay $7 to $12 per square foot for quality hardwood flooring. “If the homeowner buys the wood and you need somebody to install it, usually an installation goes for about $2 a square foot,” Bill LeBeau, owner of LeBeau’s Hardwood Floors of Huntersville, North Carolina, says.</code> | <code>Engineered Wood Flooring Installation - Average Cost Per Square Foot. Expect to pay in the higher end of the price range for a licensed, insured and reputable pro - and for complex or rush projects. To lower Engineered Wood Flooring Installation costs: combine related projects, minimize options/extras and be flexible about project scheduling.</code> |
215
+ | <code>define pollute</code> | <code>pollutes; polluted; polluting. Learner's definition of POLLUTE. [+ object] : to make (land, water, air, etc.) dirty and not safe or suitable to use. Waste from the factory had polluted [=contaminated] the river. Miles of beaches were polluted by the oil spill. Car exhaust pollutes the air.</code> | <code>Chemical water pollution. Industrial and agricultural work involves the use of many different chemicals that can run-off into water and pollute it.1 Metals and solvents from industrial work can pollute rivers and lakes.2 These are poisonous to many forms of aquatic life and may slow their development, make them infertile or even result in death.ndustrial and agricultural work involves the use of many different chemicals that can run-off into water and pollute it. 1 Metals and solvents from industrial work can pollute rivers and lakes.</code> |
216
+ * Loss: <code>beir.losses.bpr_loss.BPRLoss</code>
217
+
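+ The training objective, `beir.losses.bpr_loss.BPRLoss`, consumes the (query, positive passage, negative passage) triplets shown above. The snippet below is only a generic, illustrative sketch of a triplet ranking objective on dot-product scores (push the query–positive score above the query–negative score); the actual `BPRLoss` in `beir` may include additional terms (e.g. for its binary passage retrieval setup) and is not reproduced here.
+
+ ```python
+ import torch
+ import torch.nn.functional as F
+
+ def triplet_ranking_loss_sketch(query_emb, pos_emb, neg_emb):
+     """Generic triplet ranking objective on dot-product scores.
+
+     Illustrative only; not the actual beir BPRLoss implementation.
+     All inputs are (batch, dim) tensors.
+     """
+     pos_scores = (query_emb * pos_emb).sum(dim=-1)  # s(q, p+), shape (batch,)
+     neg_scores = (query_emb * neg_emb).sum(dim=-1)  # s(q, p-), shape (batch,)
+     return -F.logsigmoid(pos_scores - neg_scores).mean()
+
+ # Toy example with random 1024-dim embeddings (matching this model's output size).
+ q, p, n = (torch.randn(4, 1024) for _ in range(3))
+ print(triplet_ranking_loss_sketch(q, p, n))
+ ```
+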
218
+ ### Training Hyperparameters
219
+ #### Non-Default Hyperparameters
220
+
221
+ - `eval_strategy`: steps
222
+ - `per_device_train_batch_size`: 64
223
+ - `per_device_eval_batch_size`: 64
224
+ - `num_train_epochs`: 6
225
+ - `multi_dataset_batch_sampler`: round_robin
226
+
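+ The non-default values above correspond to standard Sentence Transformers v3 training arguments. The snippet below is only a hedged sketch of how such a configuration could be expressed with `SentenceTransformerTrainingArguments`; the actual training script is not included in this repository, and `output_dir` is a placeholder.
+
+ ```python
+ from sentence_transformers.training_args import (
+     MultiDatasetBatchSamplers,
+     SentenceTransformerTrainingArguments,
+ )
+
+ # Hypothetical reconstruction of the non-default hyperparameters listed above.
+ args = SentenceTransformerTrainingArguments(
+     output_dir="output/modernbert-large-msmarco-bpr",  # placeholder path
+     eval_strategy="steps",
+     per_device_train_batch_size=64,
+     per_device_eval_batch_size=64,
+     num_train_epochs=6,
+     multi_dataset_batch_sampler=MultiDatasetBatchSamplers.ROUND_ROBIN,
+ )
+ print(args.num_train_epochs)
+ ```
+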
227
+ #### All Hyperparameters
228
+ <details><summary>Click to expand</summary>
229
+
230
+ - `overwrite_output_dir`: False
231
+ - `do_predict`: False
232
+ - `eval_strategy`: steps
233
+ - `prediction_loss_only`: True
234
+ - `per_device_train_batch_size`: 64
235
+ - `per_device_eval_batch_size`: 64
236
+ - `per_gpu_train_batch_size`: None
237
+ - `per_gpu_eval_batch_size`: None
238
+ - `gradient_accumulation_steps`: 1
239
+ - `eval_accumulation_steps`: None
240
+ - `torch_empty_cache_steps`: None
241
+ - `learning_rate`: 5e-05
242
+ - `weight_decay`: 0.0
243
+ - `adam_beta1`: 0.9
244
+ - `adam_beta2`: 0.999
245
+ - `adam_epsilon`: 1e-08
246
+ - `max_grad_norm`: 1
247
+ - `num_train_epochs`: 6
248
+ - `max_steps`: -1
249
+ - `lr_scheduler_type`: linear
250
+ - `lr_scheduler_kwargs`: {}
251
+ - `warmup_ratio`: 0.0
252
+ - `warmup_steps`: 0
253
+ - `log_level`: passive
254
+ - `log_level_replica`: warning
255
+ - `log_on_each_node`: True
256
+ - `logging_nan_inf_filter`: True
257
+ - `save_safetensors`: True
258
+ - `save_on_each_node`: False
259
+ - `save_only_model`: False
260
+ - `restore_callback_states_from_checkpoint`: False
261
+ - `no_cuda`: False
262
+ - `use_cpu`: False
263
+ - `use_mps_device`: False
264
+ - `seed`: 42
265
+ - `data_seed`: None
266
+ - `jit_mode_eval`: False
267
+ - `use_ipex`: False
268
+ - `bf16`: False
269
+ - `fp16`: False
270
+ - `fp16_opt_level`: O1
271
+ - `half_precision_backend`: auto
272
+ - `bf16_full_eval`: False
273
+ - `fp16_full_eval`: False
274
+ - `tf32`: None
275
+ - `local_rank`: 0
276
+ - `ddp_backend`: None
277
+ - `tpu_num_cores`: None
278
+ - `tpu_metrics_debug`: False
279
+ - `debug`: []
280
+ - `dataloader_drop_last`: False
281
+ - `dataloader_num_workers`: 0
282
+ - `dataloader_prefetch_factor`: None
283
+ - `past_index`: -1
284
+ - `disable_tqdm`: False
285
+ - `remove_unused_columns`: True
286
+ - `label_names`: None
287
+ - `load_best_model_at_end`: False
288
+ - `ignore_data_skip`: False
289
+ - `fsdp`: []
290
+ - `fsdp_min_num_params`: 0
291
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
292
+ - `fsdp_transformer_layer_cls_to_wrap`: None
293
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
294
+ - `deepspeed`: None
295
+ - `label_smoothing_factor`: 0.0
296
+ - `optim`: adamw_torch
297
+ - `optim_args`: None
298
+ - `adafactor`: False
299
+ - `group_by_length`: False
300
+ - `length_column_name`: length
301
+ - `ddp_find_unused_parameters`: None
302
+ - `ddp_bucket_cap_mb`: None
303
+ - `ddp_broadcast_buffers`: False
304
+ - `dataloader_pin_memory`: True
305
+ - `dataloader_persistent_workers`: False
306
+ - `skip_memory_metrics`: True
307
+ - `use_legacy_prediction_loop`: False
308
+ - `push_to_hub`: False
309
+ - `resume_from_checkpoint`: None
310
+ - `hub_model_id`: None
311
+ - `hub_strategy`: every_save
312
+ - `hub_private_repo`: None
313
+ - `hub_always_push`: False
314
+ - `gradient_checkpointing`: False
315
+ - `gradient_checkpointing_kwargs`: None
316
+ - `include_inputs_for_metrics`: False
317
+ - `include_for_metrics`: []
318
+ - `eval_do_concat_batches`: True
319
+ - `fp16_backend`: auto
320
+ - `push_to_hub_model_id`: None
321
+ - `push_to_hub_organization`: None
322
+ - `mp_parameters`:
323
+ - `auto_find_batch_size`: False
324
+ - `full_determinism`: False
325
+ - `torchdynamo`: None
326
+ - `ray_scope`: last
327
+ - `ddp_timeout`: 1800
328
+ - `torch_compile`: False
329
+ - `torch_compile_backend`: None
330
+ - `torch_compile_mode`: None
331
+ - `dispatch_batches`: None
332
+ - `split_batches`: None
333
+ - `include_tokens_per_second`: False
334
+ - `include_num_input_tokens_seen`: False
335
+ - `neftune_noise_alpha`: None
336
+ - `optim_target_modules`: None
337
+ - `batch_eval_metrics`: False
338
+ - `eval_on_start`: False
339
+ - `use_liger_kernel`: False
340
+ - `eval_use_gather_object`: False
341
+ - `average_tokens_across_devices`: False
342
+ - `prompts`: None
343
+ - `batch_sampler`: batch_sampler
344
+ - `multi_dataset_batch_sampler`: round_robin
345
+
346
+ </details>
347
+
348
+ ### Training Logs
349
+ | Epoch | Step | Training Loss |
350
+ |:------:|:-----:|:-------------:|
351
+ | 0.0641 | 500 | 1.4036 |
352
+ | 0.1283 | 1000 | 0.36 |
353
+ | 0.1924 | 1500 | 0.3305 |
354
+ | 0.2565 | 2000 | 0.2874 |
355
+ | 0.3206 | 2500 | 0.2732 |
356
+ | 0.3848 | 3000 | 0.2446 |
357
+ | 0.4489 | 3500 | 0.2399 |
358
+ | 0.5130 | 4000 | 0.2302 |
359
+ | 0.5771 | 4500 | 0.231 |
360
+ | 0.6413 | 5000 | 0.2217 |
361
+ | 0.7054 | 5500 | 0.2192 |
362
+ | 0.7695 | 6000 | 0.2087 |
363
+ | 0.8337 | 6500 | 0.2104 |
364
+ | 0.8978 | 7000 | 0.2069 |
365
+ | 0.9619 | 7500 | 0.2071 |
366
+ | 1.0 | 7797 | - |
367
+ | 1.0260 | 8000 | 0.1663 |
368
+ | 1.0902 | 8500 | 0.1213 |
369
+ | 1.1543 | 9000 | 0.1266 |
370
+ | 1.2184 | 9500 | 0.1217 |
371
+ | 1.2825 | 10000 | 0.1193 |
372
+ | 1.3467 | 10500 | 0.1198 |
373
+ | 1.4108 | 11000 | 0.1258 |
374
+ | 1.4749 | 11500 | 0.1266 |
375
+ | 1.5391 | 12000 | 0.1334 |
376
+ | 1.6032 | 12500 | 0.1337 |
377
+ | 1.6673 | 13000 | 0.1258 |
378
+ | 1.7314 | 13500 | 0.1268 |
379
+ | 1.7956 | 14000 | 0.1249 |
380
+ | 1.8597 | 14500 | 0.1256 |
381
+ | 1.9238 | 15000 | 0.1238 |
382
+ | 1.9879 | 15500 | 0.1274 |
383
+ | 2.0 | 15594 | - |
384
+ | 2.0521 | 16000 | 0.0776 |
385
+ | 2.1162 | 16500 | 0.0615 |
386
+ | 2.1803 | 17000 | 0.0647 |
387
+ | 2.2445 | 17500 | 0.0651 |
388
+ | 2.3086 | 18000 | 0.0695 |
389
+ | 2.3727 | 18500 | 0.0685 |
390
+ | 2.4368 | 19000 | 0.0685 |
391
+ | 2.5010 | 19500 | 0.0707 |
392
+ | 2.5651 | 20000 | 0.073 |
393
+ | 2.6292 | 20500 | 0.0696 |
394
+ | 2.6933 | 21000 | 0.0694 |
395
+ | 2.7575 | 21500 | 0.0701 |
396
+ | 2.8216 | 22000 | 0.0668 |
397
+ | 2.8857 | 22500 | 0.07 |
398
+ | 2.9499 | 23000 | 0.0649 |
399
+ | 3.0 | 23391 | - |
400
+ | 3.0140 | 23500 | 0.0589 |
401
+ | 3.0781 | 24000 | 0.0316 |
402
+ | 3.1422 | 24500 | 0.0377 |
403
+ | 3.2064 | 25000 | 0.039 |
404
+ | 3.2705 | 25500 | 0.0335 |
405
+ | 3.3346 | 26000 | 0.0387 |
406
+ | 3.3987 | 26500 | 0.0367 |
407
+ | 3.4629 | 27000 | 0.0383 |
408
+ | 3.5270 | 27500 | 0.0407 |
409
+ | 3.5911 | 28000 | 0.0372 |
410
+ | 3.6553 | 28500 | 0.0378 |
411
+ | 3.7194 | 29000 | 0.0359 |
412
+ | 3.7835 | 29500 | 0.0394 |
413
+ | 3.8476 | 30000 | 0.0388 |
414
+ | 3.9118 | 30500 | 0.0422 |
415
+ | 3.9759 | 31000 | 0.0391 |
416
+ | 4.0 | 31188 | - |
417
+ | 4.0400 | 31500 | 0.0251 |
418
+ | 4.1041 | 32000 | 0.0199 |
419
+ | 4.1683 | 32500 | 0.0261 |
420
+ | 4.2324 | 33000 | 0.021 |
421
+ | 4.2965 | 33500 | 0.0196 |
422
+ | 4.3607 | 34000 | 0.0181 |
423
+ | 4.4248 | 34500 | 0.0228 |
424
+ | 4.4889 | 35000 | 0.0195 |
425
+ | 4.5530 | 35500 | 0.02 |
426
+ | 4.6172 | 36000 | 0.0251 |
427
+ | 4.6813 | 36500 | 0.0213 |
428
+ | 4.7454 | 37000 | 0.0208 |
429
+ | 4.8095 | 37500 | 0.0192 |
430
+ | 4.8737 | 38000 | 0.0204 |
431
+ | 4.9378 | 38500 | 0.0176 |
432
+ | 5.0 | 38985 | - |
433
+ | 5.0019 | 39000 | 0.0184 |
434
+ | 5.0661 | 39500 | 0.0136 |
435
+ | 5.1302 | 40000 | 0.0102 |
436
+ | 5.1943 | 40500 | 0.0122 |
437
+ | 5.2584 | 41000 | 0.0124 |
438
+ | 5.3226 | 41500 | 0.013 |
439
+ | 5.3867 | 42000 | 0.0105 |
440
+ | 5.4508 | 42500 | 0.0135 |
441
+ | 5.5149 | 43000 | 0.0158 |
442
+ | 5.5791 | 43500 | 0.015 |
443
+ | 5.6432 | 44000 | 0.0128 |
444
+ | 5.7073 | 44500 | 0.0105 |
445
+ | 5.7715 | 45000 | 0.014 |
446
+ | 5.8356 | 45500 | 0.0125 |
447
+ | 5.8997 | 46000 | 0.0139 |
448
+ | 5.9638 | 46500 | 0.0137 |
449
+ | 6.0 | 46782 | - |
450
+
451
+
452
+ ### Framework Versions
453
+ - Python: 3.10.12
454
+ - Sentence Transformers: 3.3.1
455
+ - Transformers: 4.48.0.dev0
456
+ - PyTorch: 2.5.1+cu121
457
+ - Accelerate: 1.2.1
458
+ - Datasets: 3.2.0
459
+ - Tokenizers: 0.21.0
460
+
461
+ ## Citation
462
+
463
+ ### BibTeX
464
+
465
+ #### Sentence Transformers
466
+ ```bibtex
467
+ @inproceedings{reimers-2019-sentence-bert,
468
+ title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
469
+ author = "Reimers, Nils and Gurevych, Iryna",
470
+ booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
471
+ month = "11",
472
+ year = "2019",
473
+ publisher = "Association for Computational Linguistics",
474
+ url = "https://arxiv.org/abs/1908.10084",
475
+ }
476
+ ```
477
+
478
+ <!--
479
+ ## Glossary
480
+
481
+ *Clearly define terms in order to be accessible across audiences.*
482
+ -->
483
+
484
+ <!--
485
+ ## Model Card Authors
486
+
487
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
488
+ -->
489
+
490
+ <!--
491
+ ## Model Card Contact
492
+
493
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
494
+ -->
config.json ADDED
@@ -0,0 +1,46 @@
1
+ {
2
+ "_name_or_path": "answerdotai/ModernBERT-large",
3
+ "architectures": [
4
+ "ModernBertModel"
5
+ ],
6
+ "attention_bias": false,
7
+ "attention_dropout": 0.0,
8
+ "bos_token_id": 50281,
9
+ "classifier_activation": "gelu",
10
+ "classifier_bias": false,
11
+ "classifier_dropout": 0.0,
12
+ "classifier_pooling": "mean",
13
+ "cls_token_id": 50281,
14
+ "decoder_bias": true,
15
+ "deterministic_flash_attn": false,
16
+ "embedding_dropout": 0.0,
17
+ "eos_token_id": 50282,
18
+ "global_attn_every_n_layers": 3,
19
+ "global_rope_theta": 160000.0,
20
+ "gradient_checkpointing": false,
21
+ "hidden_activation": "gelu",
22
+ "hidden_size": 1024,
23
+ "initializer_cutoff_factor": 2.0,
24
+ "initializer_range": 0.02,
25
+ "intermediate_size": 2624,
26
+ "layer_norm_eps": 1e-05,
27
+ "local_attention": 128,
28
+ "local_rope_theta": 10000.0,
29
+ "max_position_embeddings": 8192,
30
+ "mlp_bias": false,
31
+ "mlp_dropout": 0.0,
32
+ "model_type": "modernbert",
33
+ "norm_bias": false,
34
+ "norm_eps": 1e-05,
35
+ "num_attention_heads": 16,
36
+ "num_hidden_layers": 28,
37
+ "pad_token_id": 50283,
38
+ "position_embedding_type": "absolute",
39
+ "reference_compile": true,
40
+ "sep_token_id": 50282,
41
+ "sparse_pred_ignore_index": -100,
42
+ "sparse_prediction": false,
43
+ "torch_dtype": "float32",
44
+ "transformers_version": "4.48.0.dev0",
45
+ "vocab_size": 50368
46
+ }
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
1
+ {
2
+ "__version__": {
3
+ "sentence_transformers": "3.3.1",
4
+ "transformers": "4.48.0.dev0",
5
+ "pytorch": "2.5.1+cu121"
6
+ },
7
+ "prompts": {},
8
+ "default_prompt_name": null,
9
+ "similarity_fn_name": "cosine"
10
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:fb13a371ef9a2fbb0eebeb04cdbc578174e38cfc832afb7bd9dc56a04f3ced46
3
+ size 1579143688
modules.json ADDED
@@ -0,0 +1,14 @@
1
+ [
2
+ {
3
+ "idx": 0,
4
+ "name": "0",
5
+ "path": "",
6
+ "type": "sentence_transformers.models.Transformer"
7
+ },
8
+ {
9
+ "idx": 1,
10
+ "name": "1",
11
+ "path": "1_Pooling",
12
+ "type": "sentence_transformers.models.Pooling"
13
+ }
14
+ ]
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
1
+ {
2
+ "max_seq_length": 8192,
3
+ "do_lower_case": false
4
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,37 @@
1
+ {
2
+ "cls_token": {
3
+ "content": "[CLS]",
4
+ "lstrip": false,
5
+ "normalized": false,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "mask_token": {
10
+ "content": "[MASK]",
11
+ "lstrip": true,
12
+ "normalized": false,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ },
16
+ "pad_token": {
17
+ "content": "[PAD]",
18
+ "lstrip": false,
19
+ "normalized": false,
20
+ "rstrip": false,
21
+ "single_word": false
22
+ },
23
+ "sep_token": {
24
+ "content": "[SEP]",
25
+ "lstrip": false,
26
+ "normalized": false,
27
+ "rstrip": false,
28
+ "single_word": false
29
+ },
30
+ "unk_token": {
31
+ "content": "[UNK]",
32
+ "lstrip": false,
33
+ "normalized": false,
34
+ "rstrip": false,
35
+ "single_word": false
36
+ }
37
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,945 @@
1
+ {
2
+ "added_tokens_decoder": {
3
+ "0": {
4
+ "content": "|||IP_ADDRESS|||",
5
+ "lstrip": false,
6
+ "normalized": true,
7
+ "rstrip": false,
8
+ "single_word": false,
9
+ "special": false
10
+ },
11
+ "1": {
12
+ "content": "<|padding|>",
13
+ "lstrip": false,
14
+ "normalized": false,
15
+ "rstrip": false,
16
+ "single_word": false,
17
+ "special": true
18
+ },
19
+ "50254": {
20
+ "content": " ",
21
+ "lstrip": false,
22
+ "normalized": true,
23
+ "rstrip": false,
24
+ "single_word": false,
25
+ "special": false
26
+ },
27
+ "50255": {
28
+ "content": " ",
29
+ "lstrip": false,
30
+ "normalized": true,
31
+ "rstrip": false,
32
+ "single_word": false,
33
+ "special": false
34
+ },
35
+ "50256": {
36
+ "content": " ",
37
+ "lstrip": false,
38
+ "normalized": true,
39
+ "rstrip": false,
40
+ "single_word": false,
41
+ "special": false
42
+ },
43
+ "50257": {
44
+ "content": " ",
45
+ "lstrip": false,
46
+ "normalized": true,
47
+ "rstrip": false,
48
+ "single_word": false,
49
+ "special": false
50
+ },
51
+ "50258": {
52
+ "content": " ",
53
+ "lstrip": false,
54
+ "normalized": true,
55
+ "rstrip": false,
56
+ "single_word": false,
57
+ "special": false
58
+ },
59
+ "50259": {
60
+ "content": " ",
61
+ "lstrip": false,
62
+ "normalized": true,
63
+ "rstrip": false,
64
+ "single_word": false,
65
+ "special": false
66
+ },
67
+ "50260": {
68
+ "content": " ",
69
+ "lstrip": false,
70
+ "normalized": true,
71
+ "rstrip": false,
72
+ "single_word": false,
73
+ "special": false
74
+ },
75
+ "50261": {
76
+ "content": " ",
77
+ "lstrip": false,
78
+ "normalized": true,
79
+ "rstrip": false,
80
+ "single_word": false,
81
+ "special": false
82
+ },
83
+ "50262": {
84
+ "content": " ",
85
+ "lstrip": false,
86
+ "normalized": true,
87
+ "rstrip": false,
88
+ "single_word": false,
89
+ "special": false
90
+ },
91
+ "50263": {
92
+ "content": " ",
93
+ "lstrip": false,
94
+ "normalized": true,
95
+ "rstrip": false,
96
+ "single_word": false,
97
+ "special": false
98
+ },
99
+ "50264": {
100
+ "content": " ",
101
+ "lstrip": false,
102
+ "normalized": true,
103
+ "rstrip": false,
104
+ "single_word": false,
105
+ "special": false
106
+ },
107
+ "50265": {
108
+ "content": " ",
109
+ "lstrip": false,
110
+ "normalized": true,
111
+ "rstrip": false,
112
+ "single_word": false,
113
+ "special": false
114
+ },
115
+ "50266": {
116
+ "content": " ",
117
+ "lstrip": false,
118
+ "normalized": true,
119
+ "rstrip": false,
120
+ "single_word": false,
121
+ "special": false
122
+ },
123
+ "50267": {
124
+ "content": " ",
125
+ "lstrip": false,
126
+ "normalized": true,
127
+ "rstrip": false,
128
+ "single_word": false,
129
+ "special": false
130
+ },
131
+ "50268": {
132
+ "content": " ",
133
+ "lstrip": false,
134
+ "normalized": true,
135
+ "rstrip": false,
136
+ "single_word": false,
137
+ "special": false
138
+ },
139
+ "50269": {
140
+ "content": " ",
141
+ "lstrip": false,
142
+ "normalized": true,
143
+ "rstrip": false,
144
+ "single_word": false,
145
+ "special": false
146
+ },
147
+ "50270": {
148
+ "content": " ",
149
+ "lstrip": false,
150
+ "normalized": true,
151
+ "rstrip": false,
152
+ "single_word": false,
153
+ "special": false
154
+ },
155
+ "50271": {
156
+ "content": " ",
157
+ "lstrip": false,
158
+ "normalized": true,
159
+ "rstrip": false,
160
+ "single_word": false,
161
+ "special": false
162
+ },
163
+ "50272": {
164
+ "content": " ",
165
+ "lstrip": false,
166
+ "normalized": true,
167
+ "rstrip": false,
168
+ "single_word": false,
169
+ "special": false
170
+ },
171
+ "50273": {
172
+ "content": " ",
173
+ "lstrip": false,
174
+ "normalized": true,
175
+ "rstrip": false,
176
+ "single_word": false,
177
+ "special": false
178
+ },
179
+ "50274": {
180
+ "content": " ",
181
+ "lstrip": false,
182
+ "normalized": true,
183
+ "rstrip": false,
184
+ "single_word": false,
185
+ "special": false
186
+ },
187
+ "50275": {
188
+ "content": " ",
189
+ "lstrip": false,
190
+ "normalized": true,
191
+ "rstrip": false,
192
+ "single_word": false,
193
+ "special": false
194
+ },
195
+ "50276": {
196
+ "content": " ",
197
+ "lstrip": false,
198
+ "normalized": true,
199
+ "rstrip": false,
200
+ "single_word": false,
201
+ "special": false
202
+ },
203
+ "50277": {
204
+ "content": "|||EMAIL_ADDRESS|||",
205
+ "lstrip": false,
206
+ "normalized": true,
207
+ "rstrip": false,
208
+ "single_word": false,
209
+ "special": false
210
+ },
211
+ "50278": {
212
+ "content": "|||PHONE_NUMBER|||",
213
+ "lstrip": false,
214
+ "normalized": true,
215
+ "rstrip": false,
216
+ "single_word": false,
217
+ "special": false
218
+ },
219
+ "50279": {
220
+ "content": "<|endoftext|>",
221
+ "lstrip": false,
222
+ "normalized": false,
223
+ "rstrip": false,
224
+ "single_word": false,
225
+ "special": true
226
+ },
227
+ "50280": {
228
+ "content": "[UNK]",
229
+ "lstrip": false,
230
+ "normalized": false,
231
+ "rstrip": false,
232
+ "single_word": false,
233
+ "special": true
234
+ },
235
+ "50281": {
236
+ "content": "[CLS]",
237
+ "lstrip": false,
238
+ "normalized": false,
239
+ "rstrip": false,
240
+ "single_word": false,
241
+ "special": true
242
+ },
243
+ "50282": {
244
+ "content": "[SEP]",
245
+ "lstrip": false,
246
+ "normalized": false,
247
+ "rstrip": false,
248
+ "single_word": false,
249
+ "special": true
250
+ },
251
+ "50283": {
252
+ "content": "[PAD]",
253
+ "lstrip": false,
254
+ "normalized": false,
255
+ "rstrip": false,
256
+ "single_word": false,
257
+ "special": true
258
+ },
259
+ "50284": {
260
+ "content": "[MASK]",
261
+ "lstrip": true,
262
+ "normalized": false,
263
+ "rstrip": false,
264
+ "single_word": false,
265
+ "special": true
266
+ },
267
+ "50285": {
268
+ "content": "[unused0]",
269
+ "lstrip": false,
270
+ "normalized": true,
271
+ "rstrip": false,
272
+ "single_word": false,
273
+ "special": false
274
+ },
275
+ "50286": {
276
+ "content": "[unused1]",
277
+ "lstrip": false,
278
+ "normalized": true,
279
+ "rstrip": false,
280
+ "single_word": false,
281
+ "special": false
282
+ },
283
+ "50287": {
284
+ "content": "[unused2]",
285
+ "lstrip": false,
286
+ "normalized": true,
287
+ "rstrip": false,
288
+ "single_word": false,
289
+ "special": false
290
+ },
291
+ "50288": {
292
+ "content": "[unused3]",
293
+ "lstrip": false,
294
+ "normalized": true,
295
+ "rstrip": false,
296
+ "single_word": false,
297
+ "special": false
298
+ },
299
+ "50289": {
300
+ "content": "[unused4]",
301
+ "lstrip": false,
302
+ "normalized": true,
303
+ "rstrip": false,
304
+ "single_word": false,
305
+ "special": false
306
+ },
307
+ "50290": {
308
+ "content": "[unused5]",
309
+ "lstrip": false,
310
+ "normalized": true,
311
+ "rstrip": false,
312
+ "single_word": false,
313
+ "special": false
314
+ },
315
+ "50291": {
316
+ "content": "[unused6]",
317
+ "lstrip": false,
318
+ "normalized": true,
319
+ "rstrip": false,
320
+ "single_word": false,
321
+ "special": false
322
+ },
323
+ "50292": {
324
+ "content": "[unused7]",
325
+ "lstrip": false,
326
+ "normalized": true,
327
+ "rstrip": false,
328
+ "single_word": false,
329
+ "special": false
330
+ },
331
+ "50293": {
332
+ "content": "[unused8]",
333
+ "lstrip": false,
334
+ "normalized": true,
335
+ "rstrip": false,
336
+ "single_word": false,
337
+ "special": false
338
+ },
339
+ "50294": {
340
+ "content": "[unused9]",
341
+ "lstrip": false,
342
+ "normalized": true,
343
+ "rstrip": false,
344
+ "single_word": false,
345
+ "special": false
346
+ },
347
+ "50295": {
348
+ "content": "[unused10]",
349
+ "lstrip": false,
350
+ "normalized": true,
351
+ "rstrip": false,
352
+ "single_word": false,
353
+ "special": false
354
+ },
355
+ "50296": {
356
+ "content": "[unused11]",
357
+ "lstrip": false,
358
+ "normalized": true,
359
+ "rstrip": false,
360
+ "single_word": false,
361
+ "special": false
362
+ },
363
+ "50297": {
364
+ "content": "[unused12]",
365
+ "lstrip": false,
366
+ "normalized": true,
367
+ "rstrip": false,
368
+ "single_word": false,
369
+ "special": false
370
+ },
371
+ "50298": {
372
+ "content": "[unused13]",
373
+ "lstrip": false,
374
+ "normalized": true,
375
+ "rstrip": false,
376
+ "single_word": false,
377
+ "special": false
378
+ },
379
+ "50299": {
380
+ "content": "[unused14]",
381
+ "lstrip": false,
382
+ "normalized": true,
383
+ "rstrip": false,
384
+ "single_word": false,
385
+ "special": false
386
+ },
387
+ "50300": {
388
+ "content": "[unused15]",
389
+ "lstrip": false,
390
+ "normalized": true,
391
+ "rstrip": false,
392
+ "single_word": false,
393
+ "special": false
394
+ },
395
+ "50301": {
396
+ "content": "[unused16]",
397
+ "lstrip": false,
398
+ "normalized": true,
399
+ "rstrip": false,
400
+ "single_word": false,
401
+ "special": false
402
+ },
403
+ "50302": {
404
+ "content": "[unused17]",
405
+ "lstrip": false,
406
+ "normalized": true,
407
+ "rstrip": false,
408
+ "single_word": false,
409
+ "special": false
410
+ },
411
+ "50303": {
412
+ "content": "[unused18]",
413
+ "lstrip": false,
414
+ "normalized": true,
415
+ "rstrip": false,
416
+ "single_word": false,
417
+ "special": false
418
+ },
419
+ "50304": {
420
+ "content": "[unused19]",
421
+ "lstrip": false,
422
+ "normalized": true,
423
+ "rstrip": false,
424
+ "single_word": false,
425
+ "special": false
426
+ },
427
+ "50305": {
428
+ "content": "[unused20]",
429
+ "lstrip": false,
430
+ "normalized": true,
431
+ "rstrip": false,
432
+ "single_word": false,
433
+ "special": false
434
+ },
435
+ "50306": {
436
+ "content": "[unused21]",
437
+ "lstrip": false,
438
+ "normalized": true,
439
+ "rstrip": false,
440
+ "single_word": false,
441
+ "special": false
442
+ },
443
+ "50307": {
444
+ "content": "[unused22]",
445
+ "lstrip": false,
446
+ "normalized": true,
447
+ "rstrip": false,
448
+ "single_word": false,
449
+ "special": false
450
+ },
451
+ "50308": {
452
+ "content": "[unused23]",
453
+ "lstrip": false,
454
+ "normalized": true,
455
+ "rstrip": false,
456
+ "single_word": false,
457
+ "special": false
458
+ },
459
+ "50309": {
460
+ "content": "[unused24]",
461
+ "lstrip": false,
462
+ "normalized": true,
463
+ "rstrip": false,
464
+ "single_word": false,
465
+ "special": false
466
+ },
467
+ "50310": {
468
+ "content": "[unused25]",
469
+ "lstrip": false,
470
+ "normalized": true,
471
+ "rstrip": false,
472
+ "single_word": false,
473
+ "special": false
474
+ },
475
+ "50311": {
476
+ "content": "[unused26]",
477
+ "lstrip": false,
478
+ "normalized": true,
479
+ "rstrip": false,
480
+ "single_word": false,
481
+ "special": false
482
+ },
483
+ "50312": {
484
+ "content": "[unused27]",
485
+ "lstrip": false,
486
+ "normalized": true,
487
+ "rstrip": false,
488
+ "single_word": false,
489
+ "special": false
490
+ },
491
+ "50313": {
492
+ "content": "[unused28]",
493
+ "lstrip": false,
494
+ "normalized": true,
495
+ "rstrip": false,
496
+ "single_word": false,
497
+ "special": false
498
+ },
499
+ "50314": {
500
+ "content": "[unused29]",
501
+ "lstrip": false,
502
+ "normalized": true,
503
+ "rstrip": false,
504
+ "single_word": false,
505
+ "special": false
506
+ },
507
+ "50315": {
508
+ "content": "[unused30]",
509
+ "lstrip": false,
510
+ "normalized": true,
511
+ "rstrip": false,
512
+ "single_word": false,
513
+ "special": false
514
+ },
515
+ "50316": {
516
+ "content": "[unused31]",
517
+ "lstrip": false,
518
+ "normalized": true,
519
+ "rstrip": false,
520
+ "single_word": false,
521
+ "special": false
522
+ },
523
+ "50317": {
524
+ "content": "[unused32]",
525
+ "lstrip": false,
526
+ "normalized": true,
527
+ "rstrip": false,
528
+ "single_word": false,
529
+ "special": false
530
+ },
531
+ "50318": {
532
+ "content": "[unused33]",
533
+ "lstrip": false,
534
+ "normalized": true,
535
+ "rstrip": false,
536
+ "single_word": false,
537
+ "special": false
538
+ },
539
+ "50319": {
540
+ "content": "[unused34]",
541
+ "lstrip": false,
542
+ "normalized": true,
543
+ "rstrip": false,
544
+ "single_word": false,
545
+ "special": false
546
+ },
547
+ "50320": {
548
+ "content": "[unused35]",
549
+ "lstrip": false,
550
+ "normalized": true,
551
+ "rstrip": false,
552
+ "single_word": false,
553
+ "special": false
554
+ },
555
+ "50321": {
556
+ "content": "[unused36]",
557
+ "lstrip": false,
558
+ "normalized": true,
559
+ "rstrip": false,
560
+ "single_word": false,
561
+ "special": false
562
+ },
563
+ "50322": {
564
+ "content": "[unused37]",
565
+ "lstrip": false,
566
+ "normalized": true,
567
+ "rstrip": false,
568
+ "single_word": false,
569
+ "special": false
570
+ },
571
+ "50323": {
572
+ "content": "[unused38]",
573
+ "lstrip": false,
574
+ "normalized": true,
575
+ "rstrip": false,
576
+ "single_word": false,
577
+ "special": false
578
+ },
579
+ "50324": {
580
+ "content": "[unused39]",
581
+ "lstrip": false,
582
+ "normalized": true,
583
+ "rstrip": false,
584
+ "single_word": false,
585
+ "special": false
586
+ },
587
+ "50325": {
588
+ "content": "[unused40]",
589
+ "lstrip": false,
590
+ "normalized": true,
591
+ "rstrip": false,
592
+ "single_word": false,
593
+ "special": false
594
+ },
595
+ "50326": {
596
+ "content": "[unused41]",
597
+ "lstrip": false,
598
+ "normalized": true,
599
+ "rstrip": false,
600
+ "single_word": false,
601
+ "special": false
602
+ },
603
+ "50327": {
604
+ "content": "[unused42]",
605
+ "lstrip": false,
606
+ "normalized": true,
607
+ "rstrip": false,
608
+ "single_word": false,
609
+ "special": false
610
+ },
611
+ "50328": {
612
+ "content": "[unused43]",
613
+ "lstrip": false,
614
+ "normalized": true,
615
+ "rstrip": false,
616
+ "single_word": false,
617
+ "special": false
618
+ },
619
+ "50329": {
620
+ "content": "[unused44]",
621
+ "lstrip": false,
622
+ "normalized": true,
623
+ "rstrip": false,
624
+ "single_word": false,
625
+ "special": false
626
+ },
627
+ "50330": {
628
+ "content": "[unused45]",
629
+ "lstrip": false,
630
+ "normalized": true,
631
+ "rstrip": false,
632
+ "single_word": false,
633
+ "special": false
634
+ },
635
+ "50331": {
636
+ "content": "[unused46]",
637
+ "lstrip": false,
638
+ "normalized": true,
639
+ "rstrip": false,
640
+ "single_word": false,
641
+ "special": false
642
+ },
643
+ "50332": {
644
+ "content": "[unused47]",
645
+ "lstrip": false,
646
+ "normalized": true,
647
+ "rstrip": false,
648
+ "single_word": false,
649
+ "special": false
650
+ },
651
+ "50333": {
652
+ "content": "[unused48]",
653
+ "lstrip": false,
654
+ "normalized": true,
655
+ "rstrip": false,
656
+ "single_word": false,
657
+ "special": false
658
+ },
659
+ "50334": {
660
+ "content": "[unused49]",
661
+ "lstrip": false,
662
+ "normalized": true,
663
+ "rstrip": false,
664
+ "single_word": false,
665
+ "special": false
666
+ },
667
+ "50335": {
668
+ "content": "[unused50]",
669
+ "lstrip": false,
670
+ "normalized": true,
671
+ "rstrip": false,
672
+ "single_word": false,
673
+ "special": false
674
+ },
675
+ "50336": {
676
+ "content": "[unused51]",
677
+ "lstrip": false,
678
+ "normalized": true,
679
+ "rstrip": false,
680
+ "single_word": false,
681
+ "special": false
682
+ },
683
+ "50337": {
684
+ "content": "[unused52]",
685
+ "lstrip": false,
686
+ "normalized": true,
687
+ "rstrip": false,
688
+ "single_word": false,
689
+ "special": false
690
+ },
691
+ "50338": {
692
+ "content": "[unused53]",
693
+ "lstrip": false,
694
+ "normalized": true,
695
+ "rstrip": false,
696
+ "single_word": false,
697
+ "special": false
698
+ },
699
+ "50339": {
700
+ "content": "[unused54]",
701
+ "lstrip": false,
702
+ "normalized": true,
703
+ "rstrip": false,
704
+ "single_word": false,
705
+ "special": false
706
+ },
707
+ "50340": {
708
+ "content": "[unused55]",
709
+ "lstrip": false,
710
+ "normalized": true,
711
+ "rstrip": false,
712
+ "single_word": false,
713
+ "special": false
714
+ },
715
+ "50341": {
716
+ "content": "[unused56]",
717
+ "lstrip": false,
718
+ "normalized": true,
719
+ "rstrip": false,
720
+ "single_word": false,
721
+ "special": false
722
+ },
723
+ "50342": {
724
+ "content": "[unused57]",
725
+ "lstrip": false,
726
+ "normalized": true,
727
+ "rstrip": false,
728
+ "single_word": false,
729
+ "special": false
730
+ },
731
+ "50343": {
732
+ "content": "[unused58]",
733
+ "lstrip": false,
734
+ "normalized": true,
735
+ "rstrip": false,
736
+ "single_word": false,
737
+ "special": false
738
+ },
739
+ "50344": {
740
+ "content": "[unused59]",
741
+ "lstrip": false,
742
+ "normalized": true,
743
+ "rstrip": false,
744
+ "single_word": false,
745
+ "special": false
746
+ },
747
+ "50345": {
748
+ "content": "[unused60]",
749
+ "lstrip": false,
750
+ "normalized": true,
751
+ "rstrip": false,
752
+ "single_word": false,
753
+ "special": false
754
+ },
755
+ "50346": {
756
+ "content": "[unused61]",
757
+ "lstrip": false,
758
+ "normalized": true,
759
+ "rstrip": false,
760
+ "single_word": false,
761
+ "special": false
762
+ },
763
+ "50347": {
764
+ "content": "[unused62]",
765
+ "lstrip": false,
766
+ "normalized": true,
767
+ "rstrip": false,
768
+ "single_word": false,
769
+ "special": false
770
+ },
771
+ "50348": {
772
+ "content": "[unused63]",
773
+ "lstrip": false,
774
+ "normalized": true,
775
+ "rstrip": false,
776
+ "single_word": false,
777
+ "special": false
778
+ },
779
+ "50349": {
780
+ "content": "[unused64]",
781
+ "lstrip": false,
782
+ "normalized": true,
783
+ "rstrip": false,
784
+ "single_word": false,
785
+ "special": false
786
+ },
787
+ "50350": {
788
+ "content": "[unused65]",
789
+ "lstrip": false,
790
+ "normalized": true,
791
+ "rstrip": false,
792
+ "single_word": false,
793
+ "special": false
794
+ },
795
+ "50351": {
796
+ "content": "[unused66]",
797
+ "lstrip": false,
798
+ "normalized": true,
799
+ "rstrip": false,
800
+ "single_word": false,
801
+ "special": false
802
+ },
803
+ "50352": {
804
+ "content": "[unused67]",
805
+ "lstrip": false,
806
+ "normalized": true,
807
+ "rstrip": false,
808
+ "single_word": false,
809
+ "special": false
810
+ },
811
+ "50353": {
812
+ "content": "[unused68]",
813
+ "lstrip": false,
814
+ "normalized": true,
815
+ "rstrip": false,
816
+ "single_word": false,
817
+ "special": false
818
+ },
819
+ "50354": {
820
+ "content": "[unused69]",
821
+ "lstrip": false,
822
+ "normalized": true,
823
+ "rstrip": false,
824
+ "single_word": false,
825
+ "special": false
826
+ },
827
+ "50355": {
828
+ "content": "[unused70]",
829
+ "lstrip": false,
830
+ "normalized": true,
831
+ "rstrip": false,
832
+ "single_word": false,
833
+ "special": false
834
+ },
835
+ "50356": {
836
+ "content": "[unused71]",
837
+ "lstrip": false,
838
+ "normalized": true,
839
+ "rstrip": false,
840
+ "single_word": false,
841
+ "special": false
842
+ },
843
+ "50357": {
844
+ "content": "[unused72]",
845
+ "lstrip": false,
846
+ "normalized": true,
847
+ "rstrip": false,
848
+ "single_word": false,
849
+ "special": false
850
+ },
851
+ "50358": {
852
+ "content": "[unused73]",
853
+ "lstrip": false,
854
+ "normalized": true,
855
+ "rstrip": false,
856
+ "single_word": false,
857
+ "special": false
858
+ },
859
+ "50359": {
860
+ "content": "[unused74]",
861
+ "lstrip": false,
862
+ "normalized": true,
863
+ "rstrip": false,
864
+ "single_word": false,
865
+ "special": false
866
+ },
867
+ "50360": {
868
+ "content": "[unused75]",
869
+ "lstrip": false,
870
+ "normalized": true,
871
+ "rstrip": false,
872
+ "single_word": false,
873
+ "special": false
874
+ },
875
+ "50361": {
876
+ "content": "[unused76]",
877
+ "lstrip": false,
878
+ "normalized": true,
879
+ "rstrip": false,
880
+ "single_word": false,
881
+ "special": false
882
+ },
883
+ "50362": {
884
+ "content": "[unused77]",
885
+ "lstrip": false,
886
+ "normalized": true,
887
+ "rstrip": false,
888
+ "single_word": false,
889
+ "special": false
890
+ },
891
+ "50363": {
892
+ "content": "[unused78]",
893
+ "lstrip": false,
894
+ "normalized": true,
895
+ "rstrip": false,
896
+ "single_word": false,
897
+ "special": false
898
+ },
899
+ "50364": {
900
+ "content": "[unused79]",
901
+ "lstrip": false,
902
+ "normalized": true,
903
+ "rstrip": false,
904
+ "single_word": false,
905
+ "special": false
906
+ },
907
+ "50365": {
908
+ "content": "[unused80]",
909
+ "lstrip": false,
910
+ "normalized": true,
911
+ "rstrip": false,
912
+ "single_word": false,
913
+ "special": false
914
+ },
915
+ "50366": {
916
+ "content": "[unused81]",
917
+ "lstrip": false,
918
+ "normalized": true,
919
+ "rstrip": false,
920
+ "single_word": false,
921
+ "special": false
922
+ },
923
+ "50367": {
924
+ "content": "[unused82]",
925
+ "lstrip": false,
926
+ "normalized": true,
927
+ "rstrip": false,
928
+ "single_word": false,
929
+ "special": false
930
+ }
931
+ },
932
+ "clean_up_tokenization_spaces": true,
933
+ "cls_token": "[CLS]",
934
+ "extra_special_tokens": {},
935
+ "mask_token": "[MASK]",
936
+ "model_input_names": [
937
+ "input_ids",
938
+ "attention_mask"
939
+ ],
940
+ "model_max_length": 1000000000000000019884624838656,
941
+ "pad_token": "[PAD]",
942
+ "sep_token": "[SEP]",
943
+ "tokenizer_class": "PreTrainedTokenizerFast",
944
+ "unk_token": "[UNK]"
945
+ }