---
language:
- en
license: apache-2.0
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:6300
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
base_model: BAAI/bge-base-en-v1.5
widget:
- source_sentence: What year do the patent families related to DARZALEX expire in
the United States?
sentences:
- Amortization for owned content predominantly monetized on an individual basis
and accrued costs associated with participations and residuals payments are recorded
using the individual film forecast computation method, which recognizes the costs
in the same ratio as the associated ultimate revenue.
- The two patent families both expire in the United States in 2029.
- For the year ended December 31, 2022, net cash used in investing activities of
$371.9 million was primarily from the purchase of $247.3 million marketable securities,
net of sale and maturities, $62.2 million net cash used to acquire GreenCom, SolarLeadFactory
and ClipperCreek, $46.4 million used in purchases of test and assembly equipment
to expand our supply capacity, related facility improvements and information technology
enhancements, including capitalized costs related to internal-use software and
$16.0 million used to invest in private companies.
- source_sentence: What legal claims does Fortis Advisors LLC allege against Ethicon
Inc. in the lawsuit related to the acquisition of Auris Health Inc.?
sentences:
- Payments include a single lump-sum per treatment, referred to as bundled rates,
or in other cases separate payments for dialysis treatments and pharmaceuticals,
referred to as FFS rates.
- In October 2020, Fortis Advisors LLC filed a complaint against Ethicon Inc. and
others in Delaware's Court of Chancery. The lawsuit alleges breach of contract
and fraud related to Ethicon's acquisition of Auris Health Inc. in 2019. The case
underwent a partial dismissal in December 2021, and as of January 2024, the trial's
decision is pending.
- On September 5, 2023, ICE acquired 100% of Black Knight for aggregate transaction
consideration of approximately $11.8 billion, or $76 per share of Black Knight
common stock, with cash comprising 90% of the value of the aggregate transaction
consideration. The aggregate cash component of the transaction consideration was
$10.5 billion. ICE issued 10.9 million shares of its common stock to Black Knight
stockholders, which was based on the market price of the common stock and the
average of the volume weighted averages of the trading prices of the common stock
on each of the ten consecutive trading days ending three trading days prior to
the closing of the merger.
- source_sentence: What caused the increase in net cash provided by operating activities
between 2022 and 2023?
sentences:
- Net cash provided by operating activities was $712.2 million and $223.7 million
for the year ended December 31, 2023 and 2022, respectively. The increase was
primarily driven by timing of payments to vendors and timing of the receipt of
payments from our customers, as well as an increase in interest income.
- Joanne D. Smith held the position of Vice President - Marketing at Delta from
November 2005 to February 2007.
Experienced management team with a proven track record in the gaming and resort industry.
Mr. Robert G. Goldstein, our Chairman and Chief Executive Officer, has been an
integral part of our executive team from the beginning, joining our founder and
previous Chairman and Chief Executive Officer, Mr. Sheldon G. Adelson, before
The Venetian Resort Las Vegas was constructed. Mr. Goldstein is one of the most
respected and experienced executives in our industry today.
- source_sentence: What does the company believe adds significant value to its business
regarding intellectual property?
sentences:
- In 2022, the net interest expense on pre-acquisition-related debt was $59 million
and additional adjustments included costs of $30 million associated with the May
and June 2022 extinguishment of four series of senior notes.
- Fluctuations in foreign currency exchange rates decreased our consolidated net
operating revenues by 4%.
- We believe that, to varying degrees, our trademarks, trade names, copyrights,
proprietary processes, trade secrets, trade dress, domain names and similar intellectual
property add significant value to our business
- source_sentence: What does it mean for financial statements to be incorporated by
reference?
sentences:
- The consolidated financial statements are incorporated by reference in the Annual
Report on Form 10-K, indicating they are treated as part of the document for legal
and reporting purposes.
- The Consolidated Financial Statements, together with the Notes thereto and the
report thereon dated February 16, 2024, of PricewaterhouseCoopers LLP, the Firm’s
independent registered public accounting firm (PCAOB ID 238), appear on pages
163–309.
- 'The Goldman Sachs Group, Inc. manages and reports its activities in three business
segments: Global Banking & Markets, Asset & Wealth Management and Platform
Solutions.'
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: BGE base Financial Matryoshka
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 768
type: dim_768
metrics:
- type: cosine_accuracy@1
value: 0.7
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.8285714285714286
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8728571428571429
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.9071428571428571
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.7
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.2761904761904762
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.17457142857142854
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.09071428571428569
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.7
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.8285714285714286
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8728571428571429
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.9071428571428571
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.8045805359515339
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.7714971655328795
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.775178941729297
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 512
type: dim_512
metrics:
- type: cosine_accuracy@1
value: 0.7014285714285714
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.83
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8671428571428571
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.9042857142857142
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.7014285714285714
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.27666666666666667
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.1734285714285714
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.09042857142857141
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.7014285714285714
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.83
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8671428571428571
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.9042857142857142
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.8036464537429646
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.771175736961451
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.7751075563277001
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 256
type: dim_256
metrics:
- type: cosine_accuracy@1
value: 0.6928571428571428
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.8185714285714286
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8628571428571429
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.8971428571428571
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.6928571428571428
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.27285714285714285
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.17257142857142854
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.0897142857142857
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.6928571428571428
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.8185714285714286
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8628571428571429
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.8971428571428571
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.7963364154792727
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.7638741496598634
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.7683107318753077
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 128
type: dim_128
metrics:
- type: cosine_accuracy@1
value: 0.6771428571428572
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.8142857142857143
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8514285714285714
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.8885714285714286
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.6771428571428572
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.2714285714285714
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.17028571428571426
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.08885714285714284
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.6771428571428572
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.8142857142857143
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8514285714285714
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.8885714285714286
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.786332288682679
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.7531507936507934
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.7576033800206036
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 64
type: dim_64
metrics:
- type: cosine_accuracy@1
value: 0.6571428571428571
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.7814285714285715
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8171428571428572
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.86
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.6571428571428571
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.2604761904761905
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.16342857142857142
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.08599999999999998
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.6571428571428571
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.7814285714285715
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8171428571428572
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.86
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.7602042820067257
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.7281371882086165
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.7334805218687248
name: Cosine Map@100
---
# BGE base Financial Matryoshka
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) on the json dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5)
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- json
- **Language:** en
- **License:** apache-2.0
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
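Because the pipeline ends in a `Normalize()` module, every embedding is unit-length, so cosine similarity reduces to a plain dot product. A quick sanity check of this property (a minimal sketch, not part of the training code):
```python
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Fe2x/bge-base-financial-matryoshka")
emb = model.encode(["What year do the patent families related to DARZALEX expire?"])

# The final Normalize() module makes each vector unit-length
print(np.linalg.norm(emb[0]))  # ~1.0
```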
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("Fe2x/bge-base-financial-matryoshka")
# Run inference
sentences = [
'What does it mean for financial statements to be incorporated by reference?',
'The consolidated financial statements are incorporated by reference in the Annual Report on Form 10-K, indicating they are treated as part of the document for legal and reporting purposes.',
'The Consolidated Financial Statements, together with the Notes thereto and the report thereon dated February 16, 2024, of PricewaterhouseCoopers LLP, the Firm’s independent registered public accounting firm (PCAOB ID 238), appear on pages 163–309.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
```
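Because the model was trained with a Matryoshka objective (see the evaluation below), it can also be loaded with embeddings truncated to any of the trained dimensions via the `truncate_dim` argument of the Sentence Transformers API:
```python
from sentence_transformers import SentenceTransformer

# Keep only the first 256 of the 768 dimensions
model = SentenceTransformer("Fe2x/bge-base-financial-matryoshka", truncate_dim=256)

embeddings = model.encode(["What does it mean for financial statements to be incorporated by reference?"])
print(embeddings.shape)
# (1, 256)
```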
## Evaluation
### Metrics
#### Information Retrieval
* Datasets: `dim_768`, `dim_512`, `dim_256`, `dim_128` and `dim_64`
* Evaluated with [InformationRetrievalEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | dim_768 | dim_512 | dim_256 | dim_128 | dim_64 |
|:--------------------|:-----------|:-----------|:-----------|:-----------|:-----------|
| cosine_accuracy@1 | 0.7 | 0.7014 | 0.6929 | 0.6771 | 0.6571 |
| cosine_accuracy@3 | 0.8286 | 0.83 | 0.8186 | 0.8143 | 0.7814 |
| cosine_accuracy@5 | 0.8729 | 0.8671 | 0.8629 | 0.8514 | 0.8171 |
| cosine_accuracy@10 | 0.9071 | 0.9043 | 0.8971 | 0.8886 | 0.86 |
| cosine_precision@1 | 0.7 | 0.7014 | 0.6929 | 0.6771 | 0.6571 |
| cosine_precision@3 | 0.2762 | 0.2767 | 0.2729 | 0.2714 | 0.2605 |
| cosine_precision@5 | 0.1746 | 0.1734 | 0.1726 | 0.1703 | 0.1634 |
| cosine_precision@10 | 0.0907 | 0.0904 | 0.0897 | 0.0889 | 0.086 |
| cosine_recall@1 | 0.7 | 0.7014 | 0.6929 | 0.6771 | 0.6571 |
| cosine_recall@3 | 0.8286 | 0.83 | 0.8186 | 0.8143 | 0.7814 |
| cosine_recall@5 | 0.8729 | 0.8671 | 0.8629 | 0.8514 | 0.8171 |
| cosine_recall@10 | 0.9071 | 0.9043 | 0.8971 | 0.8886 | 0.86 |
| **cosine_ndcg@10** | **0.8046** | **0.8036** | **0.7963** | **0.7863** | **0.7602** |
| cosine_mrr@10 | 0.7715 | 0.7712 | 0.7639 | 0.7532 | 0.7281 |
| cosine_map@100 | 0.7752 | 0.7751 | 0.7683 | 0.7576 | 0.7335 |
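The per-dimension figures above come from running the same retrieval evaluation at each Matryoshka dimension. A hedged sketch of how such an evaluation could be reproduced (the `queries`, `corpus`, and `relevant_docs` dictionaries here are illustrative placeholders, not the actual evaluation split):
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator, SequentialEvaluator

model = SentenceTransformer("Fe2x/bge-base-financial-matryoshka")

# Illustrative evaluation data: id -> text, plus query id -> relevant corpus ids
queries = {"q1": "What year do the patent families related to DARZALEX expire in the United States?"}
corpus = {"d1": "The two patent families both expire in the United States in 2029."}
relevant_docs = {"q1": {"d1"}}

# One evaluator per truncated dimension, matching the columns above
evaluators = [
    InformationRetrievalEvaluator(
        queries=queries,
        corpus=corpus,
        relevant_docs=relevant_docs,
        name=f"dim_{dim}",
        truncate_dim=dim,  # evaluate with embeddings truncated to `dim`
    )
    for dim in [768, 512, 256, 128, 64]
]
results = SequentialEvaluator(evaluators)(model)
```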
## Training Details
### Training Dataset
#### json
* Dataset: json
* Size: 6,300 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
  |      | anchor | positive |
  |:-----|:-------|:---------|
  | type | string | string   |
* Samples:
  | anchor | positive |
  |:-------|:---------|
  | <code>What was the amount of cash generated from operations by the company in fiscal year 2023?</code> | <code>Highlights during fiscal year 2023 include the following: We generated $18,085 million of cash from operations.</code> |
  | <code>How much were unrealized losses on U.S. government and agency securities for those held for 12 months or greater as of June 30, 2023?</code> | <code>U.S. government and agency securities \| $ \| 7,950 \| \| $ \| (336 \| ) \| $ \| 45,273 \| $ \| (3,534 \| ) \| $ \| 53,223 \| $ \| (3,870 \| )</code> |
  | <code>How is the impairment of assets assessed for projects still under development?</code> | <code>For assets under development, assets are grouped and assessed for impairment by estimating the undiscounted cash flows, which include remaining construction costs, over the asset's remaining useful life. If cash flows do not exceed the carrying amount, impairment based on fair value versus carrying value is considered.</code> |
* Loss: [MatryoshkaLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
768,
512,
256,
128,
64
],
"matryoshka_weights": [
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
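This configuration corresponds to wrapping `MultipleNegativesRankingLoss` in `MatryoshkaLoss`, so the same in-batch-negatives objective is applied at every truncated dimension with equal weight. A minimal construction sketch:
```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("BAAI/bge-base-en-v1.5")

# In-batch negatives over (anchor, positive) pairs
inner_loss = MultipleNegativesRankingLoss(model)

# Apply the same objective at each truncated dimension, all weighted 1
loss = MatryoshkaLoss(
    model,
    inner_loss,
    matryoshka_dims=[768, 512, 256, 128, 64],
    matryoshka_weights=[1, 1, 1, 1, 1],
)
```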
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: epoch
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `gradient_accumulation_steps`: 16
- `learning_rate`: 2e-05
- `num_train_epochs`: 4
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `fp16`: True
- `tf32`: False
- `load_best_model_at_end`: True
- `optim`: adamw_torch_fused
- `batch_sampler`: no_duplicates
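For readers reproducing the run, the values above map directly onto `SentenceTransformerTrainingArguments`. A sketch of the corresponding configuration (`output_dir` and `save_strategy` are assumptions, not listed above; `save_strategy` must match `eval_strategy` for `load_best_model_at_end` to work):
```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="bge-base-financial-matryoshka",  # assumed output path
    eval_strategy="epoch",
    save_strategy="epoch",  # assumed, to align checkpoints with per-epoch evaluation
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,
    learning_rate=2e-5,
    num_train_epochs=4,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    fp16=True,
    tf32=False,
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    # Avoid duplicate samples within a batch (important for in-batch negatives)
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```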
#### All Hyperparameters