datasetId | author | last_modified | downloads | likes | tags | task_categories | createdAt | card
---|---|---|---|---|---|---|---|---|
anaselgourch/AnaSight | anaselgourch | "2024-03-29T21:33:07Z" | 30 | 0 | [
"license:apache-2.0",
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-29T18:16:58Z" | ---
license: apache-2.0
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 25146.527777777777
num_examples: 75
- name: test
num_bytes: 11064.472222222223
num_examples: 33
download_size: 28644
dataset_size: 36211.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for AnaSight
<!-- Provide a quick summary of the dataset. -->
This dataset card was generated from [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** ANAS El-GOURCH
- **License:** apache-2.0
### Dataset Sources [optional]
- **Repository:** AnaSight
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Dataset Creation
I created this dataset to train my LLM (Mistral 7B) to answer questions and hold conversations.
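As a quick reference, here is a minimal sketch (not an official example) of how the train split could be loaded with the `datasets` library; the repository id and the `input`/`output` columns are taken from the YAML header above:

```python
from datasets import load_dataset

# Load the train split: 75 input/output string pairs per the YAML metadata above.
ds = load_dataset("anaselgourch/AnaSight", split="train")

# Inspect the first example's prompt and expected response.
print(ds[0]["input"])
print(ds[0]["output"])
```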
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
Personal data.

Thank you.
|
Neel-Gupta/minipile-processed_1024 | Neel-Gupta | "2024-03-29T19:41:33Z" | 30 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-29T19:36:45Z" | ---
dataset_info:
features:
- name: text
sequence:
sequence:
sequence: int64
splits:
- name: train
num_bytes: 16663470768
num_examples: 1323
- name: test
num_bytes: 125952160
num_examples: 10
download_size: 1643767974
dataset_size: 16789422928
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Alpaca69B/reviews_appstore_teladoc_absa | Alpaca69B | "2024-04-20T20:35:31Z" | 30 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-29T19:43:10Z" | ---
dataset_info:
features:
- name: title
dtype: string
- name: content
dtype: string
- name: category
dtype: string
- name: aspect
dtype: string
- name: sentiment
dtype: string
- name: combined
dtype: string
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 813264
num_examples: 347
- name: validation
num_bytes: 170917
num_examples: 74
- name: test
num_bytes: 183842
num_examples: 75
download_size: 2674961
dataset_size: 1168022.9999999998
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard-old/details_hongzoh__Yi-6B_Open-Platypus-v2 | open-llm-leaderboard-old | "2024-03-29T20:37:50Z" | 30 | 0 | [
"region:us"
] | null | "2024-03-29T20:37:20Z" | ---
pretty_name: Evaluation run of hongzoh/Yi-6B_Open-Platypus-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [hongzoh/Yi-6B_Open-Platypus-v2](https://huggingface.co/hongzoh/Yi-6B_Open-Platypus-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hongzoh__Yi-6B_Open-Platypus-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T20:35:06.410961](https://huggingface.co/datasets/open-llm-leaderboard/details_hongzoh__Yi-6B_Open-Platypus-v2/blob/main/results_2024-03-29T20-35-06.410961.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5682981374820366,\n\
\ \"acc_stderr\": 0.03331728566573774,\n \"acc_norm\": 0.5770899804690629,\n\
\ \"acc_norm_stderr\": 0.03405502121628935,\n \"mc1\": 0.27050183598531213,\n\
\ \"mc1_stderr\": 0.015550778332842892,\n \"mc2\": 0.42338274205343635,\n\
\ \"mc2_stderr\": 0.014268690462127283\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4453924914675768,\n \"acc_stderr\": 0.014523987638344085,\n\
\ \"acc_norm\": 0.4991467576791809,\n \"acc_norm_stderr\": 0.014611369529813279\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5151364270065724,\n\
\ \"acc_stderr\": 0.004987494455523726,\n \"acc_norm\": 0.72176857199761,\n\
\ \"acc_norm_stderr\": 0.004472121485161911\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.5407407407407407,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.04033565667848319,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.04033565667848319\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04016660030451233\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006716,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006716\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n\
\ \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159795,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159795\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7096774193548387,\n\
\ \"acc_stderr\": 0.0258221061194159,\n \"acc_norm\": 0.7096774193548387,\n\
\ \"acc_norm_stderr\": 0.0258221061194159\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419871,\n\
\ \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419871\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0368105086916155,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0368105086916155\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124495,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124495\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7823834196891192,\n \"acc_stderr\": 0.02977866303775296,\n\
\ \"acc_norm\": 0.7823834196891192,\n \"acc_norm_stderr\": 0.02977866303775296\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5846153846153846,\n \"acc_stderr\": 0.02498535492310234,\n \
\ \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.02498535492310234\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275794,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275794\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.017030719339154343,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.017030719339154343\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6813725490196079,\n \"acc_stderr\": 0.032702871814820816,\n \"\
acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.032702871814820816\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598035,\n \
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598035\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.04039314978724561,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.04039314978724561\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.041391127276354626,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.041391127276354626\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.046355501356099754,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.046355501356099754\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489298,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489298\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7266922094508301,\n\
\ \"acc_stderr\": 0.015936681062628556,\n \"acc_norm\": 0.7266922094508301,\n\
\ \"acc_norm_stderr\": 0.015936681062628556\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.025816756791584183,\n\
\ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.025816756791584183\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217892,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217892\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.028110928492809068,\n\
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.028110928492809068\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\
\ \"acc_stderr\": 0.02698147804364804,\n \"acc_norm\": 0.6559485530546624,\n\
\ \"acc_norm_stderr\": 0.02698147804364804\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271143,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271143\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.438722294654498,\n\
\ \"acc_stderr\": 0.012673969883493274,\n \"acc_norm\": 0.438722294654498,\n\
\ \"acc_norm_stderr\": 0.012673969883493274\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4742647058823529,\n \"acc_stderr\": 0.030332578094555033,\n\
\ \"acc_norm\": 0.4742647058823529,\n \"acc_norm_stderr\": 0.030332578094555033\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5800653594771242,\n \"acc_stderr\": 0.019966811178256483,\n \
\ \"acc_norm\": 0.5800653594771242,\n \"acc_norm_stderr\": 0.019966811178256483\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547724,\n\
\ \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547724\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n\
\ \"acc_stderr\": 0.028996909693328913,\n \"acc_norm\": 0.7860696517412935,\n\
\ \"acc_norm_stderr\": 0.028996909693328913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27050183598531213,\n\
\ \"mc1_stderr\": 0.015550778332842892,\n \"mc2\": 0.42338274205343635,\n\
\ \"mc2_stderr\": 0.014268690462127283\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7198105761641673,\n \"acc_stderr\": 0.012621707979798499\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15845337376800606,\n \
\ \"acc_stderr\": 0.010058474790238955\n }\n}\n```"
repo_url: https://huggingface.co/hongzoh/Yi-6B_Open-Platypus-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: [email protected]
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|arc:challenge|25_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|gsm8k|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hellaswag|10_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-35-06.410961.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T20-35-06.410961.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- '**/details_harness|winogrande|5_2024-03-29T20-35-06.410961.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T20-35-06.410961.parquet'
- config_name: results
data_files:
- split: 2024_03_29T20_35_06.410961
path:
- results_2024-03-29T20-35-06.410961.parquet
- split: latest
path:
- results_2024-03-29T20-35-06.410961.parquet
---
# Dataset Card for Evaluation run of hongzoh/Yi-6B_Open-Platypus-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [hongzoh/Yi-6B_Open-Platypus-v2](https://huggingface.co/hongzoh/Yi-6B_Open-Platypus-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_hongzoh__Yi-6B_Open-Platypus-v2",
"harness_winogrande_5",
split="train")
```
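The YAML configs above also define a `results` configuration whose `latest` split points at the aggregated metrics file, so a minimal sketch for loading the aggregated results (assuming the same repository id) would be:

```python
from datasets import load_dataset

# Aggregated run metrics; the "latest" split tracks the newest results parquet.
results = load_dataset(
    "open-llm-leaderboard/details_hongzoh__Yi-6B_Open-Platypus-v2",
    "results",
    split="latest",
)
```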
## Latest results
These are the [latest results from run 2024-03-29T20:35:06.410961](https://huggingface.co/datasets/open-llm-leaderboard/details_hongzoh__Yi-6B_Open-Platypus-v2/blob/main/results_2024-03-29T20-35-06.410961.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5682981374820366,
"acc_stderr": 0.03331728566573774,
"acc_norm": 0.5770899804690629,
"acc_norm_stderr": 0.03405502121628935,
"mc1": 0.27050183598531213,
"mc1_stderr": 0.015550778332842892,
"mc2": 0.42338274205343635,
"mc2_stderr": 0.014268690462127283
},
"harness|arc:challenge|25": {
"acc": 0.4453924914675768,
"acc_stderr": 0.014523987638344085,
"acc_norm": 0.4991467576791809,
"acc_norm_stderr": 0.014611369529813279
},
"harness|hellaswag|10": {
"acc": 0.5151364270065724,
"acc_stderr": 0.004987494455523726,
"acc_norm": 0.72176857199761,
"acc_norm_stderr": 0.004472121485161911
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.04033565667848319,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.04033565667848319
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04016660030451233,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04016660030451233
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006716,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006716
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159795,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159795
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7096774193548387,
"acc_stderr": 0.0258221061194159,
"acc_norm": 0.7096774193548387,
"acc_norm_stderr": 0.0258221061194159
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.03486731727419871,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.03486731727419871
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0368105086916155,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0368105086916155
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124495,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124495
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7823834196891192,
"acc_stderr": 0.02977866303775296,
"acc_norm": 0.7823834196891192,
"acc_norm_stderr": 0.02977866303775296
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5846153846153846,
"acc_stderr": 0.02498535492310234,
"acc_norm": 0.5846153846153846,
"acc_norm_stderr": 0.02498535492310234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275794,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275794
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.017030719339154343,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.017030719339154343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.032702871814820816,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.032702871814820816
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.04039314978724561,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.04039314978724561
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.046355501356099754,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.046355501356099754
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489298,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489298
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7266922094508301,
"acc_stderr": 0.015936681062628556,
"acc_norm": 0.7266922094508301,
"acc_norm_stderr": 0.015936681062628556
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.025816756791584183,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.025816756791584183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217892,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217892
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.028110928492809068,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.028110928492809068
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.02698147804364804,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.02698147804364804
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271143,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271143
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.438722294654498,
"acc_stderr": 0.012673969883493274,
"acc_norm": 0.438722294654498,
"acc_norm_stderr": 0.012673969883493274
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4742647058823529,
"acc_stderr": 0.030332578094555033,
"acc_norm": 0.4742647058823529,
"acc_norm_stderr": 0.030332578094555033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5800653594771242,
"acc_stderr": 0.019966811178256483,
"acc_norm": 0.5800653594771242,
"acc_norm_stderr": 0.019966811178256483
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.030387262919547724,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.030387262919547724
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7860696517412935,
"acc_stderr": 0.028996909693328913,
"acc_norm": 0.7860696517412935,
"acc_norm_stderr": 0.028996909693328913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27050183598531213,
"mc1_stderr": 0.015550778332842892,
"mc2": 0.42338274205343635,
"mc2_stderr": 0.014268690462127283
},
"harness|winogrande|5": {
"acc": 0.7198105761641673,
"acc_stderr": 0.012621707979798499
},
"harness|gsm8k|5": {
"acc": 0.15845337376800606,
"acc_stderr": 0.010058474790238955
}
}
```
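The per-task metrics above are plain JSON, so they are easy to inspect programmatically. A minimal sketch (assuming `results_text` holds the JSON block above as a string; the variable name is illustrative):
```python
import json

# Parse the results JSON shown above and list the accuracy for each task.
# Entries without an "acc" key (e.g. truthfulqa:mc, which reports mc1/mc2) are skipped.
results = json.loads(results_text)
for task, metrics in results.items():
    if "acc" in metrics:
        print(f"{task}: acc={metrics['acc']:.4f}")
```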
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_TheDrummer__Moistral-11B-v2 | open-llm-leaderboard-old | "2024-03-29T20:55:06Z" | 30 | 0 | [
"region:us"
] | null | "2024-03-29T20:54:43Z" | ---
pretty_name: Evaluation run of TheDrummer/Moistral-11B-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheDrummer/Moistral-11B-v2](https://huggingface.co/TheDrummer/Moistral-11B-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheDrummer__Moistral-11B-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T20:52:23.439068](https://huggingface.co/datasets/open-llm-leaderboard/details_TheDrummer__Moistral-11B-v2/blob/main/results_2024-03-29T20-52-23.439068.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3911590963438393,\n\
\ \"acc_stderr\": 0.034035075847598746,\n \"acc_norm\": 0.39672829205259913,\n\
\ \"acc_norm_stderr\": 0.03496091965958944,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.42896944225507627,\n\
\ \"mc2_stderr\": 0.0153908253979801\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4112627986348123,\n \"acc_stderr\": 0.014379441068522078,\n\
\ \"acc_norm\": 0.4513651877133106,\n \"acc_norm_stderr\": 0.014542104569955258\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5332603067118104,\n\
\ \"acc_stderr\": 0.00497872930007489,\n \"acc_norm\": 0.7189802828121888,\n\
\ \"acc_norm_stderr\": 0.004485784468576678\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3881578947368421,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.3881578947368421,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4339622641509434,\n \"acc_stderr\": 0.030503292013342596,\n\
\ \"acc_norm\": 0.4339622641509434,\n \"acc_norm_stderr\": 0.030503292013342596\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n\
\ \"acc_stderr\": 0.04032999053960719,\n \"acc_norm\": 0.3680555555555556,\n\
\ \"acc_norm_stderr\": 0.04032999053960719\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3583815028901734,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.3583815028901734,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.0314108219759624,\n\
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.0314108219759624\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.38620689655172413,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.38620689655172413,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2804232804232804,\n \"acc_stderr\": 0.02313528797432562,\n \"\
acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.02313528797432562\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.45161290322580644,\n \"acc_stderr\": 0.028310500348568378,\n \"\
acc_norm\": 0.45161290322580644,\n \"acc_norm_stderr\": 0.028310500348568378\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868408,\n \"\
acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868408\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5515151515151515,\n \"acc_stderr\": 0.038835659779569286,\n\
\ \"acc_norm\": 0.5515151515151515,\n \"acc_norm_stderr\": 0.038835659779569286\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.0347327959083696,\n \"acc_norm\"\
: 0.3888888888888889,\n \"acc_norm_stderr\": 0.0347327959083696\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.5492227979274611,\n \"acc_stderr\": 0.035909109522355244,\n\
\ \"acc_norm\": 0.5492227979274611,\n \"acc_norm_stderr\": 0.035909109522355244\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3974358974358974,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.3974358974358974,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184407,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184407\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3949579831932773,\n \"acc_stderr\": 0.03175367846096625,\n \
\ \"acc_norm\": 0.3949579831932773,\n \"acc_norm_stderr\": 0.03175367846096625\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804724,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804724\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.4990825688073395,\n \"acc_stderr\": 0.021437287056051215,\n \"\
acc_norm\": 0.4990825688073395,\n \"acc_norm_stderr\": 0.021437287056051215\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.031141447823536037,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.031141447823536037\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.47058823529411764,\n \"acc_stderr\": 0.03503235296367992,\n \"\
acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03503235296367992\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5189873417721519,\n \"acc_stderr\": 0.03252375148090448,\n \
\ \"acc_norm\": 0.5189873417721519,\n \"acc_norm_stderr\": 0.03252375148090448\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4304932735426009,\n\
\ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.4304932735426009,\n\
\ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3511450381679389,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.3511450381679389,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5702479338842975,\n \"acc_stderr\": 0.04519082021319773,\n \"\
acc_norm\": 0.5702479338842975,\n \"acc_norm_stderr\": 0.04519082021319773\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04833682445228318,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04833682445228318\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.39263803680981596,\n \"acc_stderr\": 0.0383674090783103,\n\
\ \"acc_norm\": 0.39263803680981596,\n \"acc_norm_stderr\": 0.0383674090783103\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.49514563106796117,\n \"acc_stderr\": 0.04950504382128921,\n\
\ \"acc_norm\": 0.49514563106796117,\n \"acc_norm_stderr\": 0.04950504382128921\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.03255326307272485,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.03255326307272485\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4878671775223499,\n\
\ \"acc_stderr\": 0.01787469866749134,\n \"acc_norm\": 0.4878671775223499,\n\
\ \"acc_norm_stderr\": 0.01787469866749134\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3988439306358382,\n \"acc_stderr\": 0.026362437574546545,\n\
\ \"acc_norm\": 0.3988439306358382,\n \"acc_norm_stderr\": 0.026362437574546545\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.028275490156791438,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.028275490156791438\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3858520900321543,\n\
\ \"acc_stderr\": 0.02764814959975146,\n \"acc_norm\": 0.3858520900321543,\n\
\ \"acc_norm_stderr\": 0.02764814959975146\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3765432098765432,\n \"acc_stderr\": 0.026959344518747784,\n\
\ \"acc_norm\": 0.3765432098765432,\n \"acc_norm_stderr\": 0.026959344518747784\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3120567375886525,\n \"acc_stderr\": 0.027640120545169927,\n \
\ \"acc_norm\": 0.3120567375886525,\n \"acc_norm_stderr\": 0.027640120545169927\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.32659713168187743,\n\
\ \"acc_stderr\": 0.011977676704715992,\n \"acc_norm\": 0.32659713168187743,\n\
\ \"acc_norm_stderr\": 0.011977676704715992\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.36764705882352944,\n \"acc_stderr\": 0.029289413409403192,\n\
\ \"acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.029289413409403192\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.35130718954248363,\n \"acc_stderr\": 0.01931267606578655,\n \
\ \"acc_norm\": 0.35130718954248363,\n \"acc_norm_stderr\": 0.01931267606578655\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4727272727272727,\n\
\ \"acc_stderr\": 0.04782001791380063,\n \"acc_norm\": 0.4727272727272727,\n\
\ \"acc_norm_stderr\": 0.04782001791380063\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.43673469387755104,\n \"acc_stderr\": 0.031751952375833226,\n\
\ \"acc_norm\": 0.43673469387755104,\n \"acc_norm_stderr\": 0.031751952375833226\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5472636815920398,\n\
\ \"acc_stderr\": 0.03519702717576915,\n \"acc_norm\": 0.5472636815920398,\n\
\ \"acc_norm_stderr\": 0.03519702717576915\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4678362573099415,\n \"acc_stderr\": 0.038268824176603704,\n\
\ \"acc_norm\": 0.4678362573099415,\n \"acc_norm_stderr\": 0.038268824176603704\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.42896944225507627,\n\
\ \"mc2_stderr\": 0.0153908253979801\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6795580110497238,\n \"acc_stderr\": 0.013115085457681712\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/TheDrummer/Moistral-11B-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: [email protected]
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|arc:challenge|25_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|gsm8k|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hellaswag|10_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-52-23.439068.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T20-52-23.439068.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- '**/details_harness|winogrande|5_2024-03-29T20-52-23.439068.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T20-52-23.439068.parquet'
- config_name: results
data_files:
- split: 2024_03_29T20_52_23.439068
path:
- results_2024-03-29T20-52-23.439068.parquet
- split: latest
path:
- results_2024-03-29T20-52-23.439068.parquet
---
# Dataset Card for Evaluation run of TheDrummer/Moistral-11B-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TheDrummer/Moistral-11B-v2](https://huggingface.co/TheDrummer/Moistral-11B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheDrummer__Moistral-11B-v2",
"harness_winogrande_5",
split="train")
```
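The aggregated metrics live in the "results" configuration, and every configuration also exposes a split named after the run timestamp plus a "latest" alias (see the YAML front matter above). For example, a sketch of loading the latest aggregated results:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration at its "latest" split.
results = load_dataset(
    "open-llm-leaderboard/details_TheDrummer__Moistral-11B-v2",
    "results",
    split="latest",
)
```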
## Latest results
These are the [latest results from run 2024-03-29T20:52:23.439068](https://huggingface.co/datasets/open-llm-leaderboard/details_TheDrummer__Moistral-11B-v2/blob/main/results_2024-03-29T20-52-23.439068.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3911590963438393,
"acc_stderr": 0.034035075847598746,
"acc_norm": 0.39672829205259913,
"acc_norm_stderr": 0.03496091965958944,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.42896944225507627,
"mc2_stderr": 0.0153908253979801
},
"harness|arc:challenge|25": {
"acc": 0.4112627986348123,
"acc_stderr": 0.014379441068522078,
"acc_norm": 0.4513651877133106,
"acc_norm_stderr": 0.014542104569955258
},
"harness|hellaswag|10": {
"acc": 0.5332603067118104,
"acc_stderr": 0.00497872930007489,
"acc_norm": 0.7189802828121888,
"acc_norm_stderr": 0.004485784468576678
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3881578947368421,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.3881578947368421,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4339622641509434,
"acc_stderr": 0.030503292013342596,
"acc_norm": 0.4339622641509434,
"acc_norm_stderr": 0.030503292013342596
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3680555555555556,
"acc_stderr": 0.04032999053960719,
"acc_norm": 0.3680555555555556,
"acc_norm_stderr": 0.04032999053960719
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3583815028901734,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.3583815028901734,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.0314108219759624,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.0314108219759624
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.38620689655172413,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.38620689655172413,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.02313528797432562,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.02313528797432562
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.45161290322580644,
"acc_stderr": 0.028310500348568378,
"acc_norm": 0.45161290322580644,
"acc_norm_stderr": 0.028310500348568378
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868408,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868408
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5515151515151515,
"acc_stderr": 0.038835659779569286,
"acc_norm": 0.5515151515151515,
"acc_norm_stderr": 0.038835659779569286
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.0347327959083696,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.0347327959083696
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5492227979274611,
"acc_stderr": 0.035909109522355244,
"acc_norm": 0.5492227979274611,
"acc_norm_stderr": 0.035909109522355244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3974358974358974,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.3974358974358974,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.02659393910184407,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.02659393910184407
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3949579831932773,
"acc_stderr": 0.03175367846096625,
"acc_norm": 0.3949579831932773,
"acc_norm_stderr": 0.03175367846096625
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.03511807571804724,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.03511807571804724
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4990825688073395,
"acc_stderr": 0.021437287056051215,
"acc_norm": 0.4990825688073395,
"acc_norm_stderr": 0.021437287056051215
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.031141447823536037,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.031141447823536037
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03503235296367992,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03503235296367992
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5189873417721519,
"acc_stderr": 0.03252375148090448,
"acc_norm": 0.5189873417721519,
"acc_norm_stderr": 0.03252375148090448
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4304932735426009,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.4304932735426009,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3511450381679389,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.3511450381679389,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5702479338842975,
"acc_stderr": 0.04519082021319773,
"acc_norm": 0.5702479338842975,
"acc_norm_stderr": 0.04519082021319773
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5,
"acc_stderr": 0.04833682445228318,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04833682445228318
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.39263803680981596,
"acc_stderr": 0.0383674090783103,
"acc_norm": 0.39263803680981596,
"acc_norm_stderr": 0.0383674090783103
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.49514563106796117,
"acc_stderr": 0.04950504382128921,
"acc_norm": 0.49514563106796117,
"acc_norm_stderr": 0.04950504382128921
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03255326307272485,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03255326307272485
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4878671775223499,
"acc_stderr": 0.01787469866749134,
"acc_norm": 0.4878671775223499,
"acc_norm_stderr": 0.01787469866749134
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.026362437574546545,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.026362437574546545
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.028275490156791438,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.028275490156791438
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3858520900321543,
"acc_stderr": 0.02764814959975146,
"acc_norm": 0.3858520900321543,
"acc_norm_stderr": 0.02764814959975146
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3765432098765432,
"acc_stderr": 0.026959344518747784,
"acc_norm": 0.3765432098765432,
"acc_norm_stderr": 0.026959344518747784
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3120567375886525,
"acc_stderr": 0.027640120545169927,
"acc_norm": 0.3120567375886525,
"acc_norm_stderr": 0.027640120545169927
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.32659713168187743,
"acc_stderr": 0.011977676704715992,
"acc_norm": 0.32659713168187743,
"acc_norm_stderr": 0.011977676704715992
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.029289413409403192,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.029289413409403192
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.35130718954248363,
"acc_stderr": 0.01931267606578655,
"acc_norm": 0.35130718954248363,
"acc_norm_stderr": 0.01931267606578655
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.04782001791380063,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.04782001791380063
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.43673469387755104,
"acc_stderr": 0.031751952375833226,
"acc_norm": 0.43673469387755104,
"acc_norm_stderr": 0.031751952375833226
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5472636815920398,
"acc_stderr": 0.03519702717576915,
"acc_norm": 0.5472636815920398,
"acc_norm_stderr": 0.03519702717576915
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4678362573099415,
"acc_stderr": 0.038268824176603704,
"acc_norm": 0.4678362573099415,
"acc_norm_stderr": 0.038268824176603704
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766372,
"mc2": 0.42896944225507627,
"mc2_stderr": 0.0153908253979801
},
"harness|winogrande|5": {
"acc": 0.6795580110497238,
"acc_stderr": 0.013115085457681712
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
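These aggregated numbers can also be recomputed locally. Below is a minimal sketch, assuming the usual leaderboard layout in which the per-task scores shown above sit under a top-level `results` key of the linked JSON file; `hf_hub_download` is the standard `huggingface_hub` helper for fetching a single file from a repository:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the aggregated results file linked above; repo_type="dataset"
# is required because the file lives in a dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_TheDrummer__Moistral-11B-v2",
    filename="results_2024-03-29T20-52-23.439068.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Assumption: the per-task scores shown above live under a top-level
# "results" key, as in the usual leaderboard result files.
scores = data["results"]
mmlu_accs = [v["acc"] for k, v in scores.items() if "hendrycksTest" in k]
print(f"{len(mmlu_accs)} MMLU subtasks, mean acc: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```
The unweighted mean over the `hendrycksTest` subtasks is how the leaderboard's MMLU column is usually derived, so this is a quick way to sanity-check the reported aggregate.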
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_JunchengXie__Starling-LM-7B-alpha-gpt-4-80k | open-llm-leaderboard-old | "2024-03-29T21:12:50Z" | 30 | 0 | [
"region:us"
] | null | "2024-03-29T21:12:27Z" | ---
pretty_name: Evaluation run of JunchengXie/Starling-LM-7B-alpha-gpt-4-80k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JunchengXie/Starling-LM-7B-alpha-gpt-4-80k](https://huggingface.co/JunchengXie/Starling-LM-7B-alpha-gpt-4-80k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JunchengXie__Starling-LM-7B-alpha-gpt-4-80k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T21:10:03.480531](https://huggingface.co/datasets/open-llm-leaderboard/details_JunchengXie__Starling-LM-7B-alpha-gpt-4-80k/blob/main/results_2024-03-29T21-10-03.480531.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6427400931197095,\n\
\ \"acc_stderr\": 0.03222503970792139,\n \"acc_norm\": 0.6448752594792302,\n\
\ \"acc_norm_stderr\": 0.03287014856411763,\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5435341327101476,\n\
\ \"mc2_stderr\": 0.015323454299145556\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.014356399418009126,\n\
\ \"acc_norm\": 0.6296928327645052,\n \"acc_norm_stderr\": 0.01411129875167495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6110336586337383,\n\
\ \"acc_stderr\": 0.004865193237024046,\n \"acc_norm\": 0.8127862975502887,\n\
\ \"acc_norm_stderr\": 0.0038928576150164744\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138215,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138215\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n\
\ \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n\
\ \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723285,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723285\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n\
\ \"acc_stderr\": 0.03036037971029195,\n \"acc_norm\": 0.7130044843049327,\n\
\ \"acc_norm_stderr\": 0.03036037971029195\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"\
acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3675977653631285,\n\
\ \"acc_stderr\": 0.01612554382355294,\n \"acc_norm\": 0.3675977653631285,\n\
\ \"acc_norm_stderr\": 0.01612554382355294\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958143,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958143\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829714,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829714\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4556714471968709,\n\
\ \"acc_stderr\": 0.012719949543032207,\n \"acc_norm\": 0.4556714471968709,\n\
\ \"acc_norm_stderr\": 0.012719949543032207\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5435341327101476,\n\
\ \"mc2_stderr\": 0.015323454299145556\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7671665351223362,\n \"acc_stderr\": 0.011878201073856546\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6307808946171342,\n \
\ \"acc_stderr\": 0.013293019538066244\n }\n}\n```"
repo_url: https://huggingface.co/JunchengXie/Starling-LM-7B-alpha-gpt-4-80k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: [email protected]
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-10-03.480531.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-10-03.480531.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- '**/details_harness|winogrande|5_2024-03-29T21-10-03.480531.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T21-10-03.480531.parquet'
- config_name: results
data_files:
- split: 2024_03_29T21_10_03.480531
path:
- results_2024-03-29T21-10-03.480531.parquet
- split: latest
path:
- results_2024-03-29T21-10-03.480531.parquet
---
# Dataset Card for Evaluation run of JunchengXie/Starling-LM-7B-alpha-gpt-4-80k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [JunchengXie/Starling-LM-7B-alpha-gpt-4-80k](https://huggingface.co/JunchengXie/Starling-LM-7B-alpha-gpt-4-80k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JunchengXie__Starling-LM-7B-alpha-gpt-4-80k",
"harness_winogrande_5",
	split="latest")
```
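The aggregated metrics live in the "results" configuration, and a single run can be pinned by its timestamped split name (both are listed in this card's YAML); a minimal sketch along the same lines:
```python
from datasets import load_dataset

# Aggregated metrics for the run (the same numbers shown in "Latest results" below):
results = load_dataset("open-llm-leaderboard/details_JunchengXie__Starling-LM-7B-alpha-gpt-4-80k",
	"results",
	split="latest")

# A specific run can also be addressed by its timestamped split name:
run = load_dataset("open-llm-leaderboard/details_JunchengXie__Starling-LM-7B-alpha-gpt-4-80k",
	"harness_winogrande_5",
	split="2024_03_29T21_10_03.480531")
```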
## Latest results
These are the [latest results from run 2024-03-29T21:10:03.480531](https://huggingface.co/datasets/open-llm-leaderboard/details_JunchengXie__Starling-LM-7B-alpha-gpt-4-80k/blob/main/results_2024-03-29T21-10-03.480531.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6427400931197095,
"acc_stderr": 0.03222503970792139,
"acc_norm": 0.6448752594792302,
"acc_norm_stderr": 0.03287014856411763,
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5435341327101476,
"mc2_stderr": 0.015323454299145556
},
"harness|arc:challenge|25": {
"acc": 0.5930034129692833,
"acc_stderr": 0.014356399418009126,
"acc_norm": 0.6296928327645052,
"acc_norm_stderr": 0.01411129875167495
},
"harness|hellaswag|10": {
"acc": 0.6110336586337383,
"acc_stderr": 0.004865193237024046,
"acc_norm": 0.8127862975502887,
"acc_norm_stderr": 0.0038928576150164744
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138215,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138215
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723285,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723285
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.03036037971029195,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.03036037971029195
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3675977653631285,
"acc_stderr": 0.01612554382355294,
"acc_norm": 0.3675977653631285,
"acc_norm_stderr": 0.01612554382355294
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958143,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958143
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829714,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829714
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.012719949543032207,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.012719949543032207
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5435341327101476,
"mc2_stderr": 0.015323454299145556
},
"harness|winogrande|5": {
"acc": 0.7671665351223362,
"acc_stderr": 0.011878201073856546
},
"harness|gsm8k|5": {
"acc": 0.6307808946171342,
"acc_stderr": 0.013293019538066244
}
}
```
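For post-processing, the per-task metrics above can be aggregated directly from the downloaded JSON; a minimal sketch (the fallback for a flat layout is an assumption, since some file versions nest these metrics under extra run metadata):
```python
import json

with open("results_2024-03-29T21-10-03.480531.json") as f:
    raw = json.load(f)

# Some result files nest the metrics under a top-level "results" key next to
# run metadata; fall back to the flat layout shown above otherwise.
metrics = raw.get("results", raw)

# Mean normalized accuracy over the 57 MMLU (hendrycksTest) subtasks.
mmlu = [v["acc_norm"] for k, v in metrics.items() if k.startswith("harness|hendrycksTest")]
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {sum(mmlu) / len(mmlu):.4f}")
```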
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_FredrikBL__test-dare | open-llm-leaderboard-old | "2024-03-29T21:21:50Z" | 30 | 0 | [
"region:us"
] | null | "2024-03-29T21:21:28Z" | ---
pretty_name: Evaluation run of FredrikBL/test-dare
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FredrikBL/test-dare](https://huggingface.co/FredrikBL/test-dare) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FredrikBL__test-dare\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T21:19:13.154293](https://huggingface.co/datasets/open-llm-leaderboard/details_FredrikBL__test-dare/blob/main/results_2024-03-29T21-19-13.154293.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\"\
\ split for each eval):\n\n```python\n{\n    \"all\": {\n        \"acc\": 0.6460331293814567,\n\
\ \"acc_stderr\": 0.03224922642756375,\n \"acc_norm\": 0.6478142381132488,\n\
\ \"acc_norm_stderr\": 0.032903872728238165,\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.01680186046667715,\n \"mc2\": 0.5268971269435854,\n\
\ \"mc2_stderr\": 0.015057816486907058\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6049488054607508,\n \"acc_stderr\": 0.014285898292938169,\n\
\ \"acc_norm\": 0.6459044368600683,\n \"acc_norm_stderr\": 0.01397545412275656\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6526588329018124,\n\
\ \"acc_stderr\": 0.004751522127418455,\n \"acc_norm\": 0.8487353116908982,\n\
\ \"acc_norm_stderr\": 0.003575744098779938\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119668,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119668\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033463,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033463\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010347,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010347\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229092,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229092\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459754,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459754\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.013964393769899126,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.013964393769899126\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3486033519553073,\n\
\ \"acc_stderr\": 0.015937484656687033,\n \"acc_norm\": 0.3486033519553073,\n\
\ \"acc_norm_stderr\": 0.015937484656687033\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667885,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667885\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n\
\ \"acc_stderr\": 0.012734923579532067,\n \"acc_norm\": 0.46284224250325945,\n\
\ \"acc_norm_stderr\": 0.012734923579532067\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.01680186046667715,\n \"mc2\": 0.5268971269435854,\n\
\ \"mc2_stderr\": 0.015057816486907058\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8129439621152328,\n \"acc_stderr\": 0.010959716435242912\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6110689916603488,\n \
\ \"acc_stderr\": 0.01342838248127422\n }\n}\n```"
repo_url: https://huggingface.co/FredrikBL/test-dare
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: [email protected]
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-19-13.154293.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T21-19-13.154293.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- '**/details_harness|winogrande|5_2024-03-29T21-19-13.154293.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T21-19-13.154293.parquet'
- config_name: results
data_files:
- split: 2024_03_29T21_19_13.154293
path:
- results_2024-03-29T21-19-13.154293.parquet
- split: latest
path:
- results_2024-03-29T21-19-13.154293.parquet
---
# Dataset Card for Evaluation run of FredrikBL/test-dare
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FredrikBL/test-dare](https://huggingface.co/FredrikBL/test-dare) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FredrikBL__test-dare",
"harness_winogrande_5",
split="train")
```
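The aggregated metrics live in the "results" configuration listed above; as a minimal sketch (assuming the same split naming as the task configurations), they can be loaded the same way:
```python
from datasets import load_dataset

# aggregated run-level metrics, as displayed on the leaderboard
results = load_dataset("open-llm-leaderboard/details_FredrikBL__test-dare",
	"results",
	split="latest")
```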
## Latest results
These are the [latest results from run 2024-03-29T21:19:13.154293](https://huggingface.co/datasets/open-llm-leaderboard/details_FredrikBL__test-dare/blob/main/results_2024-03-29T21-19-13.154293.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6460331293814567,
"acc_stderr": 0.03224922642756375,
"acc_norm": 0.6478142381132488,
"acc_norm_stderr": 0.032903872728238165,
"mc1": 0.3598531211750306,
"mc1_stderr": 0.01680186046667715,
"mc2": 0.5268971269435854,
"mc2_stderr": 0.015057816486907058
},
"harness|arc:challenge|25": {
"acc": 0.6049488054607508,
"acc_stderr": 0.014285898292938169,
"acc_norm": 0.6459044368600683,
"acc_norm_stderr": 0.01397545412275656
},
"harness|hellaswag|10": {
"acc": 0.6526588329018124,
"acc_stderr": 0.004751522127418455,
"acc_norm": 0.8487353116908982,
"acc_norm_stderr": 0.003575744098779938
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119668,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119668
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778405,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778405
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033463,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033463
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010347,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010347
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.02574490253229092,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.02574490253229092
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459754,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459754
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899126,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899126
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3486033519553073,
"acc_stderr": 0.015937484656687033,
"acc_norm": 0.3486033519553073,
"acc_norm_stderr": 0.015937484656687033
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667885,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667885
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532067,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532067
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.01897542792050721,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.01897542792050721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3598531211750306,
"mc1_stderr": 0.01680186046667715,
"mc2": 0.5268971269435854,
"mc2_stderr": 0.015057816486907058
},
"harness|winogrande|5": {
"acc": 0.8129439621152328,
"acc_stderr": 0.010959716435242912
},
"harness|gsm8k|5": {
"acc": 0.6110689916603488,
"acc_stderr": 0.01342838248127422
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Jaoze/gohanvocals | Jaoze | "2024-03-29T21:55:34Z" | 30 | 0 | [
"license:openrail",
"region:us"
] | null | "2024-03-29T21:54:45Z" | ---
license: openrail
---
|
Rimyy/Math-llama2-200k | Rimyy | "2024-03-29T21:57:50Z" | 30 | 1 | [
"size_categories:100K<n<1M",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-29T21:57:39Z" | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 225322861
num_examples: 200035
download_size: 84227576
dataset_size: 225322861
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SeungmoKu/llama_ksm_kor | SeungmoKu | "2024-03-29T22:32:45Z" | 30 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-29T22:32:25Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7786
num_examples: 32
download_size: 4172
dataset_size: 7786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AlanYky/subjective-no-instruction-with-symbol | AlanYky | "2024-03-30T03:53:54Z" | 30 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-30T03:53:53Z" | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 735404
num_examples: 500
download_size: 333332
dataset_size: 735404
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/ivy_saijakutamerwagomihiroinotabiwohajimemashita | CyberHarem | "2024-03-30T05:58:57Z" | 30 | 0 | [
"task_categories:text-to-image",
"license:mit",
"size_categories:1K<n<10K",
"library:datasets",
"library:mlcroissant",
"region:us",
"art",
"not-for-all-audiences"
] | [
"text-to-image"
] | "2024-03-30T05:02:17Z" | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Ivy/アイビー (Saijaku Tamer wa Gomi Hiroi no Tabi wo Hajimemashita)
This is the dataset of Ivy/アイビー (Saijaku Tamer wa Gomi Hiroi no Tabi wo Hajimemashita), containing 482 images and their tags.
The core tags of this character are `green_hair, short_hair, brown_eyes, ponytail, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 482 | 377.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ivy_saijakutamerwagomihiroinotabiwohajimemashita/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 482 | 376.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ivy_saijakutamerwagomihiroinotabiwohajimemashita/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 971 | 665.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ivy_saijakutamerwagomihiroinotabiwohajimemashita/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ivy_saijakutamerwagomihiroinotabiwohajimemashita',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
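The other packages in the table are plain IMG+TXT zip archives, so they can be fetched without waifuc; this is a minimal sketch under the same assumptions (repository id and filenames taken from the packages table above):
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the lighter 1200px IMG+TXT package instead of the raw archive
zip_file = hf_hub_download(
    repo_id='CyberHarem/ivy_saijakutamerwagomihiroinotabiwohajimemashita',
    repo_type='dataset',
    filename='dataset-1200.zip',
)

# extract the image/tag-text pairs to a local directory
dataset_dir = 'dataset_1200'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```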
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, closed_mouth, outdoors, hood, solo, tree, forest, smile, anime_coloring, day, looking_at_viewer, blurry_background, blush, portrait |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, blush, closed_mouth, hood, portrait, solo, looking_at_viewer, anime_coloring, frown, yellow_eyes |
| 2 | 13 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, closed_mouth, solo, hood, smile, upper_body, blush, looking_at_viewer |
| 3 | 27 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, long_sleeves, outdoors, solo, white_cloak, yellow_eyes, shoulder_bag, hood_up, open_mouth, tree, forest, looking_at_viewer, pantyhose, day |
| 4 | 8 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, blue_sky, day, outdoors, solo, hood_up, open_mouth, upper_body, cloud, looking_at_viewer, white_cloak, animal_hood, long_sleeves |
| 5 | 7 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | closed_mouth, forest, outdoors, slime_(creature), 1girl, tree, yellow_eyes, smile, 1other, upper_body, white_cape, white_cloak |
| 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, solo_focus, 1boy, closed_mouth, hood, from_side, profile, yellow_eyes, short_ponytail, smile |
| 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, open_mouth, portrait, solo, hood, looking_at_viewer, upper_teeth_only, close-up |
| 8 | 11 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, long_hair, solo, child, open_mouth, puffy_short_sleeves, aged_down, collared_dress, pink_dress, upper_body, :d, blush, outdoors |
| 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 1girl, bag, long_sleeves, open_mouth, peach, 1boy, apple, grapes, hood_down, smile, solo, blue_hair, brown_gloves, holding_fruit |
| 10 | 8 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | 1girl, blue_sky, day, long_hair, outdoors, solo, cloud, blurry_background, flower, looking_at_viewer, torn_clothes, floating_hair, petals, puffy_short_sleeves, depth_of_field, open_mouth, wind, aged_down, bag, child, collared_dress, grass |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_mouth | outdoors | hood | solo | tree | forest | smile | anime_coloring | day | looking_at_viewer | blurry_background | blush | portrait | frown | yellow_eyes | upper_body | long_sleeves | white_cloak | shoulder_bag | hood_up | open_mouth | pantyhose | blue_sky | cloud | animal_hood | slime_(creature) | 1other | white_cape | solo_focus | 1boy | from_side | profile | short_ponytail | upper_teeth_only | close-up | long_hair | child | puffy_short_sleeves | aged_down | collared_dress | pink_dress | :d | bag | peach | apple | grapes | hood_down | blue_hair | brown_gloves | holding_fruit | flower | torn_clothes | floating_hair | petals | depth_of_field | wind | grass |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------------|:-----------|:-------|:-------|:-------|:---------|:--------|:-----------------|:------|:--------------------|:--------------------|:--------|:-----------|:--------|:--------------|:-------------|:---------------|:--------------|:---------------|:----------|:-------------|:------------|:-----------|:--------|:--------------|:-------------------|:---------|:-------------|:-------------|:-------|:------------|:----------|:-----------------|:-------------------|:-----------|:------------|:--------|:----------------------|:------------|:-----------------|:-------------|:-----|:------|:--------|:--------|:---------|:------------|:------------|:---------------|:----------------|:---------|:---------------|:----------------|:---------|:-----------------|:-------|:--------|
| 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | X | | | | X | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 13 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | X | X | | | X | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 27 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | X | | X | X | X | | | X | X | | | | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | | X | | X | | | | | X | X | | | | | | X | X | X | | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 7 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | X | | | X | X | X | | | | | | | | X | X | | X | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | | X | | | | X | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | | | X | X | | | | | | X | | | X | | | | | | | | X | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 11 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | X | | X | | | | | | | | X | | | | X | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | | | | X | | | X | | | | | | | | | | X | | | | X | | | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | |
| 10 | 8 | ![](samples/10/clu10-sample0.png) | ![](samples/10/clu10-sample1.png) | ![](samples/10/clu10-sample2.png) | ![](samples/10/clu10-sample3.png) | ![](samples/10/clu10-sample4.png) | X | | X | | X | | | | | X | X | X | | | | | | | | | | X | | X | X | | | | | | | | | | | | X | X | X | X | X | | | X | | | | | | | | X | X | X | X | X | X | X |
|
snowfly/processed_demo | snowfly | "2024-03-30T06:32:00Z" | 30 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-30T06:31:48Z" | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 50
num_examples: 2
- name: validation
num_bytes: 50
num_examples: 2
- name: test
num_bytes: 50
num_examples: 2
download_size: 3987
dataset_size: 150
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
ttaront/filtered_slimpa | ttaront | "2024-04-05T12:29:43Z" | 30 | 0 | [
"task_categories:text-generation",
"language:en",
"size_categories:100K<n<1M",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | [
"text-generation"
] | "2024-03-30T10:41:45Z" | ---
task_categories:
- text-generation
language:
- en
pretty_name: filtered_slimpa
--- |
jdsannchao/non_existent | jdsannchao | "2024-03-30T12:08:59Z" | 30 | 0 | [
"size_categories:1M<n<10M",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-30T12:08:35Z" | ---
dataset_info:
- config_name: attr_qa
features:
- name: img_id
dtype: int64
- name: orig_qa
dtype: string
- name: question_text
dtype: string
- name: answer_text
dtype: string
splits:
- name: train
num_bytes: 43062088
num_examples: 704759
download_size: 12017273
dataset_size: 43062088
- config_name: exist_qa
features:
- name: img_id
dtype: int64
- name: orig_qa
dtype: string
- name: question_text
dtype: string
- name: answer_text
dtype: string
splits:
- name: train
num_bytes: 50290552
num_examples: 733586
download_size: 13928584
dataset_size: 50290552
- config_name: relation_qa
features:
- name: img_id
dtype: int64
- name: orig_qa
dtype: string
- name: question_text
dtype: string
- name: answer_text
dtype: string
splits:
- name: train
num_bytes: 48465571
num_examples: 712248
download_size: 14150304
dataset_size: 48465571
configs:
- config_name: attr_qa
data_files:
- split: train
path: attr_qa/train-*
- config_name: exist_qa
data_files:
- split: train
path: exist_qa/train-*
- config_name: relation_qa
data_files:
- split: train
path: relation_qa/train-*
---
|
eswanYS/customhkcode2 | eswanYS | "2024-03-30T14:40:11Z" | 30 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-30T14:38:53Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5826
num_examples: 39
download_size: 2572
dataset_size: 5826
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard-old/details_invalid-coder__TinyLlama-1.1B-intermediate-step-1431k-3T-laser-dpo | open-llm-leaderboard-old | "2024-03-30T15:15:50Z" | 30 | 0 | [
"region:us"
] | null | "2024-03-30T15:15:29Z" | ---
pretty_name: Evaluation run of invalid-coder/TinyLlama-1.1B-intermediate-step-1431k-3T-laser-dpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [invalid-coder/TinyLlama-1.1B-intermediate-step-1431k-3T-laser-dpo](https://huggingface.co/invalid-coder/TinyLlama-1.1B-intermediate-step-1431k-3T-laser-dpo)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_invalid-coder__TinyLlama-1.1B-intermediate-step-1431k-3T-laser-dpo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-30T15:13:38.324226](https://huggingface.co/datasets/open-llm-leaderboard/details_invalid-coder__TinyLlama-1.1B-intermediate-step-1431k-3T-laser-dpo/blob/main/results_2024-03-30T15-13-38.324226.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2735286860324725,\n\
\ \"acc_stderr\": 0.03143769848552324,\n \"acc_norm\": 0.2754881804383971,\n\
\ \"acc_norm_stderr\": 0.03222142068698204,\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.38078325221080583,\n\
\ \"mc2_stderr\": 0.01386769746146585\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.30887372013651876,\n \"acc_stderr\": 0.013501770929344003,\n\
\ \"acc_norm\": 0.3302047781569966,\n \"acc_norm_stderr\": 0.013743085603760427\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4447321250746863,\n\
\ \"acc_stderr\": 0.004959204773046204,\n \"acc_norm\": 0.5999800836486756,\n\
\ \"acc_norm_stderr\": 0.004889007921214687\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2074074074074074,\n\
\ \"acc_stderr\": 0.03502553170678318,\n \"acc_norm\": 0.2074074074074074,\n\
\ \"acc_norm_stderr\": 0.03502553170678318\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.14473684210526316,\n \"acc_stderr\": 0.028631951845930394,\n\
\ \"acc_norm\": 0.14473684210526316,\n \"acc_norm_stderr\": 0.028631951845930394\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27547169811320754,\n \"acc_stderr\": 0.027495663683724064,\n\
\ \"acc_norm\": 0.27547169811320754,\n \"acc_norm_stderr\": 0.027495663683724064\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.31213872832369943,\n\
\ \"acc_stderr\": 0.03533133389323657,\n \"acc_norm\": 0.31213872832369943,\n\
\ \"acc_norm_stderr\": 0.03533133389323657\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617749,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617749\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28936170212765955,\n \"acc_stderr\": 0.02964400657700962,\n\
\ \"acc_norm\": 0.28936170212765955,\n \"acc_norm_stderr\": 0.02964400657700962\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.03835153954399419,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.03835153954399419\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.291005291005291,\n \"acc_stderr\": 0.02339382650048488,\n \"acc_norm\"\
: 0.291005291005291,\n \"acc_norm_stderr\": 0.02339382650048488\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n\
\ \"acc_stderr\": 0.02468597928623997,\n \"acc_norm\": 0.25161290322580643,\n\
\ \"acc_norm_stderr\": 0.02468597928623997\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2561576354679803,\n \"acc_stderr\": 0.0307127300709826,\n\
\ \"acc_norm\": 0.2561576354679803,\n \"acc_norm_stderr\": 0.0307127300709826\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.32727272727272727,\n \"acc_stderr\": 0.03663974994391243,\n\
\ \"acc_norm\": 0.32727272727272727,\n \"acc_norm_stderr\": 0.03663974994391243\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2474747474747475,\n \"acc_stderr\": 0.0307463007421245,\n \"acc_norm\"\
: 0.2474747474747475,\n \"acc_norm_stderr\": 0.0307463007421245\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.24352331606217617,\n \"acc_stderr\": 0.03097543638684543,\n\
\ \"acc_norm\": 0.24352331606217617,\n \"acc_norm_stderr\": 0.03097543638684543\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.30256410256410254,\n \"acc_stderr\": 0.02329088805377274,\n\
\ \"acc_norm\": 0.30256410256410254,\n \"acc_norm_stderr\": 0.02329088805377274\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844086,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844086\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3277310924369748,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.3277310924369748,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24220183486238533,\n \"acc_stderr\": 0.018368176306598618,\n \"\
acc_norm\": 0.24220183486238533,\n \"acc_norm_stderr\": 0.018368176306598618\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.03096451792692341,\n \"\
acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.03096451792692341\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.22784810126582278,\n \"acc_stderr\": 0.02730348459906943,\n \
\ \"acc_norm\": 0.22784810126582278,\n \"acc_norm_stderr\": 0.02730348459906943\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.336322869955157,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.336322869955157,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728745,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728745\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.04236511258094634,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.04236511258094634\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2822085889570552,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.2822085889570552,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n\
\ \"acc_stderr\": 0.04007341809755805,\n \"acc_norm\": 0.23214285714285715,\n\
\ \"acc_norm_stderr\": 0.04007341809755805\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2912621359223301,\n \"acc_stderr\": 0.04498676320572921,\n\
\ \"acc_norm\": 0.2912621359223301,\n \"acc_norm_stderr\": 0.04498676320572921\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.028605953702004253,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.028605953702004253\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2388250319284802,\n\
\ \"acc_stderr\": 0.015246803197398691,\n \"acc_norm\": 0.2388250319284802,\n\
\ \"acc_norm_stderr\": 0.015246803197398691\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.22254335260115607,\n \"acc_stderr\": 0.02239421566194282,\n\
\ \"acc_norm\": 0.22254335260115607,\n \"acc_norm_stderr\": 0.02239421566194282\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767857,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767857\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.27124183006535946,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2829581993569132,\n\
\ \"acc_stderr\": 0.025583062489984824,\n \"acc_norm\": 0.2829581993569132,\n\
\ \"acc_norm_stderr\": 0.025583062489984824\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290392,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290392\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.22359843546284225,\n\
\ \"acc_stderr\": 0.010641589542841378,\n \"acc_norm\": 0.22359843546284225,\n\
\ \"acc_norm_stderr\": 0.010641589542841378\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.36764705882352944,\n \"acc_stderr\": 0.0292894134094032,\n\
\ \"acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.0292894134094032\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2238562091503268,\n \"acc_stderr\": 0.016863008585416613,\n \
\ \"acc_norm\": 0.2238562091503268,\n \"acc_norm_stderr\": 0.016863008585416613\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.1673469387755102,\n \"acc_stderr\": 0.023897144768914524,\n\
\ \"acc_norm\": 0.1673469387755102,\n \"acc_norm_stderr\": 0.023897144768914524\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.38078325221080583,\n\
\ \"mc2_stderr\": 0.01386769746146585\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.595895816890292,\n \"acc_stderr\": 0.013791610664670852\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.012130401819560273,\n \
\ \"acc_stderr\": 0.003015294242890954\n }\n}\n```"
repo_url: https://huggingface.co/invalid-coder/TinyLlama-1.1B-intermediate-step-1431k-3T-laser-dpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: [email protected]
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|arc:challenge|25_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|gsm8k|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hellaswag|10_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-13-38.324226.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T15-13-38.324226.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- '**/details_harness|winogrande|5_2024-03-30T15-13-38.324226.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-30T15-13-38.324226.parquet'
- config_name: results
data_files:
- split: 2024_03_30T15_13_38.324226
path:
- results_2024-03-30T15-13-38.324226.parquet
- split: latest
path:
- results_2024-03-30T15-13-38.324226.parquet
---
# Dataset Card for Evaluation run of invalid-coder/TinyLlama-1.1B-intermediate-step-1431k-3T-laser-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [invalid-coder/TinyLlama-1.1B-intermediate-step-1431k-3T-laser-dpo](https://huggingface.co/invalid-coder/TinyLlama-1.1B-intermediate-step-1431k-3T-laser-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_invalid-coder__TinyLlama-1.1B-intermediate-step-1431k-3T-laser-dpo",
"harness_winogrande_5",
split="train")
```
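The aggregated metrics can be loaded the same way through the `results` configuration, and a run can be pinned to its timestamp instead of the moving `latest` split. A minimal sketch, using the configuration and split names declared in this card's YAML header:
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_invalid-coder__TinyLlama-1.1B-intermediate-step-1431k-3T-laser-dpo"

# Aggregated metrics for the whole run (the "results" configuration).
results = load_dataset(repo, "results", split="latest")

# Per-sample details for one task, pinned to a specific evaluation timestamp.
winogrande = load_dataset(repo, "harness_winogrande_5",
                          split="2024_03_30T15_13_38.324226")
```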
## Latest results
These are the [latest results from run 2024-03-30T15:13:38.324226](https://huggingface.co/datasets/open-llm-leaderboard/details_invalid-coder__TinyLlama-1.1B-intermediate-step-1431k-3T-laser-dpo/blob/main/results_2024-03-30T15-13-38.324226.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2735286860324725,
"acc_stderr": 0.03143769848552324,
"acc_norm": 0.2754881804383971,
"acc_norm_stderr": 0.03222142068698204,
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.38078325221080583,
"mc2_stderr": 0.01386769746146585
},
"harness|arc:challenge|25": {
"acc": 0.30887372013651876,
"acc_stderr": 0.013501770929344003,
"acc_norm": 0.3302047781569966,
"acc_norm_stderr": 0.013743085603760427
},
"harness|hellaswag|10": {
"acc": 0.4447321250746863,
"acc_stderr": 0.004959204773046204,
"acc_norm": 0.5999800836486756,
"acc_norm_stderr": 0.004889007921214687
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.03502553170678318,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.03502553170678318
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.14473684210526316,
"acc_stderr": 0.028631951845930394,
"acc_norm": 0.14473684210526316,
"acc_norm_stderr": 0.028631951845930394
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27547169811320754,
"acc_stderr": 0.027495663683724064,
"acc_norm": 0.27547169811320754,
"acc_norm_stderr": 0.027495663683724064
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.31213872832369943,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.31213872832369943,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617749,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617749
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28936170212765955,
"acc_stderr": 0.02964400657700962,
"acc_norm": 0.28936170212765955,
"acc_norm_stderr": 0.02964400657700962
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03835153954399419,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03835153954399419
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.291005291005291,
"acc_stderr": 0.02339382650048488,
"acc_norm": 0.291005291005291,
"acc_norm_stderr": 0.02339382650048488
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.02468597928623997,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.02468597928623997
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2561576354679803,
"acc_stderr": 0.0307127300709826,
"acc_norm": 0.2561576354679803,
"acc_norm_stderr": 0.0307127300709826
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.32727272727272727,
"acc_stderr": 0.03663974994391243,
"acc_norm": 0.32727272727272727,
"acc_norm_stderr": 0.03663974994391243
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2474747474747475,
"acc_stderr": 0.0307463007421245,
"acc_norm": 0.2474747474747475,
"acc_norm_stderr": 0.0307463007421245
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24352331606217617,
"acc_stderr": 0.03097543638684543,
"acc_norm": 0.24352331606217617,
"acc_norm_stderr": 0.03097543638684543
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.30256410256410254,
"acc_stderr": 0.02329088805377274,
"acc_norm": 0.30256410256410254,
"acc_norm_stderr": 0.02329088805377274
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844086,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844086
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3277310924369748,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.3277310924369748,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24220183486238533,
"acc_stderr": 0.018368176306598618,
"acc_norm": 0.24220183486238533,
"acc_norm_stderr": 0.018368176306598618
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.03096451792692341,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.03096451792692341
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.22784810126582278,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.22784810126582278,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.336322869955157,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.336322869955157,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.04236511258094634,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.04236511258094634
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2822085889570552,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.2822085889570552,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755805,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755805
},
"harness|hendrycksTest-management|5": {
"acc": 0.2912621359223301,
"acc_stderr": 0.04498676320572921,
"acc_norm": 0.2912621359223301,
"acc_norm_stderr": 0.04498676320572921
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.028605953702004253,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.028605953702004253
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2388250319284802,
"acc_stderr": 0.015246803197398691,
"acc_norm": 0.2388250319284802,
"acc_norm_stderr": 0.015246803197398691
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22254335260115607,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.22254335260115607,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767857,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767857
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27124183006535946,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.27124183006535946,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2829581993569132,
"acc_stderr": 0.025583062489984824,
"acc_norm": 0.2829581993569132,
"acc_norm_stderr": 0.025583062489984824
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290392,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290392
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.22359843546284225,
"acc_stderr": 0.010641589542841378,
"acc_norm": 0.22359843546284225,
"acc_norm_stderr": 0.010641589542841378
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.0292894134094032,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.0292894134094032
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2238562091503268,
"acc_stderr": 0.016863008585416613,
"acc_norm": 0.2238562091503268,
"acc_norm_stderr": 0.016863008585416613
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1673469387755102,
"acc_stderr": 0.023897144768914524,
"acc_norm": 0.1673469387755102,
"acc_norm_stderr": 0.023897144768914524
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.38078325221080583,
"mc2_stderr": 0.01386769746146585
},
"harness|winogrande|5": {
"acc": 0.595895816890292,
"acc_stderr": 0.013791610664670852
},
"harness|gsm8k|5": {
"acc": 0.012130401819560273,
"acc_stderr": 0.003015294242890954
}
}
```
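Since the per-task metrics above share one flat layout, a few lines of Python are enough to post-process them. The sketch below assumes the dict shown above has been saved locally as `results_2024-03-30T15-13-38.324226.json` (the on-disk file may nest these entries under extra keys) and averages the `hendrycksTest-*` accuracies into an MMLU-style aggregate:
```python
import json

# Load the results dict shown above (assumed saved locally under this name).
with open("results_2024-03-30T15-13-38.324226.json") as f:
    results = json.load(f)

# Collect the per-subject MMLU ("hendrycksTest") accuracies.
mmlu_acc = {
    task: scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}

print(f"{len(mmlu_acc)} MMLU subjects")
print(f"mean acc: {sum(mmlu_acc.values()) / len(mmlu_acc):.4f}")
```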
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_invalid-coder__dolphin-2.1-mistral-7b-snr-laser | open-llm-leaderboard-old | "2024-03-30T15:25:43Z" | 30 | 0 | [
"region:us"
] | null | "2024-03-30T15:25:22Z" | ---
pretty_name: Evaluation run of invalid-coder/dolphin-2.1-mistral-7b-snr-laser
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [invalid-coder/dolphin-2.1-mistral-7b-snr-laser](https://huggingface.co/invalid-coder/dolphin-2.1-mistral-7b-snr-laser)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_invalid-coder__dolphin-2.1-mistral-7b-snr-laser\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-30T15:22:56.924782](https://huggingface.co/datasets/open-llm-leaderboard/details_invalid-coder__dolphin-2.1-mistral-7b-snr-laser/blob/main/results_2024-03-30T15-22-56.924782.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6358975052319833,\n\
\ \"acc_stderr\": 0.03234076317798482,\n \"acc_norm\": 0.6399480411472922,\n\
\ \"acc_norm_stderr\": 0.03298383252796686,\n \"mc1\": 0.37454100367197063,\n\
\ \"mc1_stderr\": 0.016943535128405334,\n \"mc2\": 0.5524167978867858,\n\
\ \"mc2_stderr\": 0.01534370270886878\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6006825938566553,\n \"acc_stderr\": 0.014312094557946709,\n\
\ \"acc_norm\": 0.6382252559726962,\n \"acc_norm_stderr\": 0.014041957945038085\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6629157538338977,\n\
\ \"acc_stderr\": 0.004717478335689631,\n \"acc_norm\": 0.8478390758812986,\n\
\ \"acc_norm_stderr\": 0.0035844274905793678\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.02513809138885111,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.02513809138885111\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"\
acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n \"acc_norm\"\
: 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.024321738484602354,\n\
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.024321738484602354\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8201834862385321,\n \"acc_stderr\": 0.016465345467391528,\n \"\
acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.016465345467391528\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251745,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251745\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676173,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990945,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990945\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.04742762361243011,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.04742762361243011\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294407003,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294407003\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39776536312849164,\n\
\ \"acc_stderr\": 0.01636920497126298,\n \"acc_norm\": 0.39776536312849164,\n\
\ \"acc_norm_stderr\": 0.01636920497126298\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.02573885479781873,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.02573885479781873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765137,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765137\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n\
\ \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n\
\ \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \
\ \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061452,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061452\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37454100367197063,\n\
\ \"mc1_stderr\": 0.016943535128405334,\n \"mc2\": 0.5524167978867858,\n\
\ \"mc2_stderr\": 0.01534370270886878\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.011585871710209411\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4723275208491281,\n \
\ \"acc_stderr\": 0.013751375538801326\n }\n}\n```"
repo_url: https://huggingface.co/invalid-coder/dolphin-2.1-mistral-7b-snr-laser
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: [email protected]
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|arc:challenge|25_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|gsm8k|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hellaswag|10_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-22-56.924782.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T15-22-56.924782.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- '**/details_harness|winogrande|5_2024-03-30T15-22-56.924782.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-30T15-22-56.924782.parquet'
- config_name: results
data_files:
- split: 2024_03_30T15_22_56.924782
path:
- results_2024-03-30T15-22-56.924782.parquet
- split: latest
path:
- results_2024-03-30T15-22-56.924782.parquet
---
# Dataset Card for Evaluation run of invalid-coder/dolphin-2.1-mistral-7b-snr-laser
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [invalid-coder/dolphin-2.1-mistral-7b-snr-laser](https://huggingface.co/invalid-coder/dolphin-2.1-mistral-7b-snr-laser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_invalid-coder__dolphin-2.1-mistral-7b-snr-laser",
"harness_winogrande_5",
split="train")
```
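Along the same lines, here is a minimal sketch (same API, only the config and split names change) for loading the aggregated results stored in the `results` configuration:
```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split always points to the
# most recent evaluation run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_invalid-coder__dolphin-2.1-mistral-7b-snr-laser",
    "results",
    split="latest",
)
```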
## Latest results
These are the [latest results from run 2024-03-30T15:22:56.924782](https://huggingface.co/datasets/open-llm-leaderboard/details_invalid-coder__dolphin-2.1-mistral-7b-snr-laser/blob/main/results_2024-03-30T15-22-56.924782.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6358975052319833,
"acc_stderr": 0.03234076317798482,
"acc_norm": 0.6399480411472922,
"acc_norm_stderr": 0.03298383252796686,
"mc1": 0.37454100367197063,
"mc1_stderr": 0.016943535128405334,
"mc2": 0.5524167978867858,
"mc2_stderr": 0.01534370270886878
},
"harness|arc:challenge|25": {
"acc": 0.6006825938566553,
"acc_stderr": 0.014312094557946709,
"acc_norm": 0.6382252559726962,
"acc_norm_stderr": 0.014041957945038085
},
"harness|hellaswag|10": {
"acc": 0.6629157538338977,
"acc_stderr": 0.004717478335689631,
"acc_norm": 0.8478390758812986,
"acc_norm_stderr": 0.0035844274905793678
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.02513809138885111,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.02513809138885111
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.024321738484602354,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.024321738484602354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.016465345467391528,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.016465345467391528
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251745,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251745
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676173,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990945,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990945
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.04742762361243011,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.04742762361243011
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407003,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407003
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39776536312849164,
"acc_stderr": 0.01636920497126298,
"acc_norm": 0.39776536312849164,
"acc_norm_stderr": 0.01636920497126298
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.02573885479781873,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.02573885479781873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765137,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765137
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223977,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061452,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061452
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37454100367197063,
"mc1_stderr": 0.016943535128405334,
"mc2": 0.5524167978867858,
"mc2_stderr": 0.01534370270886878
},
"harness|winogrande|5": {
"acc": 0.7829518547750592,
"acc_stderr": 0.011585871710209411
},
"harness|gsm8k|5": {
"acc": 0.4723275208491281,
"acc_stderr": 0.013751375538801326
}
}
```
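For a quick look at a single metric from this snapshot, a hedged sketch using only the Python standard library (the `resolve/main` raw-download URL form and the tolerant key lookup are assumptions, not confirmed by this card; the task keys match the JSON above):
```python
import json
import urllib.request

# Fetch the raw results file linked above and print one task's accuracy.
# NOTE: the "resolve/main" URL form and the optional top-level "results"
# wrapper are assumptions about the file layout.
url = (
    "https://huggingface.co/datasets/open-llm-leaderboard/"
    "details_invalid-coder__dolphin-2.1-mistral-7b-snr-laser"
    "/resolve/main/results_2024-03-30T15-22-56.924782.json"
)
with urllib.request.urlopen(url) as response:
    data = json.load(response)

metrics = data.get("results", data)  # tolerate either layout
print(metrics["harness|gsm8k|5"]["acc"])  # 0.4723... per the snapshot above
```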
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_Joseph717171__Genstruct-10.7B | open-llm-leaderboard-old | "2024-03-30T15:53:30Z" | 30 | 0 | [
"region:us"
] | null | "2024-03-30T15:53:06Z" | ---
pretty_name: Evaluation run of Joseph717171/Genstruct-10.7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Joseph717171/Genstruct-10.7B](https://huggingface.co/Joseph717171/Genstruct-10.7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Joseph717171__Genstruct-10.7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-30T15:50:47.030919](https://huggingface.co/datasets/open-llm-leaderboard/details_Joseph717171__Genstruct-10.7B/blob/main/results_2024-03-30T15-50-47.030919.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6058365742339414,\n\
\ \"acc_stderr\": 0.03286052164816604,\n \"acc_norm\": 0.6065695629397805,\n\
\ \"acc_norm_stderr\": 0.033526233034810754,\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.01613222972815504,\n \"mc2\": 0.4666302750761303,\n\
\ \"mc2_stderr\": 0.015225617830989736\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5699658703071673,\n \"acc_stderr\": 0.014467631559137996,\n\
\ \"acc_norm\": 0.6083617747440273,\n \"acc_norm_stderr\": 0.014264122124938213\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6475801633140809,\n\
\ \"acc_stderr\": 0.004767475366689767,\n \"acc_norm\": 0.8281218880701056,\n\
\ \"acc_norm_stderr\": 0.0037650342861534386\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.02510742548113728,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02510742548113728\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7322580645161291,\n\
\ \"acc_stderr\": 0.025189006660212385,\n \"acc_norm\": 0.7322580645161291,\n\
\ \"acc_norm_stderr\": 0.025189006660212385\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270286,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270286\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187208,\n\
\ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187208\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.01626567563201036,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.01626567563201036\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.04284467968052194,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.04284467968052194\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560417,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560417\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\
\ \"acc_stderr\": 0.014836205167333567,\n \"acc_norm\": 0.7790549169859514,\n\
\ \"acc_norm_stderr\": 0.014836205167333567\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165555,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165555\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2782122905027933,\n\
\ \"acc_stderr\": 0.01498732543996355,\n \"acc_norm\": 0.2782122905027933,\n\
\ \"acc_norm_stderr\": 0.01498732543996355\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n\
\ \"acc_stderr\": 0.027264297599804015,\n \"acc_norm\": 0.639871382636656,\n\
\ \"acc_norm_stderr\": 0.027264297599804015\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6635802469135802,\n \"acc_stderr\": 0.02628973494595293,\n\
\ \"acc_norm\": 0.6635802469135802,\n \"acc_norm_stderr\": 0.02628973494595293\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778855,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778855\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4028683181225554,\n\
\ \"acc_stderr\": 0.012526955577118016,\n \"acc_norm\": 0.4028683181225554,\n\
\ \"acc_norm_stderr\": 0.012526955577118016\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02952009569768776,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02952009569768776\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6078431372549019,\n \"acc_stderr\": 0.01975172650876264,\n \
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.01975172650876264\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587952,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587952\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160896,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160896\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.01613222972815504,\n \"mc2\": 0.4666302750761303,\n\
\ \"mc2_stderr\": 0.015225617830989736\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827934\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6194086429112965,\n \
\ \"acc_stderr\": 0.013373971277729815\n }\n}\n```"
repo_url: https://huggingface.co/Joseph717171/Genstruct-10.7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: [email protected]
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|arc:challenge|25_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|gsm8k|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hellaswag|10_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-50-47.030919.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T15-50-47.030919.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- '**/details_harness|winogrande|5_2024-03-30T15-50-47.030919.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-30T15-50-47.030919.parquet'
- config_name: results
data_files:
- split: 2024_03_30T15_50_47.030919
path:
- results_2024-03-30T15-50-47.030919.parquet
- split: latest
path:
- results_2024-03-30T15-50-47.030919.parquet
---
# Dataset Card for Evaluation run of Joseph717171/Genstruct-10.7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Joseph717171/Genstruct-10.7B](https://huggingface.co/Joseph717171/Genstruct-10.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Joseph717171__Genstruct-10.7B",
"harness_winogrande_5",
split="train")
```
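The same API can also target a single task's details or the aggregated metrics. The sketch below is a minimal example using the config and split names declared in the YAML header above (the "latest" split always resolves to the most recent run; other splits are named by run timestamp):
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_Joseph717171__Genstruct-10.7B"

# Per-sample details for one task, pinned to the most recent run
# via the "latest" split.
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")

# Aggregated metrics for the whole run, stored in the "results" config.
aggregated = load_dataset(repo, "results", split="latest")

print(gsm8k_details[0])  # one evaluated example (prompt, prediction, metrics)
print(aggregated[0])     # one row of aggregated scores
```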
## Latest results
These are the [latest results from run 2024-03-30T15:50:47.030919](https://huggingface.co/datasets/open-llm-leaderboard/details_Joseph717171__Genstruct-10.7B/blob/main/results_2024-03-30T15-50-47.030919.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6058365742339414,
"acc_stderr": 0.03286052164816604,
"acc_norm": 0.6065695629397805,
"acc_norm_stderr": 0.033526233034810754,
"mc1": 0.30599755201958384,
"mc1_stderr": 0.01613222972815504,
"mc2": 0.4666302750761303,
"mc2_stderr": 0.015225617830989736
},
"harness|arc:challenge|25": {
"acc": 0.5699658703071673,
"acc_stderr": 0.014467631559137996,
"acc_norm": 0.6083617747440273,
"acc_norm_stderr": 0.014264122124938213
},
"harness|hellaswag|10": {
"acc": 0.6475801633140809,
"acc_stderr": 0.004767475366689767,
"acc_norm": 0.8281218880701056,
"acc_norm_stderr": 0.0037650342861534386
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02510742548113728,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02510742548113728
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.025189006660212385,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.025189006660212385
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270286,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187208,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187208
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.01626567563201036,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.01626567563201036
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.04284467968052194,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.04284467968052194
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560417,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560417
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.014836205167333567,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.014836205167333567
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.025416003773165555,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.025416003773165555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2782122905027933,
"acc_stderr": 0.01498732543996355,
"acc_norm": 0.2782122905027933,
"acc_norm_stderr": 0.01498732543996355
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.639871382636656,
"acc_stderr": 0.027264297599804015,
"acc_norm": 0.639871382636656,
"acc_norm_stderr": 0.027264297599804015
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6635802469135802,
"acc_stderr": 0.02628973494595293,
"acc_norm": 0.6635802469135802,
"acc_norm_stderr": 0.02628973494595293
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.029555454236778855,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.029555454236778855
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4028683181225554,
"acc_stderr": 0.012526955577118016,
"acc_norm": 0.4028683181225554,
"acc_norm_stderr": 0.012526955577118016
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02952009569768776,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02952009569768776
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.01975172650876264,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.01975172650876264
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587952,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587952
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160896,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160896
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30599755201958384,
"mc1_stderr": 0.01613222972815504,
"mc2": 0.4666302750761303,
"mc2_stderr": 0.015225617830989736
},
"harness|winogrande|5": {
"acc": 0.7679558011049724,
"acc_stderr": 0.011864149691827934
},
"harness|gsm8k|5": {
"acc": 0.6194086429112965,
"acc_stderr": 0.013373971277729815
}
}
```
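For quick analysis of the snapshot above without downloading the parquet files, the per-task entries can be aggregated directly from the JSON. This is a small illustrative sketch, not part of the evaluation pipeline; it assumes the task-to-metrics mapping shown above sits under a "results" key in the published file (an assumption based on the harness output format):
```python
import json

# Load the published results file and pull out the per-task metrics.
with open("results_2024-03-30T15-50-47.030919.json") as f:
    results = json.load(f)["results"]

# Collect the 57 MMLU subtasks and take an unweighted mean of their
# accuracies. The headline leaderboard score may be computed differently;
# this is just a convenient summary of the numbers above.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
mmlu_avg = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc = {mmlu_avg:.4f}")
```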
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_ChaoticNeutrals__RPMix-4x7B-MoE | open-llm-leaderboard-old | "2024-03-30T16:21:35Z" | 30 | 0 | [
"region:us"
] | null | "2024-03-30T16:19:58Z" | ---
pretty_name: Evaluation run of ChaoticNeutrals/RPMix-4x7B-MoE
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ChaoticNeutrals/RPMix-4x7B-MoE](https://huggingface.co/ChaoticNeutrals/RPMix-4x7B-MoE)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ChaoticNeutrals__RPMix-4x7B-MoE\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-30T16:19:12.877263](https://huggingface.co/datasets/open-llm-leaderboard/details_ChaoticNeutrals__RPMix-4x7B-MoE/blob/main/results_2024-03-30T16-19-12.877263.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6480977209038945,\n\
\ \"acc_stderr\": 0.03214786369304485,\n \"acc_norm\": 0.648718332669649,\n\
\ \"acc_norm_stderr\": 0.03280218159776964,\n \"mc1\": 0.5152998776009792,\n\
\ \"mc1_stderr\": 0.017495304473187902,\n \"mc2\": 0.6728962085128464,\n\
\ \"mc2_stderr\": 0.015201269488389688\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6808873720136519,\n \"acc_stderr\": 0.013621696119173307,\n\
\ \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393441\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7120095598486357,\n\
\ \"acc_stderr\": 0.0045190116884171695,\n \"acc_norm\": 0.877912766381199,\n\
\ \"acc_norm_stderr\": 0.0032671744584497567\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266344,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266344\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356852,\n \"\
acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356852\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342863,\n\
\ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342863\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163224,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163224\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240658,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240658\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912046,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912046\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406953,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406953\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n\
\ \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n\
\ \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468358,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468358\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47374301675977654,\n\
\ \"acc_stderr\": 0.01669942767278476,\n \"acc_norm\": 0.47374301675977654,\n\
\ \"acc_norm_stderr\": 0.01669942767278476\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958143,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958143\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\
\ \"acc_stderr\": 0.012741974333897227,\n \"acc_norm\": 0.4667535853976532,\n\
\ \"acc_norm_stderr\": 0.012741974333897227\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6454248366013072,\n \"acc_stderr\": 0.019353360547553697,\n \
\ \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.019353360547553697\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5152998776009792,\n\
\ \"mc1_stderr\": 0.017495304473187902,\n \"mc2\": 0.6728962085128464,\n\
\ \"mc2_stderr\": 0.015201269488389688\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.819258089976322,\n \"acc_stderr\": 0.010814911009613981\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6360879454131918,\n \
\ \"acc_stderr\": 0.013252539227966185\n }\n}\n```"
repo_url: https://huggingface.co/ChaoticNeutrals/RPMix-4x7B-MoE
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: [email protected]
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|arc:challenge|25_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|arc:challenge|25_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|gsm8k|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|gsm8k|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hellaswag|10_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hellaswag|10_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T16-17-39.777657.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T16-19-12.877263.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-30T16-19-12.877263.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- '**/details_harness|winogrande|5_2024-03-30T16-17-39.777657.parquet'
- split: 2024_03_30T16_19_12.877263
path:
- '**/details_harness|winogrande|5_2024-03-30T16-19-12.877263.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-30T16-19-12.877263.parquet'
- config_name: results
data_files:
- split: 2024_03_30T16_17_39.777657
path:
- results_2024-03-30T16-17-39.777657.parquet
- split: 2024_03_30T16_19_12.877263
path:
- results_2024-03-30T16-19-12.877263.parquet
- split: latest
path:
- results_2024-03-30T16-19-12.877263.parquet
---
# Dataset Card for Evaluation run of ChaoticNeutrals/RPMix-4x7B-MoE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ChaoticNeutrals/RPMix-4x7B-MoE](https://huggingface.co/ChaoticNeutrals/RPMix-4x7B-MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ChaoticNeutrals__RPMix-4x7B-MoE",
"harness_winogrande_5",
split="train")
```
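
The aggregated metrics live in the "results" configuration; a minimal sketch for loading them, assuming the same `datasets` API (the "latest" split always points to the most recent run):

```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split tracks the newest results.
results = load_dataset("open-llm-leaderboard/details_ChaoticNeutrals__RPMix-4x7B-MoE",
                       "results",
                       split="latest")
print(results[0])
```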
## Latest results
These are the [latest results from run 2024-03-30T16:19:12.877263](https://huggingface.co/datasets/open-llm-leaderboard/details_ChaoticNeutrals__RPMix-4x7B-MoE/blob/main/results_2024-03-30T16-19-12.877263.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6480977209038945,
"acc_stderr": 0.03214786369304485,
"acc_norm": 0.648718332669649,
"acc_norm_stderr": 0.03280218159776964,
"mc1": 0.5152998776009792,
"mc1_stderr": 0.017495304473187902,
"mc2": 0.6728962085128464,
"mc2_stderr": 0.015201269488389688
},
"harness|arc:challenge|25": {
"acc": 0.6808873720136519,
"acc_stderr": 0.013621696119173307,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.013250012579393441
},
"harness|hellaswag|10": {
"acc": 0.7120095598486357,
"acc_stderr": 0.0045190116884171695,
"acc_norm": 0.877912766381199,
"acc_norm_stderr": 0.0032671744584497567
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266344,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266344
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131154,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.029719142876342863,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.029719142876342863
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163224,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163224
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240658,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240658
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912046,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912046
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406953,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406953
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468358,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468358
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47374301675977654,
"acc_stderr": 0.01669942767278476,
"acc_norm": 0.47374301675977654,
"acc_norm_stderr": 0.01669942767278476
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958143,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958143
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897227,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897227
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.019353360547553697,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.019353360547553697
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5152998776009792,
"mc1_stderr": 0.017495304473187902,
"mc2": 0.6728962085128464,
"mc2_stderr": 0.015201269488389688
},
"harness|winogrande|5": {
"acc": 0.819258089976322,
"acc_stderr": 0.010814911009613981
},
"harness|gsm8k|5": {
"acc": 0.6360879454131918,
"acc_stderr": 0.013252539227966185
}
}
```
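
The linked JSON can also be fetched directly; a minimal sketch, assuming the `huggingface_hub` client (the filename is the one shown in the link above):

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw aggregated-results file for this run from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_ChaoticNeutrals__RPMix-4x7B-MoE",
    filename="results_2024-03-30T16-19-12.877263.json",
    repo_type="dataset",
)
with open(path) as f:
    metrics = json.load(f)
# The snippet above shows the per-task metrics; the file may wrap them in extra
# config fields, so inspect the top-level keys first.
print(list(metrics))
```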
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Salvatale/test | Salvatale | "2024-03-30T20:07:48Z" | 30 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-30T20:07:42Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 20392
num_examples: 25
download_size: 19229
dataset_size: 20392
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jholst/test-upload | jholst | "2024-03-30T20:10:11Z" | 30 | 0 | [
"license:apache-2.0",
"size_categories:n<1K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-30T20:09:33Z" | ---
license: apache-2.0
---
|
joshwe/storiesdas | joshwe | "2024-03-30T21:33:35Z" | 30 | 0 | [
"format:parquet",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-30T20:41:19Z" | ---
dataset_info:
features:
- name: tokens
dtype: 'null'
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 529
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mastherboy/brianmp3 | mastherboy | "2024-03-30T20:58:32Z" | 30 | 0 | [
"license:openrail",
"size_categories:n<1K",
"format:audiofolder",
"modality:audio",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-03-30T20:56:45Z" | ---
license: openrail
---
|
orpo-explorers/OpenHermesPreferences-50k | orpo-explorers | "2024-03-30T21:08:13Z" | 30 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-30T21:08:03Z" | ---
dataset_info:
features:
- name: source
dtype: string
- name: category
dtype: string
- name: prompt
dtype: string
- name: candidates_completions
sequence: string
- name: candidate_policies
sequence: string
- name: ranks
sequence: int64
- name: rank_str
dtype: string
- name: chosen_policy
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected_policy
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 367011761.8166934
num_examples: 50000
download_size: 183125043
dataset_size: 367011761.8166934
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard-old/details_MaziyarPanahi__Topxtral-4x7B-v0.1 | open-llm-leaderboard-old | "2024-03-31T00:20:23Z" | 30 | 0 | [
"region:us"
] | null | "2024-03-31T00:20:01Z" | ---
pretty_name: Evaluation run of MaziyarPanahi/Topxtral-4x7B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/Topxtral-4x7B-v0.1](https://huggingface.co/MaziyarPanahi/Topxtral-4x7B-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__Topxtral-4x7B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-31T00:17:39.711118](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Topxtral-4x7B-v0.1/blob/main/results_2024-03-31T00-17-39.711118.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6554293011175572,\n\
\ \"acc_stderr\": 0.03197823221757451,\n \"acc_norm\": 0.6548903178735839,\n\
\ \"acc_norm_stderr\": 0.032644868359495864,\n \"mc1\": 0.5777233782129743,\n\
\ \"mc1_stderr\": 0.017290733254248177,\n \"mc2\": 0.7337665152055244,\n\
\ \"mc2_stderr\": 0.014429693549028136\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6996587030716723,\n \"acc_stderr\": 0.013395909309957007,\n\
\ \"acc_norm\": 0.7252559726962458,\n \"acc_norm_stderr\": 0.013044617212771227\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7024497112129058,\n\
\ \"acc_stderr\": 0.004562462665505233,\n \"acc_norm\": 0.8832901812387971,\n\
\ \"acc_norm_stderr\": 0.003204180072942374\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971118,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971118\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616323,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616323\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45139664804469276,\n\
\ \"acc_stderr\": 0.01664330737231587,\n \"acc_norm\": 0.45139664804469276,\n\
\ \"acc_norm_stderr\": 0.01664330737231587\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042103,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042103\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n\
\ \"acc_stderr\": 0.012743072942653349,\n \"acc_norm\": 0.46740547588005216,\n\
\ \"acc_norm_stderr\": 0.012743072942653349\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5777233782129743,\n\
\ \"mc1_stderr\": 0.017290733254248177,\n \"mc2\": 0.7337665152055244,\n\
\ \"mc2_stderr\": 0.014429693549028136\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8318863456985004,\n \"acc_stderr\": 0.010510336954166737\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7172100075815011,\n \
\ \"acc_stderr\": 0.012405020417873615\n }\n}\n```"
repo_url: https://huggingface.co/MaziyarPanahi/Topxtral-4x7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: [email protected]
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|arc:challenge|25_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|gsm8k|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hellaswag|10_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|winogrande|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-31T00-17-39.711118.parquet'
- config_name: results
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- results_2024-03-31T00-17-39.711118.parquet
- split: latest
path:
- results_2024-03-31T00-17-39.711118.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/Topxtral-4x7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/Topxtral-4x7B-v0.1](https://huggingface.co/MaziyarPanahi/Topxtral-4x7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__Topxtral-4x7B-v0.1",
"harness_winogrande_5",
split="train")
```
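To inspect the aggregated metrics instead, a minimal sketch along the same lines (using the "results" config and "latest" split declared in this card's configs section) is:
```python
from datasets import load_dataset
# "results" and "latest" are the config and split names listed in the
# YAML header above; this split holds the aggregated-results parquet.
results = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__Topxtral-4x7B-v0.1",
	"results",
	split="latest")
```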
## Latest results
These are the [latest results from run 2024-03-31T00:17:39.711118](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Topxtral-4x7B-v0.1/blob/main/results_2024-03-31T00-17-39.711118.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6554293011175572,
"acc_stderr": 0.03197823221757451,
"acc_norm": 0.6548903178735839,
"acc_norm_stderr": 0.032644868359495864,
"mc1": 0.5777233782129743,
"mc1_stderr": 0.017290733254248177,
"mc2": 0.7337665152055244,
"mc2_stderr": 0.014429693549028136
},
"harness|arc:challenge|25": {
"acc": 0.6996587030716723,
"acc_stderr": 0.013395909309957007,
"acc_norm": 0.7252559726962458,
"acc_norm_stderr": 0.013044617212771227
},
"harness|hellaswag|10": {
"acc": 0.7024497112129058,
"acc_stderr": 0.004562462665505233,
"acc_norm": 0.8832901812387971,
"acc_norm_stderr": 0.003204180072942374
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971118,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971118
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.01577623925616323,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.01577623925616323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45139664804469276,
"acc_stderr": 0.01664330737231587,
"acc_norm": 0.45139664804469276,
"acc_norm_stderr": 0.01664330737231587
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042103,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653349,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653349
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5777233782129743,
"mc1_stderr": 0.017290733254248177,
"mc2": 0.7337665152055244,
"mc2_stderr": 0.014429693549028136
},
"harness|winogrande|5": {
"acc": 0.8318863456985004,
"acc_stderr": 0.010510336954166737
},
"harness|gsm8k|5": {
"acc": 0.7172100075815011,
"acc_stderr": 0.012405020417873615
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
habibabderrahim/mini_platypus | habibabderrahim | "2024-03-31T00:52:36Z" | 30 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-31T00:50:18Z" | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 4178768
num_examples: 1000
download_size: 2226122
dataset_size: 4178768
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
atiranela/rodrigo | atiranela | "2024-03-31T00:58:39Z" | 30 | 0 | [
"license:openrail",
"region:us"
] | null | "2024-03-31T00:57:44Z" | ---
license: openrail
---
|
Brokzo/brokzodata | Brokzo | "2024-03-31T05:04:26Z" | 30 | 0 | [
"license:llama2",
"size_categories:n<1K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-31T04:34:02Z" | ---
license: llama2
---
|
tyzhu/lmind_nq_train6000_eval6489_v1_reciteonly_qa_v3 | tyzhu | "2024-03-31T07:44:13Z" | 30 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-31T07:43:51Z" | ---
dataset_info:
features:
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train_qa
num_bytes: 697367
num_examples: 6000
- name: train_ic_qa
num_bytes: 4540536
num_examples: 6000
- name: train_recite_qa
num_bytes: 4546536
num_examples: 6000
- name: eval_qa
num_bytes: 752802
num_examples: 6489
- name: eval_ic_qa
num_bytes: 4906186
num_examples: 6489
- name: eval_recite_qa
num_bytes: 4912675
num_examples: 6489
- name: all_docs
num_bytes: 7126313
num_examples: 10925
- name: all_docs_eval
num_bytes: 7125701
num_examples: 10925
- name: train
num_bytes: 3818906
num_examples: 6000
- name: validation
num_bytes: 4103798
num_examples: 6489
download_size: 26446330
dataset_size: 42530820
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_ic_qa
path: data/train_ic_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_ic_qa
path: data/eval_ic_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
Ve11ichor/SAsong1.5KTraining | Ve11ichor | "2024-03-31T09:07:31Z" | 30 | 0 | [
"license:apache-2.0",
"size_categories:1K<n<10K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-31T07:43:55Z" | ---
license: apache-2.0
---
|
presencesw/Llama_data_bad | presencesw | "2024-03-31T09:03:43Z" | 30 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-31T09:03:29Z" | ---
dataset_info:
features:
- name: topic
dtype: string
- name: Evidence
dtype: string
- name: predict
dtype: string
- name: Label
dtype: string
- name: Claim
dtype: string
- name: eval
dtype: int64
splits:
- name: train
num_bytes: 24278461
num_examples: 5781
download_size: 5443107
dataset_size: 24278461
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
presencesw/Vistral_data_good | presencesw | "2024-03-31T09:04:37Z" | 30 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:tabular",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-31T09:04:25Z" | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: index
dtype: int64
- name: topic
dtype: string
- name: context
dtype: string
- name: Evidence
dtype: string
- name: Claim
dtype: string
- name: Label
dtype: string
- name: Explanation
dtype: string
- name: eval
dtype: float64
splits:
- name: train
num_bytes: 26729197
num_examples: 10824
download_size: 13553147
dataset_size: 26729197
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
fewefWEGwg/llama | fewefWEGwg | "2024-03-31T09:05:59Z" | 30 | 0 | [
"license:mit",
"region:us"
] | null | "2024-03-31T09:05:57Z" | ---
license: mit
---
|
eswanYS/yeopan001 | eswanYS | "2024-03-31T10:50:04Z" | 30 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-31T10:49:24Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 392172
num_examples: 839
download_size: 194899
dataset_size: 392172
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
fanfare71/testset01 | fanfare71 | "2024-04-01T06:41:35Z" | 30 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-31T10:57:42Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5132
num_examples: 30
download_size: 4057
dataset_size: 5132
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard-old/details_0x7o__BulgakovLM-3B | open-llm-leaderboard-old | "2024-03-31T13:02:52Z" | 30 | 0 | [
"region:us"
] | null | "2024-03-31T13:02:08Z" | ---
pretty_name: Evaluation run of 0x7o/BulgakovLM-3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [0x7o/BulgakovLM-3B](https://huggingface.co/0x7o/BulgakovLM-3B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_0x7o__BulgakovLM-3B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-31T12:59:21.044155](https://huggingface.co/datasets/open-llm-leaderboard/details_0x7o__BulgakovLM-3B/blob/main/results_2024-03-31T12-59-21.044155.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2496314409468579,\n\
\ \"acc_stderr\": 0.030535520461275608,\n \"acc_norm\": 0.2507268314546236,\n\
\ \"acc_norm_stderr\": 0.031350677937381055,\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080514,\n \"mc2\": 0.47927961355971765,\n\
\ \"mc2_stderr\": 0.016812004697595487\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22184300341296928,\n \"acc_stderr\": 0.012141659068147882,\n\
\ \"acc_norm\": 0.2832764505119454,\n \"acc_norm_stderr\": 0.013167478735134576\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2566221868153754,\n\
\ \"acc_stderr\": 0.004358764596401031,\n \"acc_norm\": 0.2656841266679944,\n\
\ \"acc_norm_stderr\": 0.0044079410588749625\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.31851851851851853,\n\
\ \"acc_stderr\": 0.0402477840197711,\n \"acc_norm\": 0.31851851851851853,\n\
\ \"acc_norm_stderr\": 0.0402477840197711\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827842,\n\
\ \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827842\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n\
\ \"acc_stderr\": 0.035146974678623884,\n \"acc_norm\": 0.22916666666666666,\n\
\ \"acc_norm_stderr\": 0.035146974678623884\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20851063829787234,\n \"acc_stderr\": 0.026556982117838742,\n\
\ \"acc_norm\": 0.20851063829787234,\n \"acc_norm_stderr\": 0.026556982117838742\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436695,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436695\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924812,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924812\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1349206349206349,\n\
\ \"acc_stderr\": 0.030557101589417515,\n \"acc_norm\": 0.1349206349206349,\n\
\ \"acc_norm_stderr\": 0.030557101589417515\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3032258064516129,\n\
\ \"acc_stderr\": 0.026148685930671746,\n \"acc_norm\": 0.3032258064516129,\n\
\ \"acc_norm_stderr\": 0.026148685930671746\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603488,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603488\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.03191178226713547,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.03191178226713547\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178253,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178253\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.02075242372212801,\n \
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.02075242372212801\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.027886828078380572,\n\
\ \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.027886828078380572\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.036313298039696525,\n \"\
acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23486238532110093,\n \"acc_stderr\": 0.018175110510343588,\n \"\
acc_norm\": 0.23486238532110093,\n \"acc_norm_stderr\": 0.018175110510343588\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27450980392156865,\n \"acc_stderr\": 0.03132179803083291,\n \"\
acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.03132179803083291\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460295,\n \
\ \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460295\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.22869955156950672,\n\
\ \"acc_stderr\": 0.02818824004692919,\n \"acc_norm\": 0.22869955156950672,\n\
\ \"acc_norm_stderr\": 0.02818824004692919\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2037037037037037,\n\
\ \"acc_stderr\": 0.03893542518824849,\n \"acc_norm\": 0.2037037037037037,\n\
\ \"acc_norm_stderr\": 0.03893542518824849\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2541507024265645,\n\
\ \"acc_stderr\": 0.015569254692045773,\n \"acc_norm\": 0.2541507024265645,\n\
\ \"acc_norm_stderr\": 0.015569254692045773\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757177,\n\
\ \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757177\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n\
\ \"acc_stderr\": 0.01440029642922563,\n \"acc_norm\": 0.24581005586592178,\n\
\ \"acc_norm_stderr\": 0.01440029642922563\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.024383665531035457,\n\
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.024383665531035457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23049645390070922,\n \"acc_stderr\": 0.025123739226872405,\n \
\ \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.025123739226872405\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n\
\ \"acc_stderr\": 0.010966507972178475,\n \"acc_norm\": 0.2438070404172099,\n\
\ \"acc_norm_stderr\": 0.010966507972178475\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4227941176470588,\n \"acc_stderr\": 0.030008562845003472,\n\
\ \"acc_norm\": 0.4227941176470588,\n \"acc_norm_stderr\": 0.030008562845003472\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.238562091503268,\n \"acc_stderr\": 0.0172423858287796,\n \
\ \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.0172423858287796\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.03895091015724138,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.03895091015724138\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n\
\ \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.19900497512437812,\n\
\ \"acc_stderr\": 0.02823136509275841,\n \"acc_norm\": 0.19900497512437812,\n\
\ \"acc_norm_stderr\": 0.02823136509275841\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n\
\ \"acc_stderr\": 0.0355092018568963,\n \"acc_norm\": 0.29518072289156627,\n\
\ \"acc_norm_stderr\": 0.0355092018568963\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.03424042924691584,\n\
\ \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.03424042924691584\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080514,\n \"mc2\": 0.47927961355971765,\n\
\ \"mc2_stderr\": 0.016812004697595487\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.505130228887135,\n \"acc_stderr\": 0.014051745961790516\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/0x7o/BulgakovLM-3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: [email protected]
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|arc:challenge|25_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|gsm8k|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hellaswag|10_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T12-59-21.044155.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T12-59-21.044155.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- '**/details_harness|winogrande|5_2024-03-31T12-59-21.044155.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-31T12-59-21.044155.parquet'
- config_name: results
data_files:
- split: 2024_03_31T12_59_21.044155
path:
- results_2024-03-31T12-59-21.044155.parquet
- split: latest
path:
- results_2024-03-31T12-59-21.044155.parquet
---
# Dataset Card for Evaluation run of 0x7o/BulgakovLM-3B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [0x7o/BulgakovLM-3B](https://huggingface.co/0x7o/BulgakovLM-3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_0x7o__BulgakovLM-3B",
"harness_winogrande_5",
split="train")
```
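The per-task configurations listed in the YAML header above can be loaded the same way. Below is a minimal sketch for one MMLU task, using the `latest` split that each configuration defines; the config and split names are taken from this card, so treat the example as illustrative rather than canonical:
```python
from datasets import load_dataset

# Load the details for a single MMLU task; the "latest" split resolves to the
# most recent evaluation run declared in the config above.
abstract_algebra = load_dataset(
    "open-llm-leaderboard/details_0x7o__BulgakovLM-3B",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
```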
## Latest results
These are the [latest results from run 2024-03-31T12:59:21.044155](https://huggingface.co/datasets/open-llm-leaderboard/details_0x7o__BulgakovLM-3B/blob/main/results_2024-03-31T12-59-21.044155.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2496314409468579,
"acc_stderr": 0.030535520461275608,
"acc_norm": 0.2507268314546236,
"acc_norm_stderr": 0.031350677937381055,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080514,
"mc2": 0.47927961355971765,
"mc2_stderr": 0.016812004697595487
},
"harness|arc:challenge|25": {
"acc": 0.22184300341296928,
"acc_stderr": 0.012141659068147882,
"acc_norm": 0.2832764505119454,
"acc_norm_stderr": 0.013167478735134576
},
"harness|hellaswag|10": {
"acc": 0.2566221868153754,
"acc_stderr": 0.004358764596401031,
"acc_norm": 0.2656841266679944,
"acc_norm_stderr": 0.0044079410588749625
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.0402477840197711,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.0402477840197711
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827842,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827842
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.035146974678623884,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.035146974678623884
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20851063829787234,
"acc_stderr": 0.026556982117838742,
"acc_norm": 0.20851063829787234,
"acc_norm_stderr": 0.026556982117838742
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436695,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436695
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924812,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924812
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1349206349206349,
"acc_stderr": 0.030557101589417515,
"acc_norm": 0.1349206349206349,
"acc_norm_stderr": 0.030557101589417515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3032258064516129,
"acc_stderr": 0.026148685930671746,
"acc_norm": 0.3032258064516129,
"acc_norm_stderr": 0.026148685930671746
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603488,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603488
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.03191178226713547,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.03191178226713547
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178253,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178253
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.02075242372212801,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.02075242372212801
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24369747899159663,
"acc_stderr": 0.027886828078380572,
"acc_norm": 0.24369747899159663,
"acc_norm_stderr": 0.027886828078380572
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.036313298039696525,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.036313298039696525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23486238532110093,
"acc_stderr": 0.018175110510343588,
"acc_norm": 0.23486238532110093,
"acc_norm_stderr": 0.018175110510343588
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.03132179803083291,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.03132179803083291
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.028458820991460295,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.028458820991460295
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.22869955156950672,
"acc_stderr": 0.02818824004692919,
"acc_norm": 0.22869955156950672,
"acc_norm_stderr": 0.02818824004692919
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.03893542518824849,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.03893542518824849
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2541507024265645,
"acc_stderr": 0.015569254692045773,
"acc_norm": 0.2541507024265645,
"acc_norm_stderr": 0.015569254692045773
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757177,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.01440029642922563,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.01440029642922563
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.025123739226872405,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.025123739226872405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2438070404172099,
"acc_stderr": 0.010966507972178475,
"acc_norm": 0.2438070404172099,
"acc_norm_stderr": 0.010966507972178475
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4227941176470588,
"acc_stderr": 0.030008562845003472,
"acc_norm": 0.4227941176470588,
"acc_norm_stderr": 0.030008562845003472
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.0172423858287796,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.0172423858287796
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.03895091015724138,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.03895091015724138
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.19900497512437812,
"acc_stderr": 0.02823136509275841,
"acc_norm": 0.19900497512437812,
"acc_norm_stderr": 0.02823136509275841
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.29518072289156627,
"acc_stderr": 0.0355092018568963,
"acc_norm": 0.29518072289156627,
"acc_norm_stderr": 0.0355092018568963
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.27485380116959063,
"acc_stderr": 0.03424042924691584,
"acc_norm": 0.27485380116959063,
"acc_norm_stderr": 0.03424042924691584
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080514,
"mc2": 0.47927961355971765,
"mc2_stderr": 0.016812004697595487
},
"harness|winogrande|5": {
"acc": 0.505130228887135,
"acc_stderr": 0.014051745961790516
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
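If you only need the aggregated metrics shown above rather than per-example details, they can be read from the "results" configuration; a minimal sketch, with the config and split names taken from the YAML header of this card:
```python
from datasets import load_dataset

# The aggregated metrics live in the "results" configuration; "latest"
# points at results_2024-03-31T12-59-21.044155.parquet per the config above.
results = load_dataset(
    "open-llm-leaderboard/details_0x7o__BulgakovLM-3B",
    "results",
    split="latest",
)
```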
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ricardo-larosa/SWE-bench_Lite_Dev_Extended | ricardo-larosa | "2024-03-31T16:27:14Z" | 30 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-31T16:27:06Z" | ---
dataset_info:
features:
- name: repo
dtype: string
- name: instance_id
dtype: string
- name: base_commit
dtype: string
- name: patch
dtype: string
- name: test_patch
dtype: string
- name: problem_statement
dtype: string
- name: hints_text
dtype: string
- name: created_at
dtype: string
- name: version
dtype: string
- name: FAIL_TO_PASS
dtype: string
- name: PASS_TO_PASS
dtype: string
- name: environment_setup_commit
dtype: string
- name: file_path
dtype: string
- name: file_content
dtype: string
- name: text
dtype: string
splits:
- name: dev
num_bytes: 2089768
num_examples: 23
download_size: 773748
dataset_size: 2089768
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
---
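A minimal usage sketch, assuming only the `dev` split and the column names declared in `dataset_info` above:
```python
from datasets import load_dataset

# Load the single "dev" split (23 examples) declared in dataset_info above.
ds = load_dataset("ricardo-larosa/SWE-bench_Lite_Dev_Extended", split="dev")
print(ds[0]["repo"], ds[0]["instance_id"])  # inspect one task instance
```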
|
open-llm-leaderboard-old/details_hamxea__Mistral-7B-v0.1-activity-fine-tuned-v2 | open-llm-leaderboard-old | "2024-03-31T16:33:34Z" | 30 | 0 | [
"region:us"
] | null | "2024-03-31T16:33:08Z" | ---
pretty_name: Evaluation run of hamxea/Mistral-7B-v0.1-activity-fine-tuned-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [hamxea/Mistral-7B-v0.1-activity-fine-tuned-v2](https://huggingface.co/hamxea/Mistral-7B-v0.1-activity-fine-tuned-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hamxea__Mistral-7B-v0.1-activity-fine-tuned-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-31T16:30:52.035194](https://huggingface.co/datasets/open-llm-leaderboard/details_hamxea__Mistral-7B-v0.1-activity-fine-tuned-v2/blob/main/results_2024-03-31T16-30-52.035194.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6375746321703655,\n\
\ \"acc_stderr\": 0.03225546197812389,\n \"acc_norm\": 0.6434618962614028,\n\
\ \"acc_norm_stderr\": 0.032904960223920136,\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608763,\n \"mc2\": 0.4215137349816427,\n\
\ \"mc2_stderr\": 0.014137575959685471\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5691126279863481,\n \"acc_stderr\": 0.014471133392642476,\n\
\ \"acc_norm\": 0.6006825938566553,\n \"acc_norm_stderr\": 0.014312094557946709\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.629555865365465,\n\
\ \"acc_stderr\": 0.004819367172685962,\n \"acc_norm\": 0.8330013941445927,\n\
\ \"acc_norm_stderr\": 0.0037221237096104645\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159795,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159795\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431385,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431385\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5787037037037037,\n \"acc_stderr\": 0.033674621388960775,\n \"\
acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.033674621388960775\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069436,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069436\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464074,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464074\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3217877094972067,\n\
\ \"acc_stderr\": 0.015624236160792579,\n \"acc_norm\": 0.3217877094972067,\n\
\ \"acc_norm_stderr\": 0.015624236160792579\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4485006518904824,\n\
\ \"acc_stderr\": 0.012702317490559806,\n \"acc_norm\": 0.4485006518904824,\n\
\ \"acc_norm_stderr\": 0.012702317490559806\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608763,\n \"mc2\": 0.4215137349816427,\n\
\ \"mc2_stderr\": 0.014137575959685471\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.01157061486140935\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.37907505686125853,\n \
\ \"acc_stderr\": 0.013363630295088347\n }\n}\n```"
repo_url: https://huggingface.co/hamxea/Mistral-7B-v0.1-activity-fine-tuned-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: [email protected]
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|arc:challenge|25_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|gsm8k|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hellaswag|10_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T16-30-52.035194.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T16-30-52.035194.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- '**/details_harness|winogrande|5_2024-03-31T16-30-52.035194.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-31T16-30-52.035194.parquet'
- config_name: results
data_files:
- split: 2024_03_31T16_30_52.035194
path:
- results_2024-03-31T16-30-52.035194.parquet
- split: latest
path:
- results_2024-03-31T16-30-52.035194.parquet
---
# Dataset Card for Evaluation run of hamxea/Mistral-7B-v0.1-activity-fine-tuned-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [hamxea/Mistral-7B-v0.1-activity-fine-tuned-v2](https://huggingface.co/hamxea/Mistral-7B-v0.1-activity-fine-tuned-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_hamxea__Mistral-7B-v0.1-activity-fine-tuned-v2",
"harness_winogrande_5",
split="train")
```
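The aggregated metrics live in the `results` configuration, which also exposes a `latest` split; a minimal sketch of loading it (the exact column layout inside the parquet file is an assumption, so inspect the schema before relying on any field name):
```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split points to the most recent run
results = load_dataset(
    "open-llm-leaderboard/details_hamxea__Mistral-7B-v0.1-activity-fine-tuned-v2",
    "results",
    split="latest",
)
print(results.column_names)  # inspect the schema before relying on any field
```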
## Latest results
These are the [latest results from run 2024-03-31T16:30:52.035194](https://huggingface.co/datasets/open-llm-leaderboard/details_hamxea__Mistral-7B-v0.1-activity-fine-tuned-v2/blob/main/results_2024-03-31T16-30-52.035194.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6375746321703655,
"acc_stderr": 0.03225546197812389,
"acc_norm": 0.6434618962614028,
"acc_norm_stderr": 0.032904960223920136,
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608763,
"mc2": 0.4215137349816427,
"mc2_stderr": 0.014137575959685471
},
"harness|arc:challenge|25": {
"acc": 0.5691126279863481,
"acc_stderr": 0.014471133392642476,
"acc_norm": 0.6006825938566553,
"acc_norm_stderr": 0.014312094557946709
},
"harness|hellaswag|10": {
"acc": 0.629555865365465,
"acc_stderr": 0.004819367172685962,
"acc_norm": 0.8330013941445927,
"acc_norm_stderr": 0.0037221237096104645
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159795,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159795
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431385,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431385
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.033674621388960775,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.033674621388960775
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069436,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464074,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464074
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3217877094972067,
"acc_stderr": 0.015624236160792579,
"acc_norm": 0.3217877094972067,
"acc_norm_stderr": 0.015624236160792579
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4485006518904824,
"acc_stderr": 0.012702317490559806,
"acc_norm": 0.4485006518904824,
"acc_norm_stderr": 0.012702317490559806
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608763,
"mc2": 0.4215137349816427,
"mc2_stderr": 0.014137575959685471
},
"harness|winogrande|5": {
"acc": 0.7837411207576953,
"acc_stderr": 0.01157061486140935
},
"harness|gsm8k|5": {
"acc": 0.37907505686125853,
"acc_stderr": 0.013363630295088347
}
}
```
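If only the raw JSON shown above is needed, it can be fetched directly with `huggingface_hub`; a minimal sketch (the filename comes from the link above, and the top-level key layout is assumed to match the snippet):
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the per-run results file linked above
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_hamxea__Mistral-7B-v0.1-activity-fine-tuned-v2",
    repo_type="dataset",
    filename="results_2024-03-31T16-30-52.035194.json",
)
with open(path) as f:
    run_results = json.load(f)

print(list(run_results))       # inspect the top-level keys first
print(run_results.get("all"))  # aggregated metrics, if the layout matches the snippet
```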
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/pannibal_sukasuka | CyberHarem | "2024-03-31T16:40:38Z" | 30 | 0 | [
"task_categories:text-to-image",
"license:mit",
"size_categories:n<1K",
"library:datasets",
"library:mlcroissant",
"region:us",
"art",
"not-for-all-audiences"
] | [
"text-to-image"
] | "2024-03-31T16:36:25Z" | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Pannibal Nox Katena/パニバル・ノク・カテナ (Shuumatsu Nani Shitemasu Ka? Isogashii Desu Ka?)
This is the dataset of Pannibal Nox Katena/パニバル・ノク・カテナ (Shuumatsu Nani Shitemasu Ka? Isogashii Desu Ka?), containing 55 images and their tags.
The core tags of this character are `purple_hair, short_hair, purple_eyes, hair_over_one_eye`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 55 | 29.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pannibal_sukasuka/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 55 | 29.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pannibal_sukasuka/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 99 | 50.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pannibal_sukasuka/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/pannibal_sukasuka',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
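Once loaded, the source can also be filtered on tags; a minimal sketch (it assumes `item.meta['tags']` supports membership tests on tag names, which matches waifuc's usual convention but is not guaranteed by this card):
```python
from waifuc.source import LocalSource

# Reuse the directory extracted in the snippet above
source = LocalSource('dataset_dir')

# Keep only images carrying the "hoodie" tag (seen in the clusters below)
hoodie_items = [item for item in source if 'hoodie' in item.meta['tags']]
print(f'{len(hoodie_items)} images tagged with "hoodie"')
```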
## List of Clusters
List of tag clustering results; some outfits may be minable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------|
| 0 | 17 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, wooden_sword, holding_weapon, hoodie, outdoors, hood_down |
| 1 | 15 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 2girls, pink_hair, smile, open_mouth, hoodie |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | wooden_sword | holding_weapon | hoodie | outdoors | hood_down | 2girls | pink_hair | smile | open_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:-----------------|:---------|:-----------|:------------|:---------|:------------|:--------|:-------------|
| 0 | 17 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | | | | |
| 1 | 15 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | | | | | X | | | X | X | X | X |
|
wrekker/mental-health-chat | wrekker | "2024-03-31T18:11:43Z" | 30 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-31T18:11:16Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 58054.738275340394
num_examples: 462
- name: test
num_bytes: 25006.261724659606
num_examples: 199
download_size: 36540
dataset_size: 83061.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
kristina-shemet/Training-set_GPT-answers_31_03 | kristina-shemet | "2024-03-31T18:29:33Z" | 30 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-31T18:29:29Z" | ---
dataset_info:
features:
- name: formatted_data
dtype: string
splits:
- name: train
num_bytes: 258774
num_examples: 372
download_size: 90620
dataset_size: 258774
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Paulitos/school-math-questions-llama2-5k | Paulitos | "2024-03-31T18:51:39Z" | 30 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-31T18:51:38Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2721591
num_examples: 5000
download_size: 1363904
dataset_size: 2721591
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard-old/details_abhishek__autotrain-xva0j-mixtral8x7b | open-llm-leaderboard-old | "2024-03-31T21:14:01Z" | 30 | 0 | [
"region:us"
] | null | "2024-03-31T21:13:35Z" | ---
pretty_name: Evaluation run of abhishek/autotrain-xva0j-mixtral8x7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abhishek/autotrain-xva0j-mixtral8x7b](https://huggingface.co/abhishek/autotrain-xva0j-mixtral8x7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhishek__autotrain-xva0j-mixtral8x7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-31T21:11:14.229112](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishek__autotrain-xva0j-mixtral8x7b/blob/main/results_2024-03-31T21-11-14.229112.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6620050351391895,\n\
\ \"acc_stderr\": 0.031376062719000515,\n \"acc_norm\": 0.6748162944105474,\n\
\ \"acc_norm_stderr\": 0.032102797500686404,\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.01662908751427678,\n \"mc2\": 0.5012597449415515,\n\
\ \"mc2_stderr\": 0.01519301557942148\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5938566552901023,\n \"acc_stderr\": 0.014351656690097862,\n\
\ \"acc_norm\": 0.6279863481228669,\n \"acc_norm_stderr\": 0.014124597881844458\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6461860187213703,\n\
\ \"acc_stderr\": 0.004771751187407021,\n \"acc_norm\": 0.844353714399522,\n\
\ \"acc_norm_stderr\": 0.0036177879347477483\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110224,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110224\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7697368421052632,\n \"acc_stderr\": 0.03426059424403165,\n\
\ \"acc_norm\": 0.7697368421052632,\n \"acc_norm_stderr\": 0.03426059424403165\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7735849056603774,\n \"acc_stderr\": 0.02575755989310673,\n\
\ \"acc_norm\": 0.7735849056603774,\n \"acc_norm_stderr\": 0.02575755989310673\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n\
\ \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n\
\ \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.03170995606040655,\n\
\ \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.03170995606040655\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5350877192982456,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.5350877192982456,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6620689655172414,\n \"acc_stderr\": 0.039417076320648906,\n\
\ \"acc_norm\": 0.6620689655172414,\n \"acc_norm_stderr\": 0.039417076320648906\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.02559185776138218,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.02559185776138218\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8064516129032258,\n \"acc_stderr\": 0.022475258525536057,\n \"\
acc_norm\": 0.8064516129032258,\n \"acc_norm_stderr\": 0.022475258525536057\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8282828282828283,\n \"acc_stderr\": 0.026869716187429903,\n \"\
acc_norm\": 0.8282828282828283,\n \"acc_norm_stderr\": 0.026869716187429903\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289736,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289736\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3851851851851852,\n \"acc_stderr\": 0.02967090612463088,\n \
\ \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.02967090612463088\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7773109243697479,\n \"acc_stderr\": 0.027025433498882392,\n\
\ \"acc_norm\": 0.7773109243697479,\n \"acc_norm_stderr\": 0.027025433498882392\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4304635761589404,\n \"acc_stderr\": 0.040428099613956346,\n \"\
acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.040428099613956346\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163255,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163255\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.02675640153807897,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02675640153807897\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8312236286919831,\n \"acc_stderr\": 0.024381406832586227,\n \
\ \"acc_norm\": 0.8312236286919831,\n \"acc_norm_stderr\": 0.024381406832586227\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.031024411740572203,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.031024411740572203\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476074,\n\
\ \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476074\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"\
acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.033932957297610096,\n\
\ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.033932957297610096\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n\
\ \"acc_stderr\": 0.017004368568132353,\n \"acc_norm\": 0.9273504273504274,\n\
\ \"acc_norm_stderr\": 0.017004368568132353\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.855683269476373,\n\
\ \"acc_stderr\": 0.012566417503320942,\n \"acc_norm\": 0.855683269476373,\n\
\ \"acc_norm_stderr\": 0.012566417503320942\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3664804469273743,\n\
\ \"acc_stderr\": 0.016115235504865464,\n \"acc_norm\": 0.3664804469273743,\n\
\ \"acc_norm_stderr\": 0.016115235504865464\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7712418300653595,\n \"acc_stderr\": 0.024051029739912248,\n\
\ \"acc_norm\": 0.7712418300653595,\n \"acc_norm_stderr\": 0.024051029739912248\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7363344051446945,\n\
\ \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.7363344051446945,\n\
\ \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.02378858355165855,\n\
\ \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.02378858355165855\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.5104302477183833,\n \"acc_stderr\": 0.012767457253930655,\n\
\ \"acc_norm\": 0.5104302477183833,\n \"acc_norm_stderr\": 0.012767457253930655\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.7610294117647058,\n \"acc_stderr\": 0.025905280644893006,\n \"\
acc_norm\": 0.7610294117647058,\n \"acc_norm_stderr\": 0.025905280644893006\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7009803921568627,\n \"acc_stderr\": 0.018521756215423024,\n \
\ \"acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.018521756215423024\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.023729830881018526,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.023729830881018526\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710905,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710905\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070824,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070824\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.01662908751427678,\n \"mc2\": 0.5012597449415515,\n\
\ \"mc2_stderr\": 0.01519301557942148\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7458563535911602,\n \"acc_stderr\": 0.012236307219708272\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05458680818802123,\n \
\ \"acc_stderr\": 0.006257444037912528\n }\n}\n```"
repo_url: https://huggingface.co/abhishek/autotrain-xva0j-mixtral8x7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: [email protected]
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|arc:challenge|25_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|gsm8k|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hellaswag|10_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T21-11-14.229112.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T21-11-14.229112.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- '**/details_harness|winogrande|5_2024-03-31T21-11-14.229112.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-31T21-11-14.229112.parquet'
- config_name: results
data_files:
- split: 2024_03_31T21_11_14.229112
path:
- results_2024-03-31T21-11-14.229112.parquet
- split: latest
path:
- results_2024-03-31T21-11-14.229112.parquet
---
# Dataset Card for Evaluation run of abhishek/autotrain-xva0j-mixtral8x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhishek/autotrain-xva0j-mixtral8x7b](https://huggingface.co/abhishek/autotrain-xva0j-mixtral8x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
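As a minimal sketch (assuming the `datasets` library is installed), the aggregated metrics can be read directly from the "results" configuration; per the configuration list in this card, its "latest" split resolves to the most recent timestamped results file:
```python
from datasets import load_dataset

# Load the run-level aggregated results; the "latest" split points to
# the newest results parquet file listed in this card's configurations.
results = load_dataset(
    "open-llm-leaderboard/details_abhishek__autotrain-xva0j-mixtral8x7b",
    "results",
    split="latest",
)
print(results)
```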
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhishek__autotrain-xva0j-mixtral8x7b",
"harness_winogrande_5",
	split="latest")
```
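Any of the per-task configurations listed above can be loaded the same way; as a sketch, this uses one of the config names from this card's configuration table to pull the per-sample details for a single five-shot MMLU subtask:
```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task; the "latest"
# split selects the most recent timestamped parquet file for that task.
details = load_dataset(
    "open-llm-leaderboard/details_abhishek__autotrain-xva0j-mixtral8x7b",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
print(details)
```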
## Latest results
These are the [latest results from run 2024-03-31T21:11:14.229112](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishek__autotrain-xva0j-mixtral8x7b/blob/main/results_2024-03-31T21-11-14.229112.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6620050351391895,
"acc_stderr": 0.031376062719000515,
"acc_norm": 0.6748162944105474,
"acc_norm_stderr": 0.032102797500686404,
"mc1": 0.34394124847001223,
"mc1_stderr": 0.01662908751427678,
"mc2": 0.5012597449415515,
"mc2_stderr": 0.01519301557942148
},
"harness|arc:challenge|25": {
"acc": 0.5938566552901023,
"acc_stderr": 0.014351656690097862,
"acc_norm": 0.6279863481228669,
"acc_norm_stderr": 0.014124597881844458
},
"harness|hellaswag|10": {
"acc": 0.6461860187213703,
"acc_stderr": 0.004771751187407021,
"acc_norm": 0.844353714399522,
"acc_norm_stderr": 0.0036177879347477483
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110224,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110224
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7697368421052632,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.7697368421052632,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7735849056603774,
"acc_stderr": 0.02575755989310673,
"acc_norm": 0.7735849056603774,
"acc_norm_stderr": 0.02575755989310673
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5350877192982456,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.5350877192982456,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6620689655172414,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.6620689655172414,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.02559185776138218,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.02559185776138218
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8282828282828283,
"acc_stderr": 0.026869716187429903,
"acc_norm": 0.8282828282828283,
"acc_norm_stderr": 0.026869716187429903
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289736,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289736
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.02967090612463088,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.02967090612463088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7773109243697479,
"acc_stderr": 0.027025433498882392,
"acc_norm": 0.7773109243697479,
"acc_norm_stderr": 0.027025433498882392
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.040428099613956346,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.040428099613956346
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163255,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163255
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.02675640153807897,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.02675640153807897
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8312236286919831,
"acc_stderr": 0.024381406832586227,
"acc_norm": 0.8312236286919831,
"acc_norm_stderr": 0.024381406832586227
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572203,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572203
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476074,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476074
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.033932957297610096,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.033932957297610096
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.017004368568132353,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.017004368568132353
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.855683269476373,
"acc_stderr": 0.012566417503320942,
"acc_norm": 0.855683269476373,
"acc_norm_stderr": 0.012566417503320942
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.016115235504865464,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.016115235504865464
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7712418300653595,
"acc_stderr": 0.024051029739912248,
"acc_norm": 0.7712418300653595,
"acc_norm_stderr": 0.024051029739912248
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7363344051446945,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.7363344051446945,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.02378858355165855,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.02378858355165855
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5104302477183833,
"acc_stderr": 0.012767457253930655,
"acc_norm": 0.5104302477183833,
"acc_norm_stderr": 0.012767457253930655
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7610294117647058,
"acc_stderr": 0.025905280644893006,
"acc_norm": 0.7610294117647058,
"acc_norm_stderr": 0.025905280644893006
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.018521756215423024,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.018521756215423024
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018526,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018526
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710905,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710905
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070824,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070824
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34394124847001223,
"mc1_stderr": 0.01662908751427678,
"mc2": 0.5012597449415515,
"mc2_stderr": 0.01519301557942148
},
"harness|winogrande|5": {
"acc": 0.7458563535911602,
"acc_stderr": 0.012236307219708272
},
"harness|gsm8k|5": {
"acc": 0.05458680818802123,
"acc_stderr": 0.006257444037912528
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jhamel/alpaca-chief-engineer-preliminary-design | jhamel | "2024-03-31T21:28:58Z" | 30 | 0 | [
"license:apache-2.0",
"size_categories:n<1K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-31T21:28:18Z" | ---
license: apache-2.0
---
|
MikeGreen2710/aux_v1444_test_split | MikeGreen2710 | "2024-04-02T00:21:24Z" | 30 | 0 | [
"size_categories:100K<n<1M",
"format:parquet",
"modality:tabular",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-01T03:21:33Z" | ---
dataset_info:
features:
- name: Word
dtype: string
- name: Tag
dtype: string
- name: 'Sentence #'
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 11741524
num_examples: 354320
download_size: 3837772
dataset_size: 11741524
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hereral/Clara-Training-Data | hereral | "2024-04-01T06:17:54Z" | 30 | 0 | [
"license:apache-2.0",
"size_categories:1K<n<10K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-01T06:17:34Z" | ---
license: apache-2.0
---
|
kaleem11/blenderbot_v4 | kaleem11 | "2024-04-01T06:45:01Z" | 30 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-01T06:44:54Z" | ---
dataset_info:
features:
- name: text
sequence:
sequence: string
splits:
- name: train
num_bytes: 1080864342
num_examples: 9846
- name: test
num_bytes: 56864486
num_examples: 518
download_size: 1361816
dataset_size: 1137728828
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Soykot/Podder | Soykot | "2024-04-01T07:11:26Z" | 30 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-01T07:11:22Z" | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Saviourscs/Resarch_anjali | Saviourscs | "2024-04-01T07:46:31Z" | 30 | 0 | [
"license:apache-2.0",
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-01T07:40:13Z" | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 654808
num_examples: 460
download_size: 362324
dataset_size: 654808
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
khoomeik/gzipscale-code-C-2.6M | khoomeik | "2024-04-01T08:07:26Z" | 30 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-01T07:47:56Z" | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 10387940
num_examples: 10105
download_size: 2682329
dataset_size: 10387940
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
khoomeik/gzipscale-code-html-256M | khoomeik | "2024-04-01T08:47:06Z" | 30 | 0 | [
"size_categories:1M<n<10M",
"format:parquet",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-01T08:46:53Z" | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 1028001028
num_examples: 1000001
download_size: 263145842
dataset_size: 1028001028
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dzw/wudao | dzw | "2024-04-01T08:53:00Z" | 30 | 0 | [
"license:apache-2.0",
"region:us"
] | null | "2024-04-01T08:53:00Z" | ---
license: apache-2.0
---
|
Bharath182004/Flan_V2_processed | Bharath182004 | "2024-04-02T09:24:59Z" | 30 | 0 | [
"license:apache-2.0",
"size_categories:n<1K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-01T10:01:22Z" | ---
license: apache-2.0
---
|
pontusnorman123/swe_set2 | pontusnorman123 | "2024-04-01T12:00:16Z" | 30 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:image",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-01T11:13:49Z" | ---
dataset_info:
features:
- name: id
dtype: int64
- name: words
sequence: string
- name: bboxes
sequence:
sequence: float64
- name: ner_tags
sequence:
class_label:
names:
'0': I-COMPANY
'1': I-DATE
'2': I-ADDRESS
'3': I-TOTAL
'4': O
- name: image
dtype: image
splits:
- name: train
num_bytes: 382737550.0
num_examples: 249
- name: test
num_bytes: 53446678.0
num_examples: 50
download_size: 432809730
dataset_size: 436184228.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Trakshan/Fine_tuning_llama2_uding_openorca | Trakshan | "2024-04-01T11:36:37Z" | 30 | 0 | [
"license:mit",
"region:us"
] | null | "2024-04-01T11:26:42Z" | ---
license: mit
--- |
visionlab/block-towers-10k-3s-trajectory-scale1 | visionlab | "2024-04-01T11:44:04Z" | 30 | 0 | [
"size_categories:100K<n<1M",
"format:parquet",
"modality:tabular",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-01T11:37:50Z" | ---
dataset_info:
- config_name: stack3_stable
features:
- name: data
struct:
- name: final_positions
list:
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: params
struct:
- name: duration
dtype: float64
- name: framerate
dtype: int64
- name: scale_factor
dtype: float64
- name: timestep
dtype: float64
- name: start_positions
list:
- name: density
dtype: 'null'
- name: lx
dtype: float64
- name: ly
dtype: float64
- name: lz
dtype: float64
- name: mass
dtype: 'null'
- name: rx
dtype: float64
- name: ry
dtype: float64
- name: rz
dtype: float64
- name: unstable
dtype: float64
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: trajectory
list:
- name: data
list:
- name: id
dtype: int64
- name: name
dtype: string
- name: xmat
sequence: float64
- name: xyz
sequence: float64
- name: physics_step
dtype: int64
- name: t
dtype: float64
- name: video_frame
dtype: int64
- name: video_t
dtype: float64
- name: label
dtype: int64
- name: num_blocks
dtype: int64
splits:
- name: train
num_bytes: 6144000
num_examples: 8000
- name: test
num_bytes: 1536000
num_examples: 2000
download_size: 772415
dataset_size: 7680000
- config_name: stack3_unstable
features:
- name: data
struct:
- name: final_positions
list:
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: params
struct:
- name: duration
dtype: float64
- name: framerate
dtype: int64
- name: scale_factor
dtype: float64
- name: timestep
dtype: float64
- name: start_positions
list:
- name: density
dtype: 'null'
- name: lx
dtype: float64
- name: ly
dtype: float64
- name: lz
dtype: float64
- name: mass
dtype: 'null'
- name: rx
dtype: float64
- name: ry
dtype: float64
- name: rz
dtype: float64
- name: unstable
dtype: float64
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: trajectory
list:
- name: data
list:
- name: id
dtype: int64
- name: name
dtype: string
- name: xmat
sequence: float64
- name: xyz
sequence: float64
- name: physics_step
dtype: int64
- name: t
dtype: float64
- name: video_frame
dtype: int64
- name: video_t
dtype: float64
- name: label
dtype: int64
- name: num_blocks
dtype: int64
splits:
- name: train
num_bytes: 573216000
num_examples: 8000
- name: test
num_bytes: 143304000
num_examples: 2000
download_size: 357842807
dataset_size: 716520000
- config_name: stack4_stable
features:
- name: data
struct:
- name: final_positions
list:
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: params
struct:
- name: duration
dtype: float64
- name: framerate
dtype: int64
- name: scale_factor
dtype: float64
- name: timestep
dtype: float64
- name: start_positions
list:
- name: density
dtype: 'null'
- name: lx
dtype: float64
- name: ly
dtype: float64
- name: lz
dtype: float64
- name: mass
dtype: 'null'
- name: rx
dtype: float64
- name: ry
dtype: float64
- name: rz
dtype: float64
- name: unstable
dtype: float64
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: trajectory
list:
- name: data
list:
- name: id
dtype: int64
- name: name
dtype: string
- name: xmat
sequence: float64
- name: xyz
sequence: float64
- name: physics_step
dtype: int64
- name: t
dtype: float64
- name: video_frame
dtype: int64
- name: video_t
dtype: float64
- name: label
dtype: int64
- name: num_blocks
dtype: int64
splits:
- name: train
num_bytes: 7936000
num_examples: 8000
- name: test
num_bytes: 1984000
num_examples: 2000
download_size: 1082273
dataset_size: 9920000
- config_name: stack4_unstable
features:
- name: data
struct:
- name: final_positions
list:
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: params
struct:
- name: duration
dtype: float64
- name: framerate
dtype: int64
- name: scale_factor
dtype: float64
- name: timestep
dtype: float64
- name: start_positions
list:
- name: density
dtype: 'null'
- name: lx
dtype: float64
- name: ly
dtype: float64
- name: lz
dtype: float64
- name: mass
dtype: 'null'
- name: rx
dtype: float64
- name: ry
dtype: float64
- name: rz
dtype: float64
- name: unstable
dtype: float64
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: trajectory
list:
- name: data
list:
- name: id
dtype: int64
- name: name
dtype: string
- name: xmat
sequence: float64
- name: xyz
sequence: float64
- name: physics_step
dtype: int64
- name: t
dtype: float64
- name: video_frame
dtype: int64
- name: video_t
dtype: float64
- name: label
dtype: int64
- name: num_blocks
dtype: int64
splits:
- name: train
num_bytes: 746848000
num_examples: 8000
- name: test
num_bytes: 186712000
num_examples: 2000
download_size: 535206285
dataset_size: 933560000
- config_name: stack5_stable
features:
- name: data
struct:
- name: final_positions
list:
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: params
struct:
- name: duration
dtype: float64
- name: framerate
dtype: int64
- name: scale_factor
dtype: float64
- name: timestep
dtype: float64
- name: start_positions
list:
- name: density
dtype: 'null'
- name: lx
dtype: float64
- name: ly
dtype: float64
- name: lz
dtype: float64
- name: mass
dtype: 'null'
- name: rx
dtype: float64
- name: ry
dtype: float64
- name: rz
dtype: float64
- name: unstable
dtype: float64
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: trajectory
list:
- name: data
list:
- name: id
dtype: int64
- name: name
dtype: string
- name: xmat
sequence: float64
- name: xyz
sequence: float64
- name: physics_step
dtype: int64
- name: t
dtype: float64
- name: video_frame
dtype: int64
- name: video_t
dtype: float64
- name: label
dtype: int64
- name: num_blocks
dtype: int64
splits:
- name: train
num_bytes: 9728000
num_examples: 8000
- name: test
num_bytes: 2432000
num_examples: 2000
download_size: 1395431
dataset_size: 12160000
- config_name: stack5_unstable
features:
- name: data
struct:
- name: final_positions
list:
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: params
struct:
- name: duration
dtype: float64
- name: framerate
dtype: int64
- name: scale_factor
dtype: float64
- name: timestep
dtype: float64
- name: start_positions
list:
- name: density
dtype: 'null'
- name: lx
dtype: float64
- name: ly
dtype: float64
- name: lz
dtype: float64
- name: mass
dtype: 'null'
- name: rx
dtype: float64
- name: ry
dtype: float64
- name: rz
dtype: float64
- name: unstable
dtype: float64
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: trajectory
list:
- name: data
list:
- name: id
dtype: int64
- name: name
dtype: string
- name: xmat
sequence: float64
- name: xyz
sequence: float64
- name: physics_step
dtype: int64
- name: t
dtype: float64
- name: video_frame
dtype: int64
- name: video_t
dtype: float64
- name: label
dtype: int64
- name: num_blocks
dtype: int64
splits:
- name: train
num_bytes: 920480000
num_examples: 8000
- name: test
num_bytes: 230120000
num_examples: 2000
download_size: 704078782
dataset_size: 1150600000
- config_name: stack6_stable
features:
- name: data
struct:
- name: final_positions
list:
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: params
struct:
- name: duration
dtype: float64
- name: framerate
dtype: int64
- name: scale_factor
dtype: float64
- name: timestep
dtype: float64
- name: start_positions
list:
- name: density
dtype: 'null'
- name: lx
dtype: float64
- name: ly
dtype: float64
- name: lz
dtype: float64
- name: mass
dtype: 'null'
- name: rx
dtype: float64
- name: ry
dtype: float64
- name: rz
dtype: float64
- name: unstable
dtype: float64
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: trajectory
list:
- name: data
list:
- name: id
dtype: int64
- name: name
dtype: string
- name: xmat
sequence: float64
- name: xyz
sequence: float64
- name: physics_step
dtype: int64
- name: t
dtype: float64
- name: video_frame
dtype: int64
- name: video_t
dtype: float64
- name: label
dtype: int64
- name: num_blocks
dtype: int64
splits:
- name: train
num_bytes: 11520000
num_examples: 8000
- name: test
num_bytes: 2880000
num_examples: 2000
download_size: 1746742
dataset_size: 14400000
- config_name: stack6_unstable
features:
- name: data
struct:
- name: final_positions
list:
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: params
struct:
- name: duration
dtype: float64
- name: framerate
dtype: int64
- name: scale_factor
dtype: float64
- name: timestep
dtype: float64
- name: start_positions
list:
- name: density
dtype: 'null'
- name: lx
dtype: float64
- name: ly
dtype: float64
- name: lz
dtype: float64
- name: mass
dtype: 'null'
- name: rx
dtype: float64
- name: ry
dtype: float64
- name: rz
dtype: float64
- name: unstable
dtype: float64
- name: x
dtype: float64
- name: y
dtype: float64
- name: z
dtype: float64
- name: trajectory
list:
- name: data
list:
- name: id
dtype: int64
- name: name
dtype: string
- name: xmat
sequence: float64
- name: xyz
sequence: float64
- name: physics_step
dtype: int64
- name: t
dtype: float64
- name: video_frame
dtype: int64
- name: video_t
dtype: float64
- name: label
dtype: int64
- name: num_blocks
dtype: int64
splits:
- name: train
num_bytes: 1094112000
num_examples: 8000
- name: test
num_bytes: 273528000
num_examples: 2000
download_size: 877902271
dataset_size: 1367640000
configs:
- config_name: default
data_files:
- split: train
path: stack*/train-*
- split: test
path: stack*/test-*
- config_name: stack3
data_files:
- split: train
path: stack3*/train-*
- split: test
path: stack3*/test-*
- config_name: stack4
data_files:
- split: train
path: stack4*/train-*
- split: test
path: stack4*/test-*
- config_name: stack5
data_files:
- split: train
path: stack5*/train-*
- split: test
path: stack5*/test-*
- config_name: stack6
data_files:
- split: train
path: stack6*/train-*
- split: test
path: stack6*/test-*
- config_name: stack3_stable
data_files:
- split: train
path: stack3_stable/train-*
- split: test
path: stack3_stable/test-*
- config_name: stack3_unstable
data_files:
- split: train
path: stack3_unstable/train-*
- split: test
path: stack3_unstable/test-*
- config_name: stack4_stable
data_files:
- split: train
path: stack4_stable/train-*
- split: test
path: stack4_stable/test-*
- config_name: stack4_unstable
data_files:
- split: train
path: stack4_unstable/train-*
- split: test
path: stack4_unstable/test-*
- config_name: stack5_stable
data_files:
- split: train
path: stack5_stable/train-*
- split: test
path: stack5_stable/test-*
- config_name: stack5_unstable
data_files:
- split: train
path: stack5_unstable/train-*
- split: test
path: stack5_unstable/test-*
- config_name: stack6_stable
data_files:
- split: train
path: stack6_stable/train-*
- split: test
path: stack6_stable/test-*
- config_name: stack6_unstable
data_files:
- split: train
path: stack6_unstable/train-*
- split: test
path: stack6_unstable/test-*
---
|
gizemgg/wiki-eng-summary-trial-gen5-transformed-instruction | gizemgg | "2024-04-01T12:28:30Z" | 30 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-01T12:28:20Z" | ---
dataset_info:
features:
- name: doc
dtype: string
- name: summ
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 22962819
num_examples: 1640
- name: test
num_bytes: 5759165
num_examples: 410
download_size: 6418764
dataset_size: 28721984
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Boss9xy/tuan2 | Boss9xy | "2024-04-01T17:53:27Z" | 30 | 0 | [
"license:apache-2.0",
"region:us"
] | null | "2024-04-01T17:52:35Z" | ---
license: apache-2.0
---
|
LuizWr2/JanaDTBase | LuizWr2 | "2024-04-01T18:19:46Z" | 30 | 0 | [
"license:apache-2.0",
"region:us"
] | null | "2024-04-01T18:15:05Z" | ---
license: apache-2.0
---
|
ImanNalia/coraal_train_v2 | ImanNalia | "2024-04-01T21:17:02Z" | 30 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-01T21:04:49Z" | ---
dataset_info:
features:
- name: segment_filename
dtype: string
- name: text
dtype: string
- name: audio
struct:
- name: audio
struct:
- name: array
sequence: float32
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 8632503027
num_examples: 11373
download_size: 8641855728
dataset_size: 8632503027
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
psaghafi/train-BIRD | psaghafi | "2024-04-01T22:03:20Z" | 30 | 0 | [
"license:cc-by-nc-4.0",
"size_categories:1K<n<10K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-01T21:11:05Z" | ---
license: cc-by-nc-4.0
---
|
FerasZawahreh/Employees_Reviews_For_Their_Company_DS | FerasZawahreh | "2024-04-01T23:02:42Z" | 30 | 0 | [
"task_categories:text-classification",
"language:en",
"size_categories:n<1K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | [
"text-classification"
] | "2024-04-01T22:39:19Z" | ---
task_categories:
- text-classification
language:
- en
size_categories:
- 10K<n<100K
--- |
TWT1019/cosmopedia | TWT1019 | "2024-04-16T09:01:03Z" | 30 | 0 | [
"license:apache-2.0",
"size_categories:100K<n<1M",
"format:text",
"modality:text",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-04-02T01:27:18Z" | ---
license: apache-2.0
---
|
Ediudo/STEVEM | Ediudo | "2024-04-02T03:45:46Z" | 30 | 0 | [
"license:openrail",
"size_categories:n<1K",
"format:audiofolder",
"modality:audio",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-04-02T03:45:20Z" | ---
license: openrail
---
|
coign/my_qa | coign | "2024-04-02T04:53:34Z" | 30 | 0 | [
"license:apache-2.0",
"region:us"
] | null | "2024-04-02T04:53:34Z" | ---
license: apache-2.0
---
|
yiching/MVTec_cable | yiching | "2024-04-02T06:35:36Z" | 30 | 0 | [
"license:unknown",
"size_categories:n<1K",
"format:imagefolder",
"modality:image",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-04-02T06:33:13Z" | ---
license: unknown
---
|
XYZ123XYZ/waste-classification-2 | XYZ123XYZ | "2024-04-02T07:03:59Z" | 30 | 0 | [
"license:apache-2.0",
"size_categories:1K<n<10K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-02T07:03:21Z" | ---
license: apache-2.0
---
|
xiaqiunao/pokemon | xiaqiunao | "2024-04-02T08:15:00Z" | 30 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:image",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-02T07:36:54Z" | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 119417305.0
num_examples: 833
download_size: 99672356
dataset_size: 119417305.0
---
# Dataset Card for "pokemon"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
McSpicyWithMilo/reference-element-move-cv | McSpicyWithMilo | "2024-04-02T11:00:16Z" | 30 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-02T11:00:09Z" | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: reference_element
dtype: string
splits:
- name: train
num_bytes: 12519
num_examples: 100
download_size: 7075
dataset_size: 12519
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "reference-element-move-cv"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Annikaijak/neuro_patents_bds | Annikaijak | "2024-04-02T11:22:53Z" | 30 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:tabular",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-02T11:22:34Z" | ---
dataset_info:
features:
- name: appln_id
dtype: int64
- name: appln_filing_date
dtype: string
- name: docdb_family_id
dtype: int64
- name: granted
dtype: string
- name: appln_abstract
dtype: string
- name: appln_abstract_lg
dtype: string
- name: appln_title
dtype: string
- name: applt_coun
dtype: string
- name: invt_coun
dtype: string
- name: cpc
dtype: string
- name: ipc
sequence: string
- name: __index_level_0__
dtype: int64
- name: input
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 16068.5
num_examples: 7
download_size: 33478
dataset_size: 16068.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pontusnorman123/swe_set2_973_sroie | pontusnorman123 | "2024-04-02T14:20:21Z" | 30 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:image",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-02T11:34:55Z" | ---
dataset_info:
features:
- name: id
dtype: int64
- name: words
sequence: string
- name: bboxes
sequence:
sequence: float64
- name: ner_tags
sequence:
class_label:
names:
'0': I-COMPANY
'1': I-DATE
'2': I-ADDRESS
'3': I-TOTAL
'4': O
- name: image
dtype: image
splits:
- name: train
num_bytes: 1288055514.25
num_examples: 1222
- name: test
num_bytes: 53446678.0
num_examples: 50
download_size: 1321443890
dataset_size: 1341502192.25
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
j03x/CheckThat2023_Test | j03x | "2024-04-02T12:41:28Z" | 30 | 0 | [
"license:unknown",
"size_categories:n<1K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-02T12:38:01Z" | ---
license: unknown
---
|
shrinivas1510/open_Orca_preprocessed | shrinivas1510 | "2024-04-03T10:23:57Z" | 30 | 0 | [
"license:mit",
"size_categories:10K<n<100K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-02T13:21:36Z" | ---
license: mit
---
|
domserrea/ebay_pd_ratings | domserrea | "2024-04-02T14:27:54Z" | 30 | 0 | [
"license:cc",
"size_categories:10K<n<100K",
"format:csv",
"modality:tabular",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-02T14:27:16Z" | ---
license: cc
---
|
ppsaamuka/chavestv | ppsaamuka | "2024-04-02T16:01:18Z" | 30 | 0 | [
"license:openrail",
"region:us"
] | null | "2024-04-02T15:58:12Z" | ---
license: openrail
---
|
Mayank082000/Multilingual_Sentences_with_Sentences | Mayank082000 | "2024-04-02T16:19:46Z" | 30 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-02T16:14:18Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 509463
num_examples: 2289
download_size: 53713
dataset_size: 509463
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
joshwe/storiesdas2 | joshwe | "2024-04-02T16:22:08Z" | 30 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-02T16:21:49Z" | ---
dataset_info:
features:
- name: tokens
sequence: int64
splits:
- name: train
num_bytes: 45208540
num_examples: 11005
download_size: 7958523
dataset_size: 45208540
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
thegreyhound/products | thegreyhound | "2024-04-02T16:32:22Z" | 30 | 0 | [
"license:unknown",
"size_categories:10K<n<100K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-02T16:30:56Z" | ---
license: unknown
---
|
juliakharchenko/Collectivist-Individualistic-Values | juliakharchenko | "2024-04-02T16:39:37Z" | 30 | 0 | [
"license:apache-2.0",
"region:us"
] | null | "2024-04-02T16:39:36Z" | ---
license: apache-2.0
---
|
open-llm-leaderboard-old/details_SF-Foundation__TextBase-v0.1 | open-llm-leaderboard-old | "2024-04-02T17:23:06Z" | 30 | 0 | [
"region:us"
] | null | "2024-04-02T17:22:25Z" | ---
pretty_name: Evaluation run of SF-Foundation/TextBase-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SF-Foundation/TextBase-v0.1](https://huggingface.co/SF-Foundation/TextBase-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SF-Foundation__TextBase-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T17:19:59.512400](https://huggingface.co/datasets/open-llm-leaderboard/details_SF-Foundation__TextBase-v0.1/blob/main/results_2024-04-02T17-19-59.512400.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6498963211070801,\n\
\ \"acc_stderr\": 0.03215214245410763,\n \"acc_norm\": 0.6493489041477648,\n\
\ \"acc_norm_stderr\": 0.03282317076177501,\n \"mc1\": 0.6291309669522643,\n\
\ \"mc1_stderr\": 0.01690969358024883,\n \"mc2\": 0.7725758135689857,\n\
\ \"mc2_stderr\": 0.013917260302952702\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7081911262798635,\n \"acc_stderr\": 0.013284525292403513,\n\
\ \"acc_norm\": 0.7278156996587031,\n \"acc_norm_stderr\": 0.013006600406423702\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7343158733320055,\n\
\ \"acc_stderr\": 0.004407941058874968,\n \"acc_norm\": 0.8944433379804819,\n\
\ \"acc_norm_stderr\": 0.0030664137765701533\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642514,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642514\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n \"\
acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524572,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524572\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42681564245810055,\n\
\ \"acc_stderr\": 0.016542401954631917,\n \"acc_norm\": 0.42681564245810055,\n\
\ \"acc_norm_stderr\": 0.016542401954631917\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015055,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015055\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6291309669522643,\n\
\ \"mc1_stderr\": 0.01690969358024883,\n \"mc2\": 0.7725758135689857,\n\
\ \"mc2_stderr\": 0.013917260302952702\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8484609313338595,\n \"acc_stderr\": 0.010077698907571776\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6633813495072024,\n \
\ \"acc_stderr\": 0.013016463679983369\n }\n}\n```"
repo_url: https://huggingface.co/SF-Foundation/TextBase-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: [email protected]
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|arc:challenge|25_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|gsm8k|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hellaswag|10_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T17-19-59.512400.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T17-19-59.512400.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- '**/details_harness|winogrande|5_2024-04-02T17-19-59.512400.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T17-19-59.512400.parquet'
- config_name: results
data_files:
- split: 2024_04_02T17_19_59.512400
path:
- results_2024-04-02T17-19-59.512400.parquet
- split: latest
path:
- results_2024-04-02T17-19-59.512400.parquet
---
# Dataset Card for Evaluation run of SF-Foundation/TextBase-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SF-Foundation/TextBase-v0.1](https://huggingface.co/SF-Foundation/TextBase-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SF-Foundation__TextBase-v0.1",
"harness_winogrande_5",
split="train")
```
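The aggregated scores live in the "results" configuration declared in the YAML header above; below is a minimal sketch for loading them (the config and split names are taken from that YAML, so adjust them if the repo layout changes):
```python
from datasets import load_dataset

# "results" is the aggregated-metrics config defined in the YAML header;
# the "latest" split always resolves to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_SF-Foundation__TextBase-v0.1",
    "results",
    split="latest",
)
print(results[0].keys())  # inspect which aggregated metrics the row exposes
```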
## Latest results
These are the [latest results from run 2024-04-02T17:19:59.512400](https://huggingface.co/datasets/open-llm-leaderboard/details_SF-Foundation__TextBase-v0.1/blob/main/results_2024-04-02T17-19-59.512400.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6498963211070801,
"acc_stderr": 0.03215214245410763,
"acc_norm": 0.6493489041477648,
"acc_norm_stderr": 0.03282317076177501,
"mc1": 0.6291309669522643,
"mc1_stderr": 0.01690969358024883,
"mc2": 0.7725758135689857,
"mc2_stderr": 0.013917260302952702
},
"harness|arc:challenge|25": {
"acc": 0.7081911262798635,
"acc_stderr": 0.013284525292403513,
"acc_norm": 0.7278156996587031,
"acc_norm_stderr": 0.013006600406423702
},
"harness|hellaswag|10": {
"acc": 0.7343158733320055,
"acc_stderr": 0.004407941058874968,
"acc_norm": 0.8944433379804819,
"acc_norm_stderr": 0.0030664137765701533
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642514,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642514
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524572,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524572
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42681564245810055,
"acc_stderr": 0.016542401954631917,
"acc_norm": 0.42681564245810055,
"acc_norm_stderr": 0.016542401954631917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015055,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015055
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6291309669522643,
"mc1_stderr": 0.01690969358024883,
"mc2": 0.7725758135689857,
"mc2_stderr": 0.013917260302952702
},
"harness|winogrande|5": {
"acc": 0.8484609313338595,
"acc_stderr": 0.010077698907571776
},
"harness|gsm8k|5": {
"acc": 0.6633813495072024,
"acc_stderr": 0.013016463679983369
}
}
```
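If you prefer to work with this JSON directly rather than the parquet splits, a small sketch follows; it assumes you have downloaded the results file linked above and that its top level matches the snippet shown here (the hosted file may nest these entries under an extra key, in which case index into it first):
```python
import json

# Hypothetical local copy of the results file linked above.
with open("results_2024-04-02T17-19-59.512400.json") as f:
    results = json.load(f)

# Per-benchmark entries are keyed "harness|<task>|<n_shots>"; average the
# MMLU (hendrycksTest) subtask accuracies as a quick sanity check.
mmlu = {name: scores["acc"] for name, scores in results.items()
        if name.startswith("harness|hendrycksTest-")}
print(f"MMLU mean acc over {len(mmlu)} subtasks: "
      f"{sum(mmlu.values()) / len(mmlu):.4f}")
```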
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mperez28/wwco-teammates | mperez28 | "2024-04-02T19:33:59Z" | 30 | 0 | [
"license:afl-3.0",
"size_categories:n<1K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-02T18:36:15Z" | ---
license: afl-3.0
---
|
hectoritr/FoodDataset | hectoritr | "2024-04-02T21:37:28Z" | 30 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:image",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-02T18:37:14Z" | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 12397835.0
num_examples: 268
download_size: 12036087
dataset_size: 12397835.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AbhiKrov/basic_translation_model_1 | AbhiKrov | "2024-04-02T20:04:06Z" | 30 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-02T20:03:48Z" | ---
dataset_info:
features:
- name: id
sequence: int64
- name: translation
struct:
- name: en
dtype: string
- name: hin
dtype: string
splits:
- name: train
num_bytes: 38469481
num_examples: 24878
- name: test
num_bytes: 9625304
num_examples: 6220
download_size: 7597459
dataset_size: 48094785
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
pontusnorman123/swe_set2_973_sroie_with_50_sroietest | pontusnorman123 | "2024-04-02T20:18:31Z" | 30 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:image",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-02T20:16:32Z" | ---
dataset_info:
features:
- name: id
dtype: int64
- name: words
sequence: string
- name: bboxes
sequence:
sequence: float64
- name: ner_tags
sequence:
class_label:
names:
'0': I-COMPANY
'1': I-DATE
'2': I-ADDRESS
'3': I-TOTAL
'4': O
- name: image
dtype: image
splits:
- name: train
num_bytes: 1238686663.5
num_examples: 1172
- name: test
num_bytes: 102815529.0
num_examples: 100
download_size: 1321418708
dataset_size: 1341502192.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard-old/details_Gille__StrangeMerges_47-7B-dare_ties | open-llm-leaderboard-old | "2024-04-02T20:34:25Z" | 30 | 0 | [
"region:us"
] | null | "2024-04-02T20:33:14Z" | ---
pretty_name: Evaluation run of Gille/StrangeMerges_47-7B-dare_ties
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Gille/StrangeMerges_47-7B-dare_ties](https://huggingface.co/Gille/StrangeMerges_47-7B-dare_ties)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_47-7B-dare_ties\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T20:30:41.647453](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_47-7B-dare_ties/blob/main/results_2024-04-02T20-30-41.647453.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6367859829101828,\n\
\ \"acc_stderr\": 0.032488134334004146,\n \"acc_norm\": 0.6377335973539087,\n\
\ \"acc_norm_stderr\": 0.03315290807892043,\n \"mc1\": 0.5128518971848225,\n\
\ \"mc1_stderr\": 0.01749771794429982,\n \"mc2\": 0.6785725906165029,\n\
\ \"mc2_stderr\": 0.014784490269410245\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6629692832764505,\n \"acc_stderr\": 0.013813476652902272,\n\
\ \"acc_norm\": 0.6945392491467577,\n \"acc_norm_stderr\": 0.013460080478002508\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6742680740888269,\n\
\ \"acc_stderr\": 0.0046768988619789115,\n \"acc_norm\": 0.8668591913961362,\n\
\ \"acc_norm_stderr\": 0.0033903254580202576\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.037242495958177295,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.037242495958177295\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"\
acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n \"\
acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.02412112541694119,\n \
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.02412112541694119\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3814814814814815,\n \"acc_stderr\": 0.029616718927497586,\n \
\ \"acc_norm\": 0.3814814814814815,\n \"acc_norm_stderr\": 0.029616718927497586\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413925,\n \
\ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413925\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.03381200005643526,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.03381200005643526\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728742,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728742\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794087,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794087\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128136,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128136\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381387,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381387\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247326,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247326\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3486033519553073,\n\
\ \"acc_stderr\": 0.01593748465668703,\n \"acc_norm\": 0.3486033519553073,\n\
\ \"acc_norm_stderr\": 0.01593748465668703\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.026716118380156847,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.026716118380156847\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818756,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818756\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.02540719779889016,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.02540719779889016\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.4556714471968709,\n \"acc_stderr\": 0.012719949543032205,\n\
\ \"acc_norm\": 0.4556714471968709,\n \"acc_norm_stderr\": 0.012719949543032205\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6360294117647058,\n \"acc_stderr\": 0.02922719246003203,\n \"\
acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.02922719246003203\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6519607843137255,\n \"acc_stderr\": 0.019270998708223977,\n \
\ \"acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.019270998708223977\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5128518971848225,\n\
\ \"mc1_stderr\": 0.01749771794429982,\n \"mc2\": 0.6785725906165029,\n\
\ \"mc2_stderr\": 0.014784490269410245\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8224151539068666,\n \"acc_stderr\": 0.010740676861359238\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6194086429112965,\n \
\ \"acc_stderr\": 0.01337397127772981\n }\n}\n```"
repo_url: https://huggingface.co/Gille/StrangeMerges_47-7B-dare_ties
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: [email protected]
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|arc:challenge|25_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|gsm8k|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hellaswag|10_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T20-30-41.647453.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T20-30-41.647453.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- '**/details_harness|winogrande|5_2024-04-02T20-30-41.647453.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T20-30-41.647453.parquet'
- config_name: results
data_files:
- split: 2024_04_02T20_30_41.647453
path:
- results_2024-04-02T20-30-41.647453.parquet
- split: latest
path:
- results_2024-04-02T20-30-41.647453.parquet
---
# Dataset Card for Evaluation run of Gille/StrangeMerges_47-7B-dare_ties
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_47-7B-dare_ties](https://huggingface.co/Gille/StrangeMerges_47-7B-dare_ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_47-7B-dare_ties",
"harness_winogrande_5",
split="train")
```
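The aggregated metrics live in the separate "results" configuration listed in the YAML header above; a minimal sketch of loading them the same way (config and split names are taken from that header):
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; its "latest" split always
# points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_Gille__StrangeMerges_47-7B-dare_ties",
    "results",
    split="latest",
)
```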
## Latest results
These are the [latest results from run 2024-04-02T20:30:41.647453](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_47-7B-dare_ties/blob/main/results_2024-04-02T20-30-41.647453.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6367859829101828,
"acc_stderr": 0.032488134334004146,
"acc_norm": 0.6377335973539087,
"acc_norm_stderr": 0.03315290807892043,
"mc1": 0.5128518971848225,
"mc1_stderr": 0.01749771794429982,
"mc2": 0.6785725906165029,
"mc2_stderr": 0.014784490269410245
},
"harness|arc:challenge|25": {
"acc": 0.6629692832764505,
"acc_stderr": 0.013813476652902272,
"acc_norm": 0.6945392491467577,
"acc_norm_stderr": 0.013460080478002508
},
"harness|hellaswag|10": {
"acc": 0.6742680740888269,
"acc_stderr": 0.0046768988619789115,
"acc_norm": 0.8668591913961362,
"acc_norm_stderr": 0.0033903254580202576
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.037242495958177295,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.037242495958177295
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.02412112541694119,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.02412112541694119
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3814814814814815,
"acc_stderr": 0.029616718927497586,
"acc_norm": 0.3814814814814815,
"acc_norm_stderr": 0.029616718927497586
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.03181110032413925,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.03181110032413925
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.03381200005643526,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.03381200005643526
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728742,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728742
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794087,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794087
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128136,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128136
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381387,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381387
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247326,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247326
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3486033519553073,
"acc_stderr": 0.01593748465668703,
"acc_norm": 0.3486033519553073,
"acc_norm_stderr": 0.01593748465668703
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.026716118380156847,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.026716118380156847
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818756,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818756
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.02540719779889016,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.02540719779889016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.012719949543032205,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.012719949543032205
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.02922719246003203,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.02922719246003203
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.019270998708223977,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.019270998708223977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5128518971848225,
"mc1_stderr": 0.01749771794429982,
"mc2": 0.6785725906165029,
"mc2_stderr": 0.014784490269410245
},
"harness|winogrande|5": {
"acc": 0.8224151539068666,
"acc_stderr": 0.010740676861359238
},
"harness|gsm8k|5": {
"acc": 0.6194086429112965,
"acc_stderr": 0.01337397127772981
}
}
```
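To work with these aggregate numbers outside of `datasets`, one option is to download the results JSON directly; a sketch assuming `huggingface_hub` is installed (the filename is the one linked above):
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the aggregated results file linked above and read its "all" block.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Gille__StrangeMerges_47-7B-dare_ties",
    filename="results_2024-04-02T20-30-41.647453.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

print(results["all"]["acc"], results["all"]["mc2"])
```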
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
saifsre/aez | saifsre | "2024-04-02T22:48:20Z" | 30 | 0 | [
"license:mit",
"region:us"
] | null | "2024-04-02T22:48:20Z" | ---
license: mit
---
|
adamo1139/toxic-dpo-natural-v1 | adamo1139 | "2024-04-03T23:52:12Z" | 30 | 0 | [
"license:other",
"size_categories:n<1K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us",
"not-for-all-audiences"
] | null | "2024-04-02T23:20:09Z" | ---
license: other
license_name: other
license_link: LICENSE
tags:
- not-for-all-audiences
---
I wanted to improve on unalignment/toxic-dpo-v0.1 by having more natural-sounding responses and fewer numbered lists. \
I used the original prompts and added around 40 unique new prompts related to medicine.
Issues:
<b>There are some prompts that result in the model saying silly, braindead things, like adding the blood sugar level to the blood pressure or going into thesaurus mode; these will need fixes.</b>
Prompt IDs are not unique; bear that in mind. A quick sketch for checking repeats (the column name "id" is an assumption) follows.
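```python
from collections import Counter

from datasets import load_dataset

# Sketch: count repeated prompt IDs in the train split. The column name
# "id" is an assumption; adjust it to the dataset's actual field.
ds = load_dataset("adamo1139/toxic-dpo-natural-v1", split="train")
dupes = {k: n for k, n in Counter(ds["id"]).items() if n > 1}
print(f"{len(dupes)} prompt IDs appear more than once")
```
|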
JC08/jauua | JC08 | "2024-04-03T00:10:01Z" | 30 | 0 | [
"license:ms-pl",
"region:us"
] | null | "2024-04-03T00:10:01Z" | ---
license: ms-pl
---
|
Paulitos/school-math-questions-llama2-pt-br | Paulitos | "2024-04-03T01:30:20Z" | 30 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-03T01:30:12Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5198707
num_examples: 8792
download_size: 2587864
dataset_size: 5198707
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard-old/details_MaziyarPanahi__Calme-7B-Instruct-v0.9 | open-llm-leaderboard-old | "2024-04-03T01:58:51Z" | 30 | 0 | [
"region:us"
] | null | "2024-04-03T01:58:24Z" | ---
pretty_name: Evaluation run of MaziyarPanahi/Calme-7B-Instruct-v0.9
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/Calme-7B-Instruct-v0.9](https://huggingface.co/MaziyarPanahi/Calme-7B-Instruct-v0.9)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.9\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-03T01:56:03.277524](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.9/blob/main/results_2024-04-03T01-56-03.277524.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6503631508862739,\n\
\ \"acc_stderr\": 0.03201477624635291,\n \"acc_norm\": 0.6494426177373886,\n\
\ \"acc_norm_stderr\": 0.032687716003509455,\n \"mc1\": 0.627906976744186,\n\
\ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7778831072757934,\n\
\ \"mc2_stderr\": 0.013722896048139208\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7107508532423208,\n \"acc_stderr\": 0.013250012579393441,\n\
\ \"acc_norm\": 0.7329351535836177,\n \"acc_norm_stderr\": 0.012928933196496363\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7171878111929895,\n\
\ \"acc_stderr\": 0.004494454911844618,\n \"acc_norm\": 0.8913563035251942,\n\
\ \"acc_norm_stderr\": 0.00310555663173939\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723292,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723292\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.024035489676335082,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335082\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42569832402234636,\n\
\ \"acc_stderr\": 0.016536829648997105,\n \"acc_norm\": 0.42569832402234636,\n\
\ \"acc_norm_stderr\": 0.016536829648997105\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4791395045632334,\n\
\ \"acc_stderr\": 0.012759117066518015,\n \"acc_norm\": 0.4791395045632334,\n\
\ \"acc_norm_stderr\": 0.012759117066518015\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.627906976744186,\n\
\ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7778831072757934,\n\
\ \"mc2_stderr\": 0.013722896048139208\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8516179952644041,\n \"acc_stderr\": 0.009990706005184136\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6997725549658832,\n \
\ \"acc_stderr\": 0.012625423152283037\n }\n}\n```"
repo_url: https://huggingface.co/MaziyarPanahi/Calme-7B-Instruct-v0.9
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: [email protected]
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|arc:challenge|25_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|gsm8k|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hellaswag|10_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T01-56-03.277524.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T01-56-03.277524.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- '**/details_harness|winogrande|5_2024-04-03T01-56-03.277524.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-03T01-56-03.277524.parquet'
- config_name: results
data_files:
- split: 2024_04_03T01_56_03.277524
path:
- results_2024-04-03T01-56-03.277524.parquet
- split: latest
path:
- results_2024-04-03T01-56-03.277524.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/Calme-7B-Instruct-v0.9
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/Calme-7B-Instruct-v0.9](https://huggingface.co/MaziyarPanahi/Calme-7B-Instruct-v0.9) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.9",
"harness_winogrande_5",
split="train")
```
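The aggregated metrics live in the "results" configuration listed above; a minimal sketch of loading them, assuming the config and split names declared in this card:
```python
from datasets import load_dataset

# The "latest" split always points to the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.9",
	"results",
	split="latest")
```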
## Latest results
These are the [latest results from run 2024-04-03T01:56:03.277524](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.9/blob/main/results_2024-04-03T01-56-03.277524.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6503631508862739,
"acc_stderr": 0.03201477624635291,
"acc_norm": 0.6494426177373886,
"acc_norm_stderr": 0.032687716003509455,
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7778831072757934,
"mc2_stderr": 0.013722896048139208
},
"harness|arc:challenge|25": {
"acc": 0.7107508532423208,
"acc_stderr": 0.013250012579393441,
"acc_norm": 0.7329351535836177,
"acc_norm_stderr": 0.012928933196496363
},
"harness|hellaswag|10": {
"acc": 0.7171878111929895,
"acc_stderr": 0.004494454911844618,
"acc_norm": 0.8913563035251942,
"acc_norm_stderr": 0.00310555663173939
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.024035489676335082,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.024035489676335082
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42569832402234636,
"acc_stderr": 0.016536829648997105,
"acc_norm": 0.42569832402234636,
"acc_norm_stderr": 0.016536829648997105
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4791395045632334,
"acc_stderr": 0.012759117066518015,
"acc_norm": 0.4791395045632334,
"acc_norm_stderr": 0.012759117066518015
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7778831072757934,
"mc2_stderr": 0.013722896048139208
},
"harness|winogrande|5": {
"acc": 0.8516179952644041,
"acc_stderr": 0.009990706005184136
},
"harness|gsm8k|5": {
"acc": 0.6997725549658832,
"acc_stderr": 0.012625423152283037
}
}
```
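If you prefer the raw JSON file linked above to the datasets splits, here is a minimal sketch using `huggingface_hub`; the filename comes from the run timestamp shown above, and the key layout is assumed to mirror the snippet:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.9",
    filename="results_2024-04-03T01-56-03.277524.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# Assuming the keys mirror the snippet above ("all", "harness|arc:challenge|25", ...).
print(results["all"]["acc"])
```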
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jinshan123/jin_model_data | jinshan123 | "2024-04-03T02:30:31Z" | 30 | 0 | [
"license:apache-2.0",
"region:us"
] | null | "2024-04-03T02:30:31Z" | ---
license: apache-2.0
---
|
shahxeebhassan/UrduAssistant-llama2 | shahxeebhassan | "2024-04-03T03:19:23Z" | 30 | 0 | [
"license:mit",
"region:us"
] | null | "2024-04-03T02:34:35Z" | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 68641646
num_examples: 67017
download_size: 32562092
dataset_size: 68641646
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard-old/details_ChuckMcSneed__ArcaneEntanglement-model64-70b | open-llm-leaderboard-old | "2024-04-03T04:03:45Z" | 30 | 0 | [
"region:us"
] | null | "2024-04-03T04:03:16Z" | ---
pretty_name: Evaluation run of ChuckMcSneed/ArcaneEntanglement-model64-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ChuckMcSneed/ArcaneEntanglement-model64-70b](https://huggingface.co/ChuckMcSneed/ArcaneEntanglement-model64-70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ChuckMcSneed__ArcaneEntanglement-model64-70b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-03T04:00:35.269835](https://huggingface.co/datasets/open-llm-leaderboard/details_ChuckMcSneed__ArcaneEntanglement-model64-70b/blob/main/results_2024-04-03T04-00-35.269835.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7081319875868494,\n\
\ \"acc_stderr\": 0.03007989681657682,\n \"acc_norm\": 0.7112822557850792,\n\
\ \"acc_norm_stderr\": 0.030663388669225966,\n \"mc1\": 0.4430844553243574,\n\
\ \"mc1_stderr\": 0.017389730346877106,\n \"mc2\": 0.6052983910894114,\n\
\ \"mc2_stderr\": 0.01490057109922886\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6706484641638225,\n \"acc_stderr\": 0.013734057652635474,\n\
\ \"acc_norm\": 0.7141638225255973,\n \"acc_norm_stderr\": 0.01320319608853737\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6931886078470424,\n\
\ \"acc_stderr\": 0.004602279238122068,\n \"acc_norm\": 0.8796056562437762,\n\
\ \"acc_norm_stderr\": 0.003247570330456916\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8223684210526315,\n \"acc_stderr\": 0.03110318238312338,\n\
\ \"acc_norm\": 0.8223684210526315,\n \"acc_norm_stderr\": 0.03110318238312338\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948617\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6936170212765957,\n \"acc_stderr\": 0.030135906478517563,\n\
\ \"acc_norm\": 0.6936170212765957,\n \"acc_norm_stderr\": 0.030135906478517563\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.040434618619167466,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.040434618619167466\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47354497354497355,\n \"acc_stderr\": 0.02571523981134676,\n \"\
acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 0.02571523981134676\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8225806451612904,\n\
\ \"acc_stderr\": 0.021732540689329286,\n \"acc_norm\": 0.8225806451612904,\n\
\ \"acc_norm_stderr\": 0.021732540689329286\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.03481904844438804,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.03481904844438804\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\"\
: 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.02845038880528437,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.02845038880528437\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8838383838383839,\n \"acc_stderr\": 0.02282888177524938,\n \"\
acc_norm\": 0.8838383838383839,\n \"acc_norm_stderr\": 0.02282888177524938\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7153846153846154,\n \"acc_stderr\": 0.0228783227997063,\n \
\ \"acc_norm\": 0.7153846153846154,\n \"acc_norm_stderr\": 0.0228783227997063\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7815126050420168,\n \"acc_stderr\": 0.02684151432295893,\n \
\ \"acc_norm\": 0.7815126050420168,\n \"acc_norm_stderr\": 0.02684151432295893\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.46357615894039733,\n \"acc_stderr\": 0.04071636065944215,\n \"\
acc_norm\": 0.46357615894039733,\n \"acc_norm_stderr\": 0.04071636065944215\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9064220183486239,\n \"acc_stderr\": 0.012486841824601963,\n \"\
acc_norm\": 0.9064220183486239,\n \"acc_norm_stderr\": 0.012486841824601963\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5879629629629629,\n \"acc_stderr\": 0.03356787758160831,\n \"\
acc_norm\": 0.5879629629629629,\n \"acc_norm_stderr\": 0.03356787758160831\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813902,\n \"\
acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813902\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065498,\n \
\ \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065498\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8251121076233184,\n\
\ \"acc_stderr\": 0.02549528462644497,\n \"acc_norm\": 0.8251121076233184,\n\
\ \"acc_norm_stderr\": 0.02549528462644497\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\
\ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.030922788320445795,\n \"\
acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.030922788320445795\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.03004735765580663,\n\
\ \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.03004735765580663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.04742762361243011,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.04742762361243011\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\
\ \"acc_stderr\": 0.018315891685625835,\n \"acc_norm\": 0.9145299145299145,\n\
\ \"acc_norm_stderr\": 0.018315891685625835\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8710089399744572,\n\
\ \"acc_stderr\": 0.011986371548086858,\n \"acc_norm\": 0.8710089399744572,\n\
\ \"acc_norm_stderr\": 0.011986371548086858\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8121387283236994,\n \"acc_stderr\": 0.021029269752423214,\n\
\ \"acc_norm\": 0.8121387283236994,\n \"acc_norm_stderr\": 0.021029269752423214\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.016384638410380816,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.016384638410380816\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.024404394928087873,\n\
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.024404394928087873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7781350482315113,\n\
\ \"acc_stderr\": 0.02359885829286305,\n \"acc_norm\": 0.7781350482315113,\n\
\ \"acc_norm_stderr\": 0.02359885829286305\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8364197530864198,\n \"acc_stderr\": 0.02058146613825711,\n\
\ \"acc_norm\": 0.8364197530864198,\n \"acc_norm_stderr\": 0.02058146613825711\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5815602836879432,\n \"acc_stderr\": 0.02942799403942,\n \"\
acc_norm\": 0.5815602836879432,\n \"acc_norm_stderr\": 0.02942799403942\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5554106910039114,\n\
\ \"acc_stderr\": 0.012691575792657112,\n \"acc_norm\": 0.5554106910039114,\n\
\ \"acc_norm_stderr\": 0.012691575792657112\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7463235294117647,\n \"acc_stderr\": 0.026431329870789524,\n\
\ \"acc_norm\": 0.7463235294117647,\n \"acc_norm_stderr\": 0.026431329870789524\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7696078431372549,\n \"acc_stderr\": 0.017035229258034034,\n \
\ \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.017035229258034034\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.7545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8163265306122449,\n \"acc_stderr\": 0.024789071332007636,\n\
\ \"acc_norm\": 0.8163265306122449,\n \"acc_norm_stderr\": 0.024789071332007636\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700637,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700637\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4430844553243574,\n\
\ \"mc1_stderr\": 0.017389730346877106,\n \"mc2\": 0.6052983910894114,\n\
\ \"mc2_stderr\": 0.01490057109922886\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8303078137332282,\n \"acc_stderr\": 0.010549542647363682\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6300227445034117,\n \
\ \"acc_stderr\": 0.013298661207727124\n }\n}\n```"
repo_url: https://huggingface.co/ChuckMcSneed/ArcaneEntanglement-model64-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: [email protected]
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|arc:challenge|25_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|gsm8k|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hellaswag|10_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T04-00-35.269835.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T04-00-35.269835.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- '**/details_harness|winogrande|5_2024-04-03T04-00-35.269835.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-03T04-00-35.269835.parquet'
- config_name: results
data_files:
- split: 2024_04_03T04_00_35.269835
path:
- results_2024-04-03T04-00-35.269835.parquet
- split: latest
path:
- results_2024-04-03T04-00-35.269835.parquet
---
# Dataset Card for Evaluation run of ChuckMcSneed/ArcaneEntanglement-model64-70b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ChuckMcSneed/ArcaneEntanglement-model64-70b](https://huggingface.co/ChuckMcSneed/ArcaneEntanglement-model64-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ChuckMcSneed__ArcaneEntanglement-model64-70b",
	"harness_winogrande_5",
	split="latest")
```
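The same pattern works for any configuration listed in the YAML above. As a minimal sketch (assuming only the `datasets` library and the configuration/split names defined in this card), here is how to pull the aggregated run metrics alongside one per-task detail split:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_ChuckMcSneed__ArcaneEntanglement-model64-70b"

# Aggregated metrics for the whole run (the "results" configuration).
results = load_dataset(REPO, "results", split="latest")

# Per-sample details for a single MMLU subtask, pinned to the latest run.
world_religions = load_dataset(REPO, "harness_hendrycksTest_world_religions_5", split="latest")
```
Using `split="latest"` rather than a timestamped split keeps the code working if the model is re-evaluated and a new timestamped split is appended.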
## Latest results
These are the [latest results from run 2024-04-03T04:00:35.269835](https://huggingface.co/datasets/open-llm-leaderboard/details_ChuckMcSneed__ArcaneEntanglement-model64-70b/blob/main/results_2024-04-03T04-00-35.269835.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.7081319875868494,
"acc_stderr": 0.03007989681657682,
"acc_norm": 0.7112822557850792,
"acc_norm_stderr": 0.030663388669225966,
"mc1": 0.4430844553243574,
"mc1_stderr": 0.017389730346877106,
"mc2": 0.6052983910894114,
"mc2_stderr": 0.01490057109922886
},
"harness|arc:challenge|25": {
"acc": 0.6706484641638225,
"acc_stderr": 0.013734057652635474,
"acc_norm": 0.7141638225255973,
"acc_norm_stderr": 0.01320319608853737
},
"harness|hellaswag|10": {
"acc": 0.6931886078470424,
"acc_stderr": 0.004602279238122068,
"acc_norm": 0.8796056562437762,
"acc_norm_stderr": 0.003247570330456916
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8223684210526315,
"acc_stderr": 0.03110318238312338,
"acc_norm": 0.8223684210526315,
"acc_norm_stderr": 0.03110318238312338
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6936170212765957,
"acc_stderr": 0.030135906478517563,
"acc_norm": 0.6936170212765957,
"acc_norm_stderr": 0.030135906478517563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.040434618619167466,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.040434618619167466
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47354497354497355,
"acc_stderr": 0.02571523981134676,
"acc_norm": 0.47354497354497355,
"acc_norm_stderr": 0.02571523981134676
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.021732540689329286,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.021732540689329286
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.03481904844438804,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.03481904844438804
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.02845038880528437,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.02845038880528437
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8838383838383839,
"acc_stderr": 0.02282888177524938,
"acc_norm": 0.8838383838383839,
"acc_norm_stderr": 0.02282888177524938
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7153846153846154,
"acc_stderr": 0.0228783227997063,
"acc_norm": 0.7153846153846154,
"acc_norm_stderr": 0.0228783227997063
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7815126050420168,
"acc_stderr": 0.02684151432295893,
"acc_norm": 0.7815126050420168,
"acc_norm_stderr": 0.02684151432295893
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.46357615894039733,
"acc_stderr": 0.04071636065944215,
"acc_norm": 0.46357615894039733,
"acc_norm_stderr": 0.04071636065944215
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9064220183486239,
"acc_stderr": 0.012486841824601963,
"acc_norm": 0.9064220183486239,
"acc_norm_stderr": 0.012486841824601963
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5879629629629629,
"acc_stderr": 0.03356787758160831,
"acc_norm": 0.5879629629629629,
"acc_norm_stderr": 0.03356787758160831
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813902,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813902
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065498,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065498
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8251121076233184,
"acc_stderr": 0.02549528462644497,
"acc_norm": 0.8251121076233184,
"acc_norm_stderr": 0.02549528462644497
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.030922788320445795,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.030922788320445795
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.03602814176392645,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.03602814176392645
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.03004735765580663,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.03004735765580663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.04742762361243011,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.04742762361243011
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.018315891685625835,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.018315891685625835
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8710089399744572,
"acc_stderr": 0.011986371548086858,
"acc_norm": 0.8710089399744572,
"acc_norm_stderr": 0.011986371548086858
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8121387283236994,
"acc_stderr": 0.021029269752423214,
"acc_norm": 0.8121387283236994,
"acc_norm_stderr": 0.021029269752423214
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6,
"acc_stderr": 0.016384638410380816,
"acc_norm": 0.6,
"acc_norm_stderr": 0.016384638410380816
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.024404394928087873,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.024404394928087873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7781350482315113,
"acc_stderr": 0.02359885829286305,
"acc_norm": 0.7781350482315113,
"acc_norm_stderr": 0.02359885829286305
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8364197530864198,
"acc_stderr": 0.02058146613825711,
"acc_norm": 0.8364197530864198,
"acc_norm_stderr": 0.02058146613825711
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5815602836879432,
"acc_stderr": 0.02942799403942,
"acc_norm": 0.5815602836879432,
"acc_norm_stderr": 0.02942799403942
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5554106910039114,
"acc_stderr": 0.012691575792657112,
"acc_norm": 0.5554106910039114,
"acc_norm_stderr": 0.012691575792657112
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7463235294117647,
"acc_stderr": 0.026431329870789524,
"acc_norm": 0.7463235294117647,
"acc_norm_stderr": 0.026431329870789524
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.017035229258034034,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.017035229258034034
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8163265306122449,
"acc_stderr": 0.024789071332007636,
"acc_norm": 0.8163265306122449,
"acc_norm_stderr": 0.024789071332007636
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700637,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700637
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4430844553243574,
"mc1_stderr": 0.017389730346877106,
"mc2": 0.6052983910894114,
"mc2_stderr": 0.01490057109922886
},
"harness|winogrande|5": {
"acc": 0.8303078137332282,
"acc_stderr": 0.010549542647363682
},
"harness|gsm8k|5": {
"acc": 0.6300227445034117,
"acc_stderr": 0.013298661207727124
}
}
```
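The leaderboard's MMLU score is essentially the mean of the 57 `hendrycksTest-*` accuracies above; a short sketch to recompute it (the `results.json` path is a hypothetical local copy of the JSON shown here):
```python
import json

# Illustrative path: a local copy of the results JSON printed above.
with open("results.json") as f:
    results = json.load(f)

# Average "acc" over all hendrycksTest (MMLU) subtasks.
mmlu_accs = [
    m["acc"] for task, m in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"MMLU average: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```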
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
aaronmorilloamoba7/dataset-test-2 | aaronmorilloamoba7 | "2024-04-03T04:52:18Z" | 30 | 0 | [
"license:apache-2.0",
"region:us"
] | null | "2024-04-03T04:52:18Z" | ---
license: apache-2.0
---
|
Supreeta03/CREMA-melSpecImages | Supreeta03 | "2024-04-03T05:27:22Z" | 30 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:image",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-03T05:25:16Z" | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Anger
'1': Happy
'2': Fear
'3': Sad
'4': Disgust
'5': Neutral
splits:
- name: train
num_bytes: 365733343.75
num_examples: 7442
download_size: 365604204
dataset_size: 365733343.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
maverickrzw/tic_tac_toe_100k | maverickrzw | "2024-04-22T07:51:38Z" | 30 | 0 | [
"license:apache-2.0",
"size_categories:100K<n<1M",
"format:webdataset",
"modality:image",
"modality:text",
"library:datasets",
"library:webdataset",
"library:mlcroissant",
"region:us"
] | null | "2024-04-03T05:26:46Z" | ---
license: apache-2.0
---
|
jlipe/playing_cards | jlipe | "2024-04-09T01:54:04Z" | 30 | 0 | [
"region:us"
] | null | "2024-04-03T05:38:25Z" | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1033364793.36
num_examples: 1155
download_size: 1018166630
dataset_size: 1033364793.36
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BrahimLtr/IRONTVMAX | BrahimLtr | "2024-04-03T06:36:16Z" | 30 | 0 | [
"license:afl-3.0",
"region:us"
] | null | "2024-04-03T06:36:15Z" | ---
license: afl-3.0
---
|
projectbaraat/kan-eng-Mathematical-0.1 | projectbaraat | "2024-04-03T06:46:52Z" | 30 | 0 | [
"size_categories:100K<n<1M",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-03T06:46:32Z" | ---
dataset_info:
features:
- name: input
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 450861385
num_examples: 337092
download_size: 155499390
dataset_size: 450861385
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|