Muennighoff
committed on
Scheduled Commit
data/retrieval_battle-928d45d9-19ab-4d05-9aae-4b6574886d0d.jsonl
CHANGED
@@ -79,3 +79,4 @@
79 |
{"tstamp": 1722575021.6602, "task_type": "retrieval", "type": "tievote", "models": ["", ""], "ip": "", "0_conv_id": "26338595df2d4faf807e22811bc1b1f0", "0_model_name": "BAAI/bge-large-en-v1.5", "0_prompt": "dialog systems that use embeddings, for LLM-based chatbots", "0_output": [["dialog systems that use embeddings, for LLM-based chatbots", "Large language model\n\nA large language model (LLM) is a computational model notable for its ability to achieve general-purpose language generation and other natural language processing tasks such as classification. Based on language models, LLMs acquire these abilities by learning statistical relationships from vast amounts of text during a computationally intensive self-supervised and semi-supervised training process. LLMs can be used for text generation, a form of generative AI, by taking an input text and repeatedly predicting the next token or word.\nLLMs are artificial neural networks that utilize the transformer architecture, invented in 2017. The largest and most capable LLMs, , are built with a decoder-only transformer-based architecture, which enables efficient processing and generation of large-scale text data."]], "0_corpus": "wikipedia", "1_conv_id": "314b01d92c5c4dc4b512193270b12595", "1_model_name": "intfloat/e5-mistral-7b-instruct", "1_prompt": "dialog systems that use embeddings, for LLM-based chatbots", "1_output": [["dialog systems that use embeddings, for LLM-based chatbots", "Large language model\n\nA large language model (LLM) is a computational model notable for its ability to achieve general-purpose language generation and other natural language processing tasks such as classification. Based on language models, LLMs acquire these abilities by learning statistical relationships from vast amounts of text during a computationally intensive self-supervised and semi-supervised training process. LLMs can be used for text generation, a form of generative AI, by taking an input text and repeatedly predicting the next token or word.\nLLMs are artificial neural networks that utilize the transformer architecture, invented in 2017. The largest and most capable LLMs, , are built with a decoder-only transformer-based architecture, which enables efficient processing and generation of large-scale text data."]], "1_corpus": "wikipedia"}
80 |
{"tstamp": 1722575077.532, "task_type": "retrieval", "type": "rightvote", "models": ["", ""], "ip": "", "0_conv_id": "9245f99d83c74c4d9a4dbf5b87dc5535", "0_model_name": "intfloat/e5-mistral-7b-instruct", "0_prompt": "dialog systems that use embeddings, for LLM-based chatbots", "0_output": [["dialog systems that use embeddings, for LLM-based chatbots", "Title: Exploring and Characterizing Large Language Models For Embedded System Development and Debugging\n\nAbstract: Large language models (LLMs) have shown remarkable abilities to generate code, however their ability to develop software for embedded systems, which requires cross-domain knowledge of hardware and software has not been studied. In this paper we develop an extensible, open source hardware-in-the-loop framework to systematically evaluate leading LLMs (GPT-3.5, GPT-4, PaLM 2) to assess their capabilities and limitations for embedded system development. We observe through our study that even when these tools fail to produce working code, they consistently generate helpful reasoning about embedded design tasks. We leverage this finding to study how human programmers interact with these tools, and develop an human-AI based software engineering workflow for building embedded systems. Our evaluation platform for verifying LLM generated programs uses sensor actuator pairs for physical evaluation. We compare all three models with N=450 experiments and find surprisingly that GPT-4 especially shows an exceptional level of cross-domain understanding and reasoning, in some cases generating fully correct programs from a single prompt. In N=50 trials, GPT-4 produces functional I2C interfaces 66% of the time. GPT-4 also produces register-level drivers, code for LoRa communication, and context-specific power optimizations for an nRF52 program resulting in over 740x current reduction to 12.2uA. We also characterize the models' limitations to develop a generalizable human-AI workflow for using LLMs in embedded system development. We evaluate our workflow with 15 users including novice and expert programmers. We find that our workflow improves productivity for all users and increases the success rate for building a LoRa environmental sensor from 25% to 100%, including for users with zero hardware or C/C++ experience."]], "0_corpus": "arxiv", "1_conv_id": "61900cd85b4241d8a19119f6f1b16464", "1_model_name": "GritLM/GritLM-7B", "1_prompt": "dialog systems that use embeddings, for LLM-based chatbots", "1_output": [["dialog systems that use embeddings, for LLM-based chatbots", "Title: Domain-specific ChatBots for Science using Embeddings\n\nAbstract: Large language models (LLMs) have emerged as powerful machine-learning systems capable of handling a myriad of tasks. Tuned versions of these systems have been turned into chatbots that can respond to user queries on a vast diversity of topics, providing informative and creative replies. However, their application to physical science research remains limited owing to their incomplete knowledge in these areas, contrasted with the needs of rigor and sourcing in science domains. Here, we demonstrate how existing methods and software tools can be easily combined to yield a domain-specific chatbot. The system ingests scientific documents in existing formats, and uses text embedding lookup to provide the LLM with domain-specific contextual information when composing its reply. We similarly demonstrate that existing image embedding methods can be used for search and retrieval across publication figures. These results confirm that LLMs are already suitable for use by physical scientists in accelerating their research efforts."]], "1_corpus": "arxiv"}
81 |
{"tstamp": 1722579503.8059, "task_type": "retrieval", "type": "tievote", "models": ["", ""], "ip": "", "0_conv_id": "c6a86245ccb44e83a15f4767cef256fc", "0_model_name": "voyage-multilingual-2", "0_prompt": "who helped them recapture mycenae once they were old enough to fight", "0_output": [["who helped them recapture mycenae once they were old enough to fight", "Mycenae\n\nAtreid dynasty The people of Mycenae had received advice from an oracle that they should choose a new king from among the Pelopids. The two contenders were Atreus and his brother, Thyestes. The latter was chosen at first. At this moment nature intervened and the sun appeared to reverse direction by setting in the east. Atreus argued that because the sun had reversed its path, the election of Thyestes should be reversed. The argument was heeded, and Atreus became king. His first move was to pursue Thyestes and all his family – that is, his own kin – but Thyestes managed to escape from Mycenae.\nIn legend, Atreus had two sons, Agamemnon and Menelaus, the Atreids. Aegisthus, the son of Thyestes, killed Atreus and restored Thyestes to the throne. With the help of King Tyndareus of Sparta, the Atreids drove Thyestes again into exile. Tyndareus had two ill-starred daughters, Helen and Clytemnestra, whom Menelaus and Agamemnon married, respectively. Agamemnon inherited Mycenae and Menelaus became king of Sparta.\nHomeric Poems\nSoon, Helen eloped with Paris of Troy. Agamemnon conducted a 10-year war against Troy to get her back for his brother. Because of lack of wind, the warships could not sail to Troy. In order to please the gods so that they might make the winds start to blow, Agamemnon sacrificed his daughter Iphigenia. According to some versions of the legend, the hunting goddess Artemis replaced her at the very last moment with a deer on the altar, and took Iphigenia to Tauris (see Iphigenia in Tauris by Euripides). The deities, having been satisfied by such a sacrifice, made the winds blow and the Greek fleet departed."]], "0_corpus": "wikipedia", "1_conv_id": "732db75064584027960d547439a205ca", "1_model_name": "text-embedding-004", "1_prompt": "who helped them recapture mycenae once they were old enough to fight", "1_output": [["who helped them recapture mycenae once they were old enough to fight", "Mycenae\n\nLegend tells us that the long and arduous Trojan War, although nominally a Greek victory, brought anarchy, piracy, and ruin; already before the Greek fleet set sail for Troy, the conflict had divided the gods as well, and this contributed to curses and acts of vengeance following many of the Greek heroes. After the war Agamemnon returned to Mycenae and was greeted royally with a red carpet rolled out for him. Shortly thereafter, he was slain by Clytemnestra, who hated him bitterly for having ordered the sacrifice of their daughter Iphigenia in order to gain favorable winds to Troy. Clytemnestra was aided in her crime by Aegistheus, her lover, who reigned subsequently, but Orestes, her son by Agamemnon, was smuggled out to Phocis. He returned as an adult with his sister Electra to slay Clytemnestra and Aegistheus. He then fled to Athens to evade justice and a matricide, and became insane for a time. Meanwhile, the throne of Mycenae went to Aletes, son of Aegistheus, but not for long. Recovering, Orestes returned to Mycenae with Electra to kill Aletes and took the throne. This story is told in numerous plays, including the Oresteia, Sophocles' Electra, and Euripides' Electra.\nEnd of the Atreids\nOrestes then built a larger state in the Peloponnese, but he died in Arcadia from a snake bite. His son, Tisamenus, the last of the Atreid dynasty, was killed by the Heracleidae on their return to the Peloponnesus. They claimed the right of the Perseids to inherit the various kingdoms of the Peloponnese and cast lots for the dominion of them, thus leaving the Atreids as the final rulers of Legendary Mycenae."]], "1_corpus": "wikipedia"}
82 |
+ {"tstamp": 1722580525.0932, "task_type": "retrieval", "type": "rightvote", "models": ["", ""], "ip": "", "0_conv_id": "4c79178fcee5443abea551c4946d3806", "0_model_name": "BM25", "0_prompt": "what are the best tools for extracting clinical trial data from pdf research papers?", "0_output": [["what are the best tools for extracting clinical trial data from pdf research papers?", "Stats Stackexchange\n\nQ: Books on designing clinical trials for psychotherapy research I'm designing a clinical trial that will evaluate a behavioral therapy against a control condition, and I'm looking for some good texts on this topic. There are a lot of books on clinical trials in medicine, but I've had a hard time finding one that also covers the topics specific to psychotherapy research. \nFor instance, topics like choosing an appropriate control group and evaluating the therapist's adherence and competence are different in psychotherapy research than in drug trials. Any tips on a comprehensive book on this topic would be welcome. \n\nA: I think that much of the methodology of general books on clinical trial design would apply to psychotherapy research. But in hunting on the internet I did find this one book on methodology for medical research in clinical psychology.\nResearch Design in Clinical Psychology\n"]], "0_corpus": "stackexchange", "1_conv_id": "43d39fdb5a8e45f3aea2d9c14a9ead1d", "1_model_name": "nomic-ai/nomic-embed-text-v1.5", "1_prompt": "what are the best tools for extracting clinical trial data from pdf research papers?", "1_output": [["what are the best tools for extracting clinical trial data from pdf research papers?", "Stackoverflow Stackexchange\n\nQ: Extracting information from PDFs of research papers I need a mechanism for extracting bibliographic metadata from PDF documents, to save people entering it by hand or cut-and-pasting it.\nAt the very least, the title and abstract. The list of authors and their affiliations would be good. Extracting out the references would be amazing.\nIdeally this would be an open source solution. \nThe problem is that not all PDF's encode the text, and many which do fail to preserve the logical order of the text, so just doing pdf2text gives you line 1 of column 1, line 1 of column 2, line 2 of column 1 etc.\nI know there's a lot of libraries. It's identifying the abstract, title authors etc. on the document that I need to solve. This is never going to be possible every time, but 80% would save a lot of human effort.\n\nA: We ran a contest to solve this problem at Dev8D in London, Feb 2010 and we got a nice little GPL tool created as a result. We've not yet integrated it into our systems but it's there in the world.\nhttps://code.google.com/p/pdfssa4met/\n"]], "1_corpus": "stackexchange"}
data/retrieval_individual-928d45d9-19ab-4d05-9aae-4b6574886d0d.jsonl
CHANGED
@@ -297,3 +297,7 @@
297 |
{"tstamp": 1722579451.245, "task_type": "retrieval", "type": "chat", "model": "text-embedding-004", "gen_params": {}, "start": 1722579448.4149, "finish": 1722579451.245, "ip": "", "conv_id": "732db75064584027960d547439a205ca", "model_name": "text-embedding-004", "prompt": "who helped them recapture mycenae once they were old enough to fight", "output": [["who helped them recapture mycenae once they were old enough to fight", "Mycenae\n\nLegend tells us that the long and arduous Trojan War, although nominally a Greek victory, brought anarchy, piracy, and ruin; already before the Greek fleet set sail for Troy, the conflict had divided the gods as well, and this contributed to curses and acts of vengeance following many of the Greek heroes. After the war Agamemnon returned to Mycenae and was greeted royally with a red carpet rolled out for him. Shortly thereafter, he was slain by Clytemnestra, who hated him bitterly for having ordered the sacrifice of their daughter Iphigenia in order to gain favorable winds to Troy. Clytemnestra was aided in her crime by Aegistheus, her lover, who reigned subsequently, but Orestes, her son by Agamemnon, was smuggled out to Phocis. He returned as an adult with his sister Electra to slay Clytemnestra and Aegistheus. He then fled to Athens to evade justice and a matricide, and became insane for a time. Meanwhile, the throne of Mycenae went to Aletes, son of Aegistheus, but not for long. Recovering, Orestes returned to Mycenae with Electra to kill Aletes and took the throne. This story is told in numerous plays, including the Oresteia, Sophocles' Electra, and Euripides' Electra.\nEnd of the Atreids\nOrestes then built a larger state in the Peloponnese, but he died in Arcadia from a snake bite. His son, Tisamenus, the last of the Atreid dynasty, was killed by the Heracleidae on their return to the Peloponnesus. They claimed the right of the Perseids to inherit the various kingdoms of the Peloponnese and cast lots for the dominion of them, thus leaving the Atreids as the final rulers of Legendary Mycenae."]], "corpus": "wikipedia"}
298 |
{"tstamp": 1722579518.0642, "task_type": "retrieval", "type": "chat", "model": "BM25", "gen_params": {}, "start": 1722579517.6491, "finish": 1722579518.0642, "ip": "", "conv_id": "5a794e433983467cad2ffecf1a757175", "model_name": "BM25", "prompt": "Why did the machines in The Matrix keep humans around?", "output": [["Why did the machines in The Matrix keep humans around?", "Scifi Stackexchange\n\nQ: What happens to a person after they die in the Matrix? Machines keep humans in the Matrix as they are a source of energy. Why did they make it so that when a human is hurt in the Matrix, they're bodies hurt themselves in real life (in the tank)? If a human dies in the Matrix, do they die in the real world? Or just get flushed out of their tank? Isn't it kind of a waste?\n\nA: The body cannot live without the mind. The machines can't get around that. But nothing is wasted. They liquefy the dead so that they can be fed intravenously to the living.\n\n\n\nJump to 45 seconds for Morpheus' first hand explanation.\n"]], "corpus": "stackexchange"}
299 |
{"tstamp": 1722579518.0642, "task_type": "retrieval", "type": "chat", "model": "sentence-transformers/all-MiniLM-L6-v2", "gen_params": {}, "start": 1722579517.6491, "finish": 1722579518.0642, "ip": "", "conv_id": "b788b837ab4f4f6585bb6ae346833dd8", "model_name": "sentence-transformers/all-MiniLM-L6-v2", "prompt": "Why did the machines in The Matrix keep humans around?", "output": [["Why did the machines in The Matrix keep humans around?", "Gis Stackexchange\n\nQ: Filling voids in DEM SRTM 1-arc second using QGIS by resampling and inserting another DEM I'd like to use the DEM SRTM 1-arc second (Version 4, 30m) from USGS for Europe. Unfortunately there are some voids. My idea is to fill those voids using the void-filled version 3 (90m) by resampling it to 30m and inserting the missing data in the version 4 DEM.\nAs I'm new to QGIS I don't know the proper steps. I already got a 30m map by reprojecting the dimensions times three but I'm not sure if that's the right thing to do and I have no idea how to fill the voids afterwards. \n"]], "corpus": "stackexchange"}
300 |
+
{"tstamp": 1722580477.1041, "task_type": "retrieval", "type": "chat", "model": "BM25", "gen_params": {}, "start": 1722580476.8653, "finish": 1722580477.1041, "ip": "", "conv_id": "4c79178fcee5443abea551c4946d3806", "model_name": "BM25", "prompt": "what are the best tools for extracting clinical trial data from pdf research papers?", "output": [["what are the best tools for extracting clinical trial data from pdf research papers?", "Stats Stackexchange\n\nQ: Books on designing clinical trials for psychotherapy research I'm designing a clinical trial that will evaluate a behavioral therapy against a control condition, and I'm looking for some good texts on this topic. There are a lot of books on clinical trials in medicine, but I've had a hard time finding one that also covers the topics specific to psychotherapy research. \nFor instance, topics like choosing an appropriate control group and evaluating the therapist's adherence and competence are different in psychotherapy research than in drug trials. Any tips on a comprehensive book on this topic would be welcome. \n\nA: I think that much of the methodology of general books on clinical trial design would apply to psychotherapy research. But in hunting on the internet I did find this one book on methodology for medical research in clinical psychology.\nResearch Design in Clinical Psychology\n"]], "corpus": "stackexchange"}
{"tstamp": 1722580477.1041, "task_type": "retrieval", "type": "chat", "model": "nomic-ai/nomic-embed-text-v1.5", "gen_params": {}, "start": 1722580476.8653, "finish": 1722580477.1041, "ip": "", "conv_id": "43d39fdb5a8e45f3aea2d9c14a9ead1d", "model_name": "nomic-ai/nomic-embed-text-v1.5", "prompt": "what are the best tools for extracting clinical trial data from pdf research papers?", "output": [["what are the best tools for extracting clinical trial data from pdf research papers?", "Stackoverflow Stackexchange\n\nQ: Extracting information from PDFs of research papers I need a mechanism for extracting bibliographic metadata from PDF documents, to save people entering it by hand or cut-and-pasting it.\nAt the very least, the title and abstract. The list of authors and their affiliations would be good. Extracting out the references would be amazing.\nIdeally this would be an open source solution. \nThe problem is that not all PDF's encode the text, and many which do fail to preserve the logical order of the text, so just doing pdf2text gives you line 1 of column 1, line 1 of column 2, line 2 of column 1 etc.\nI know there's a lot of libraries. It's identifying the abstract, title authors etc. on the document that I need to solve. This is never going to be possible every time, but 80% would save a lot of human effort.\n\nA: We ran a contest to solve this problem at Dev8D in London, Feb 2010 and we got a nice little GPL tool created as a result. We've not yet integrated it into our systems but it's there in the world.\nhttps://code.google.com/p/pdfssa4met/\n"]], "corpus": "stackexchange"}
{"tstamp": 1722580558.6454, "task_type": "retrieval", "type": "chat", "model": "nomic-ai/nomic-embed-text-v1.5", "gen_params": {}, "start": 1722580556.4212, "finish": 1722580558.6454, "ip": "", "conv_id": "87ef1a3903964a95b8c075110c4c7ce8", "model_name": "nomic-ai/nomic-embed-text-v1.5", "prompt": "what are the best tools for extracting clinical trial data from pdf research papers?", "output": [["what are the best tools for extracting clinical trial data from pdf research papers?", "Title: Trialstreamer: Mapping and Browsing Medical Evidence in Real-Time\n\nAbstract: We introduce Trialstreamer, a living database of clinical trial reports. Here we mainly describe the evidence extraction component; this extracts from biomedical abstracts key pieces of information that clinicians need when appraising the literature, and also the relations between these. Specifically, the system extracts descriptions of trial participants, the treatments compared in each arm (the interventions), and which outcomes were measured. The system then attempts to infer which interventions were reported to work best by determining their relationship with identified trial outcome measures. In addition to summarizing individual trials, these extracted data elements allow automatic synthesis of results across many trials on the same topic. We apply the system at scale to all reports of randomized controlled trials indexed in MEDLINE, powering the automatic generation of evidence maps, which provide a global view of the efficacy of different interventions combining data from all relevant clinical trials on a topic. We make all code and models freely available alongside a demonstration of the web interface."]], "corpus": "arxiv"}
{"tstamp": 1722580558.6454, "task_type": "retrieval", "type": "chat", "model": "BAAI/bge-large-en-v1.5", "gen_params": {}, "start": 1722580556.4212, "finish": 1722580558.6454, "ip": "", "conv_id": "876460b3fbb54299a26baca5870d9ddd", "model_name": "BAAI/bge-large-en-v1.5", "prompt": "what are the best tools for extracting clinical trial data from pdf research papers?", "output": [["what are the best tools for extracting clinical trial data from pdf research papers?", "Title: A Benchmark of PDF Information Extraction Tools using a Multi-Task and Multi-Domain Evaluation Framework for Academic Documents\n\nAbstract: Extracting information from academic PDF documents is crucial for numerous indexing, retrieval, and analysis use cases. Choosing the best tool to extract specific content elements is difficult because many, technically diverse tools are available, but recent performance benchmarks are rare. Moreover, such benchmarks typically cover only a few content elements like header metadata or bibliographic references and use smaller datasets from specific academic disciplines. We provide a large and diverse evaluation framework that supports more extraction tasks than most related datasets. Our framework builds upon DocBank, a multi-domain dataset of 1.5M annotated content elements extracted from 500K pages of research papers on arXiv. Using the new framework, we benchmark ten freely available tools in extracting document metadata, bibliographic references, tables, and other content elements from academic PDF documents. GROBID achieves the best metadata and reference extraction results, followed by CERMINE and Science Parse. For table extraction, Adobe Extract outperforms other tools, even though the performance is much lower than for other content elements. All tools struggle to extract lists, footers, and equations. We conclude that more research on improving and combining tools is necessary to achieve satisfactory extraction quality for most content elements. Evaluation datasets and frameworks like the one we present support this line of research. We make our data and code publicly available to contribute toward this goal."]], "corpus": "arxiv"}