---
pipeline_tag: text-generation
inference: true
widget:
- text: 'def print_hello_world():'
  example_title: Hello world
  group: Python
license: bigcode-openrail-m
datasets:
- bigcode/commitpackft
- bigcode/oasst-octopack
metrics:
- code_eval
library_name: transformers
tags:
- code
model-index:
- name: OctoCoder
  results:
  - task:
      type: text-generation
    dataset:
      type: bigcode/humanevalpack
      name: HumanEvalSynthesize Python
    metrics:
    - name: pass@1
      type: pass@1
      value: 46.2
      verified: false
  - task:
      type: text-generation
    dataset:
      type: bigcode/humanevalpack
      name: HumanEvalSynthesize JavaScript
    metrics:
    - name: pass@1
      type: pass@1
      value: 39.2
      verified: false
---

![Octopack](https://github.com/bigcode-project/octopack/blob/31f3320f098703c7910e43492c39366eeea68d83/banner.png?raw=true)

# OctoCoder

Play with the model on the [TODO Playground](https://huggingface.co/spaces/bigcode/bigcode-playground).

## Table of Contents

1. [Model Summary](#model-summary)
2. [Use](#use)
3. [Limitations](#limitations)
4. [Training](#training)
5. [License](#license)
6. [Citation](#citation)

## Model Summary

OctoCoder is an instruction-tuned model with 15.5B parameters, created by finetuning StarCoder on CommitPackFT & OASST as described in the OctoPack paper.

- **Repository:** [bigcode/octopack](https://github.com/bigcode-project/octopack)
- **Paper:** [TODO]()
- **Languages:** 80+ programming languages
- **OctoPack🐙🎒:**
| | | |
|---|---|---|
| Data | CommitPack | 4TB of GitHub commits across 350 programming languages |
| | CommitPackFT | Filtered version of CommitPack for high-quality commit messages that resemble instructions |
| Model | OctoCoder | StarCoder (16B parameters) instruction tuned on CommitPackFT + OASST |
| | OctoGeeX | CodeGeeX2 (6B parameters) instruction tuned on CommitPackFT + OASST |
| Evaluation | HumanEvalPack | Extension of OpenAI's HumanEval to cover 3 scenarios across 6 languages |
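
Because OctoCoder is a causal language model served through the `transformers` library, it can be prompted like any other text-generation checkpoint. The snippet below is a minimal sketch, assuming the Hugging Face checkpoint id `bigcode/octocoder`, a simple `Question: ... Answer:` instruction prompt, and a CUDA device; adapt the device and generation settings to your environment.

```python
# Minimal sketch: loading and prompting OctoCoder with transformers.
# Assumptions: checkpoint id "bigcode/octocoder" and a Question/Answer prompt format.
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigcode/octocoder"  # assumed Hugging Face model id
device = "cuda"                   # use "cpu" if no GPU is available

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint).to(device)

# Instruction-style prompt; the model completes the text after "Answer:".
prompt = "Question: Write a Python function that prints 'Hello world'.\n\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For a 15.5B-parameter model, loading in reduced precision or 8-bit quantization is a common way to fit the weights on a single consumer GPU; the full-precision call above is shown only for simplicity.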