---
license: mit
datasets:
- mhhmm/leetcode-solutions-python
- deepmind/code_contests
language:
- en
pipeline_tag: text-generation
library_name: transformers
---
Base LLM: [Salesforce/CodeGen-6B-Mono](https://huggingface.co/Salesforce/codegen-6B-mono)

Fine-tuned with [PEFT](https://github.com/huggingface/peft). Setup (see the sketch after this list):

- Method: [LoRA](https://github.com/microsoft/LoRA)
- Data: [LeetCode solutions in Python](https://huggingface.co/datasets/mhhmm/leetcode-solutions-python) and [Google DeepMind Code Contests](https://huggingface.co/datasets/deepmind/code_contests)
- Hardware: Google Colab Pro+, roughly 2 hours of training. Shoutout to my friend TieuPhuong.
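Below is a minimal sketch of that tuning setup, assuming PEFT's standard `LoraConfig`/`get_peft_model` API; the rank, alpha, and dropout values are illustrative placeholders, not the ones actually used for this checkpoint.

```python
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

base = "Salesforce/codegen-6B-mono"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

# Wrap the base model with LoRA adapters; only the low-rank
# adapter matrices are trained, the 6B base weights stay frozen.
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,            # assumed rank, not documented in this card
    lora_alpha=16,  # assumed scaling factor
    lora_dropout=0.05,
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # prints the small trainable fraction

# One of the two tuning datasets listed above.
dataset = load_dataset("mhhmm/leetcode-solutions-python")
```

After training, the saved adapter can be re-attached to the base model with `PeftModel.from_pretrained(model, "mhhmm/codegen-6B-lora")` (repo id assumed from this card's title).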