---
datasets:
- ajibawa-2023/Code-290k-ShareGPT
language:
- en
tags:
- code
---

**Code-290k-6.7B-Instruct**

This model is trained on [DeepSeek-Coder-6.7B-Instruct](https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-instruct). I have used my existing dataset [Code-290k-ShareGPT](https://huggingface.co/datasets/ajibawa-2023/Code-290k-ShareGPT) for training. The model is trained on around 290,000 code samples. Code in Python, Java, JavaScript, Go, C++, Rust, Ruby, SQL, MySQL, R, Julia, Haskell, etc., each with a detailed explanation, was used for training. This model uses the Alpaca format. Besides code generation, it will also give you an explanation.

**Training:**

The entire dataset was trained on 4 x A100 80GB GPUs. Training for 3 epochs took 85 hours. The DeepSeek-Coder codebase and DeepSpeed were used for training. This is a fully fine-tuned model.

Example Prompt:
```
This is a conversation with your helpful AI assistant. AI assistant can generate Code in various Programming Languages along with necessary explanation.

### Instruction:
{instruction}

### Response:
```

You can modify the above prompt as per your requirements. I have used the Alpaca format; a minimal inference sketch is included at the end of this card.

I want to say a special thanks to the Open Source community for helping & guiding me to better understand AI/model development.

Thank you for your love & support.

**Examples**

1. **Bayes' Theorem - Python**

   ![image/png](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/J8uqoT_LYhPW2VpnE1K-8.png)

2. **Fermat's little theorem**

   ![image/png](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/H0sc9jk7ypv_N5V7LSANl.png)

3. **The Arrhenius equation using R**

   ![image/png](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/BQ8PZhYhoZ9wpVMPXJPnQ.png)
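
**Inference sketch:**

Below is a minimal sketch of how the Alpaca-style prompt above could be used with the Hugging Face `transformers` library. The repository id, example instruction, and generation settings are assumptions of mine, not fixed by this card, and may need adjusting for your setup.

```python
# Minimal inference sketch. Assumptions: the repo id below matches this card,
# and the generation settings are illustrative rather than recommended values.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ajibawa-2023/Code-290k-6.7B-Instruct"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # keeps the 6.7B model within a single large GPU
    device_map="auto",
)

# Alpaca-style prompt, as described in this card; the instruction is a placeholder.
prompt = (
    "This is a conversation with your helpful AI assistant. "
    "AI assistant can generate Code in various Programming Languages "
    "along with necessary explanation.\n\n"
    "### Instruction:\n"
    "Write a Python function that applies Bayes' theorem to compute P(A|B).\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(
    **inputs,
    max_new_tokens=512,
    do_sample=True,
    temperature=0.2,
    top_p=0.95,
)

# Print only the newly generated tokens (the model's code and explanation).
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```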