---
license: other
language:
- en
library_name: transformers
inference: false
thumbnail: https://h2o.ai/etc.clientlibs/h2o/clientlibs/clientlib-site/resources/images/favicon.ico
tags:
- gpt
- llm
- large language model
- LLaMa
datasets:
- h2oai/h2ogpt-oig-oasst1-instruct-cleaned-v2
---
# h2oGPT Model Card
## Summary
H2O.ai's `h2oai/h2ogpt-research-oig-oasst1-512-30b` is a 30 billion parameter instruction-following large language model for research use only.
- Base model: [decapoda-research/llama-30b-hf](https://huggingface.co/decapoda-research/llama-30b-hf)
- LoRA weights: [h2oai/h2ogpt-research-oig-oasst1-512-30b-lora](https://huggingface.co/h2oai/h2ogpt-research-oig-oasst1-512-30b-lora)
- This HF version was built using the [export script and steps](https://huggingface.co/h2oai/h2ogpt-research-oig-oasst1-512-30b-lora#build-hf-model)
Full details on training, evaluation, and performance are provided in the [LoRA Model Card](https://huggingface.co/h2oai/h2ogpt-research-oig-oasst1-512-30b-lora).
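## Usage

Since the card declares `library_name: transformers`, the exported checkpoint can be loaded with the standard `transformers` auto classes. The sketch below is a minimal, hedged example: the `<human>:`/`<bot>:` prompt template mirrors the format used by other h2oGPT OIG/OASST1 models and is an assumption here, as are the generation parameters.

```python
"""Minimal usage sketch for h2oai/h2ogpt-research-oig-oasst1-512-30b.

Assumptions: the <human>:/<bot>: chat template and the generation
settings are illustrative, not taken from this model card.
"""

MODEL_NAME = "h2oai/h2ogpt-research-oig-oasst1-512-30b"


def format_prompt(instruction: str) -> str:
    """Wrap a user instruction in the assumed h2oGPT chat template."""
    return f"<human>: {instruction}\n<bot>:"


def generate(instruction: str, max_new_tokens: int = 128) -> str:
    # Import lazily so the prompt helper works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    # A 30B model needs substantial GPU memory; device_map="auto" shards it
    # across available devices via accelerate.
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_NAME, torch_dtype="auto", device_map="auto"
    )
    inputs = tokenizer(format_prompt(instruction), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Why is drinking water good for you?"))
```

Note that this is a research-only checkpoint under the LLaMA license terms of its base model, so verify license compliance before deployment.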