---
library_name: transformers
license: apache-2.0
---

# FANNO

FANNO is a fully autonomous, open-sourced framework for annotating instruction datasets without pre-existing annotated data. This model card describes a LLaMA2-7B model instruction-finetuned on FANNO-generated data.

## Model Details

Based on LLaMA2-7B.
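
Since this is a `transformers` causal language model based on LLaMA2-7B, it should load with the standard Auto classes. A minimal sketch, assuming a placeholder Hub repository id (substitute the actual id of this model):

```python
# Minimal loading/generation sketch; "your-org/fanno-llama2-7b" is a placeholder,
# not the confirmed repository name of this model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/fanno-llama2-7b"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain instruction tuning in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
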
### Model Description

Tags: instruction-finetuning, autonomous-framework, data-annotation

FANNO is an innovative, fully autonomous, open-sourced framework designed to streamline the annotation process for instruction datasets without requiring pre-existing annotated data. Leveraging the capabilities of the Mistral-7b-instruct model, FANNO efficiently generates diverse and high-quality datasets through a structured process that includes document pre-screening, instruction generation, and response generation.
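
The paragraph above outlines a three-stage flow. The sketch below illustrates how such a pipeline could be wired up around an off-the-shelf Mistral-7B-Instruct checkpoint; the helper names, prompts, and pre-screening rule are hypothetical and do not reflect FANNO's actual implementation:

```python
# Illustrative sketch only: the pre-screening rule, prompts, and helpers are
# hypothetical stand-ins for the stages described in the model description.
from transformers import pipeline

generator = pipeline("text-generation", model="mistralai/Mistral-7B-Instruct-v0.2")

def pre_screen(documents, min_chars=40):
    # Stage 1 (hypothetical rule): keep documents long enough to seed an instruction.
    return [d for d in documents if len(d) >= min_chars]

def generate_instruction(document):
    # Stage 2: ask the teacher model to write an instruction grounded in the document.
    prompt = f"Read the passage and write one challenging instruction about it.\n\nPassage:\n{document}\n\nInstruction:"
    return generator(prompt, max_new_tokens=64, return_full_text=False)[0]["generated_text"].strip()

def generate_response(instruction):
    # Stage 3: have the same model answer the generated instruction.
    prompt = f"Instruction: {instruction}\n\nResponse:"
    return generator(prompt, max_new_tokens=256, return_full_text=False)[0]["generated_text"].strip()

raw_documents = ["Photosynthesis converts light energy into chemical energy stored in glucose."]
dataset = []
for doc in pre_screen(raw_documents):
    instruction = generate_instruction(doc)
    dataset.append({"instruction": instruction, "output": generate_response(instruction)})
```
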
### Key Features

- **Autonomous Annotation:** Eliminates the need for manual annotation or costly API calls to proprietary LLMs, making the annotation process cost-effective and efficient.
- **High-Quality Data Generation:** Produces datasets whose diversity and complexity are comparable to human-annotated or cleaned datasets such as Alpaca-GPT4-Cleaned.
- **Open-Sourced Framework:** Fully open-sourced, allowing the community to leverage and contribute to the ongoing improvement of the framework.