---
tags:
- adapterhub:chemistry
- adapter-transformers
- t5
datasets:
- uspto
---

# Adapter `doncamilom/OChemSegm-flan-T5-large` for google/flan-t5-large

An [adapter](https://adapterhub.ml) for the `google/flan-t5-large` model that was trained on the [USPTO-segment](www.tobedone.undone) dataset.

This adapter was created for usage with the **[adapter-transformers](https://github.com/Adapter-Hub/adapter-transformers)** library.

## Usage

First, install `adapter-transformers`:

```
pip install -U adapter-transformers
```

_Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. [More](https://docs.adapterhub.ml/installation.html)_

Now, the adapter can be loaded and activated like this:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

adapter = 'doncamilom/OChemSegm-flan-T5-large'

# Load the base model
model = AutoModelForSeq2SeqLM.from_pretrained('google/flan-t5-large')

# Load the adapter from the Hugging Face Hub and activate it
adapter_name = model.load_adapter(adapter, source='hf', set_active=True)

# Load the tokenizer
tokenizer = AutoTokenizer.from_pretrained(adapter)
```
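With the adapter active, the model can be used like any other `transformers` sequence-to-sequence model. The snippet below is a minimal inference sketch; the input string is a hypothetical placeholder, since the exact prompt format this adapter expects is not documented here.

```python
# Minimal inference sketch. The input text is a hypothetical placeholder,
# not the adapter's documented prompt format.
text = "The compound was added to the mixture and stirred for 2 h at 80 C."
inputs = tokenizer(text, return_tensors='pt')

# Generate with the active adapter and decode the result
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```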
## Architecture & Training

## Evaluation results

## Citation