---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/modelcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/model-cards
{}
---

Encoder-only version of the [ANKH base model](https://huggingface.co/ElnaggarLab/ankh-base) ([paper](https://arxiv.org/abs/2301.06568)). The encoder-only version is ideal for protein representation tasks.

## To download

```python
from transformers import T5EncoderModel, AutoTokenizer

model_path = 'Synthyra/ANKH_base'
model = T5EncoderModel.from_pretrained(model_path)
tokenizer = AutoTokenizer.from_pretrained(model_path)
```

We are working on a [Flex Attention](https://pytorch.org/blog/flexattention/) implementation of T5-based PLMs, which will land once Flex Attention supports the learned relative position bias used in T5. Stay tuned.
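
Since the encoder-only model is intended for representation tasks, here is a minimal sketch of extracting per-residue and sequence-level embeddings. It assumes the standard `transformers` forward API; the example sequence and the mean-pooling step are illustrative, not prescribed by the model.

```python
import torch
from transformers import T5EncoderModel, AutoTokenizer

model_path = 'Synthyra/ANKH_base'
model = T5EncoderModel.from_pretrained(model_path)
tokenizer = AutoTokenizer.from_pretrained(model_path)
model.eval()

# Illustrative protein sequence (one-letter amino acid codes)
sequence = 'MSLKDEVALLAAVTLLGVLLQAYFSLQVISARRAFRV'
inputs = tokenizer(sequence, return_tensors='pt')

with torch.no_grad():
    outputs = model(**inputs)

# Per-residue embeddings: (batch, sequence_length, hidden_size)
residue_embeddings = outputs.last_hidden_state

# Simple sequence-level representation via mean pooling over tokens
sequence_embedding = residue_embeddings.mean(dim=1)
```

For batches of sequences of different lengths, pass `padding=True` to the tokenizer and mask out padded positions (using `inputs['attention_mask']`) before pooling, so padding tokens do not skew the average.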