---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/modelcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/model-cards
{}
---
Encoder-only version of the [ANKH2 large model](https://huggingface.co/ElnaggarLab/ankh2-ext1) (the ANKH2 paper has not been released yet). The encoder-only version is well suited for protein representation tasks.

## To download
```python
from transformers import T5EncoderModel, AutoTokenizer

model_path = 'Synthyra/ANKH2_large'
# Load the encoder-only checkpoint and its tokenizer
model = T5EncoderModel.from_pretrained(model_path)
tokenizer = AutoTokenizer.from_pretrained(model_path)
```
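
Once loaded, the encoder can be used to extract per-residue or per-protein embeddings. Continuing from the snippet above, here is a minimal sketch; the sequence is a made-up placeholder, and it assumes the tokenizer accepts a raw amino-acid string, as Ankh-family tokenizers typically do.

```python
import torch

sequence = 'MSLKRVNIAL'  # placeholder amino-acid sequence
inputs = tokenizer(sequence, return_tensors='pt')

model.eval()
with torch.no_grad():
    outputs = model(**inputs)

# Per-residue embeddings: (batch, sequence_length, hidden_size)
residue_embeddings = outputs.last_hidden_state
# A simple per-protein embedding via mean pooling over residues
protein_embedding = residue_embeddings.mean(dim=1)
```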

We are working on a [FlexAttention](https://pytorch.org/blog/flexattention/) implementation of T5-based PLMs once FlexAttention supports the learned relative position bias that T5 uses. Stay tuned.
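
For the curious, a T5-style learned relative position bias can in principle be expressed as a FlexAttention `score_mod`. The sketch below is purely illustrative and not part of this repo: the names (`bias_table`, `t5_bias`) are hypothetical, the bucketing is simplified to a clamp rather than T5's log-spaced buckets, and training the bias table through `score_mod` is exactly the support gap mentioned above.

```python
# Illustrative sketch only (requires PyTorch >= 2.5): mapping a learned
# relative position bias onto FlexAttention's score_mod.
import torch
from torch.nn.attention.flex_attention import flex_attention

num_heads, num_buckets, head_dim = 8, 32, 64
# Hypothetical learned (bucket, head) bias table, as T5 uses per layer
bias_table = torch.randn(num_buckets, num_heads)

def t5_bias(score, b, h, q_idx, kv_idx):
    # Clamp the relative distance into the table's range
    # (T5 proper uses log-spaced buckets instead of a plain clamp)
    rel = (kv_idx - q_idx).clamp(-num_buckets // 2, num_buckets // 2 - 1) + num_buckets // 2
    return score + bias_table[rel, h]

q = k = v = torch.randn(1, num_heads, 128, head_dim)
out = flex_attention(q, k, v, score_mod=t5_bias)
```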