# PubMedBERT Abstract + Full Text Fine-Tuned on QNLI Task
Link to the original model: https://huggingface.co/microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext
Please credit the original paper:
```bibtex
@misc{pubmedbert,
  author  = {Yu Gu and Robert Tinn and Hao Cheng and Michael Lucas and Naoto Usuyama and Xiaodong Liu and Tristan Naumann and Jianfeng Gao and Hoifung Poon},
  title   = {Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing},
  year    = {2020},
  eprint  = {arXiv:2007.15779},
}
```
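
Below is a minimal usage sketch with the Transformers library. The repository id `mervenoyan/PubMedBERT-QNLI`, the example texts, and the label names are assumptions for illustration; replace the id with the actual Hub path of this checkpoint. QNLI is a sentence-pair classification task (does the sentence answer the question?), so the model takes a question and a candidate answer sentence.

```python
# Minimal sketch, not an official example: the repository id below is an assumption.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "mervenoyan/PubMedBERT-QNLI"  # hypothetical Hub id; adjust as needed

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# QNLI pairs a question with a sentence that may or may not answer it.
question = "What does BRCA1 encode?"
sentence = "BRCA1 encodes a protein involved in the repair of DNA double-strand breaks."

inputs = tokenizer(question, sentence, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(dim=-1).item()
# Label names come from the checkpoint's config; QNLI typically uses
# "entailment" and "not_entailment".
print(model.config.id2label.get(predicted_class, predicted_class))
```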