# PubMedBERT Abstract + Full Text Fine-Tuned on QNLI Task

Link to the original model: https://huggingface.co/microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext
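
QNLI frames question answering as a sentence-pair classification task: given a question and a sentence, the model predicts whether the sentence contains the answer. Below is a minimal inference sketch assuming the Hugging Face `transformers` library; the repo id is a placeholder for this fine-tuned checkpoint (replace it with the actual model path), and the question/sentence pair is only illustrative.

```python
# Minimal QNLI inference sketch. The repo id below is hypothetical --
# substitute the actual Hugging Face path of this fine-tuned model.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "your-username/pubmedbert-qnli"  # placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

# QNLI input is a (question, sentence) pair.
question = "What gene is associated with cystic fibrosis?"
sentence = "Mutations in the CFTR gene cause cystic fibrosis."

inputs = tokenizer(question, sentence, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to its label (e.g. entailment / not_entailment).
predicted_class = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_class])
```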

Credit to the original paper:

> @misc{pubmedbert,
>   author = {Yu Gu and Robert Tinn and Hao Cheng and Michael Lucas and Naoto Usuyama and Xiaodong Liu and Tristan Naumann and Jianfeng Gao and Hoifung Poon},
>   title = {Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing},
>   year = {2020},
>   eprint = {arXiv:2007.15779},
> }