chiyuzhang committed 379119a (parent 794f4fc): Update README.md
tags:
- social media
- contrastive learning
---

# The Skipped Beat: A Study of Sociopragmatic Understanding in LLMs for 64 Languages

<p align="center"> <a href="https://chiyuzhang94.github.io/" target="_blank">Chiyu Zhang</a>, Khai Duy Doan, Qisheng Liao, <a href="https://mageed.arts.ubc.ca/" target="_blank">Muhammad Abdul-Mageed</a></p>

<p align="center" float="left">
The University of British Columbia, Mohamed bin Zayed University of Artificial Intelligence
</p>

<p align="center">Published at the Main Conference of EMNLP 2023</p>
<p align="center"> <a href="https://arxiv.org/abs/2310.14557" target="_blank">Paper</a></p>

[![Code License](https://img.shields.io/badge/Code%20License-Apache_2.0-green.svg)]()
[![Data License](https://img.shields.io/badge/Data%20License-CC%20By%20NC%204.0-red.svg)]()

## Checkpoints of Models Pre-Trained with InfoDCL

We further pretrained XLM-R and RoBERTa with the InfoDCL framework ([Zhang et al., 2023](https://aclanthology.org/2023.findings-acl.152/)).

Multilingual Model:
* InfoDCL-XLMR trained with multilingual TweetEmoji-multi: https://huggingface.co/UBC-NLP/InfoDCL-Emoji-XLMR-Base

English Models:
* InfoDCL-RoBERTa trained with TweetEmoji-EN: https://huggingface.co/UBC-NLP/InfoDCL-emoji
* InfoDCL-RoBERTa trained with TweetHashtag-EN: https://huggingface.co/UBC-NLP/InfoDCL-hashtag
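Since the checkpoints above are standard Hugging Face model repositories, they can be loaded with `transformers` like any RoBERTa/XLM-R encoder. A minimal sketch, assuming the English emoji checkpoint and mean pooling over the last hidden states (the pooling strategy is our assumption here, not something the checkpoints prescribe):

```python
# Minimal sketch: load an InfoDCL checkpoint and encode one tweet into a
# sentence vector. Mean pooling over non-padding tokens is an assumption;
# use whatever pooling your downstream task prefers.
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "UBC-NLP/InfoDCL-emoji"  # or InfoDCL-hashtag / InfoDCL-Emoji-XLMR-Base
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

inputs = tokenizer("What a great day!", return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # (1, seq_len, hidden_size)

# Mask out padding positions, then average the remaining token states.
mask = inputs["attention_mask"].unsqueeze(-1)
embedding = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(embedding.shape)
```

The resulting `embedding` tensor can be fed to a lightweight classifier or compared with cosine similarity for retrieval-style evaluation.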

## Citation

Please cite us if you find our data or models useful.

```bibtex
@inproceedings{zhang-etal-2023-skipped,
    title = "The Skipped Beat: A Study of Sociopragmatic Understanding in LLMs for 64 Languages",
    author = "Zhang, Chiyu and
      Doan, Khai Duy and
      Liao, Qisheng and
      Abdul-Mageed, Muhammad",
    booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
    year = "2023",
    publisher = "Association for Computational Linguistics",
}
```