We're excited to unveil **Qwen2-VL**, the latest iteration of our Qwen-VL model.
<img src="http://qianwen-res.oss-accelerate-overseas.aliyuncs.com/Qwen2-VL/mrope.png" width="80%"/>
<p>
We have three models with 2, 7 and 72 billion parameters.

This repo contains the **pretrained** 7B Qwen2-VL model.

For more information, visit our [Blog](https://qwenlm.github.io/blog/qwen2-vl/) and [GitHub](https://github.com/QwenLM/Qwen2-VL).
## Requirements

The code of Qwen2-VL is available in the latest Hugging Face `transformers`, and we advise you to install the latest version with `pip install -U transformers`; otherwise you might encounter the following error:
```
KeyError: 'qwen2_vl'
```
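The `KeyError` above means the installed `transformers` does not yet register the `qwen2_vl` architecture. As a minimal sketch of a pre-flight check (the `4.45.0` threshold is our assumption about when `qwen2_vl` support landed in a release, not something stated here; adjust it if needed):

```python
from importlib.metadata import version as pkg_version


def parse_version(v: str) -> tuple:
    """Parse a dotted version string like '4.45.2' into a comparable tuple."""
    parts = []
    for piece in v.split("."):
        digits = "".join(ch for ch in piece if ch.isdigit())
        parts.append(int(digits) if digits else 0)
    return tuple(parts)


def transformers_supports_qwen2_vl(min_version: str = "4.45.0") -> bool:
    """Return True if the installed transformers looks new enough for qwen2_vl.

    The 4.45.0 threshold is an assumption; if this returns False you will
    likely hit `KeyError: 'qwen2_vl'` when loading the model config.
    """
    try:
        installed = pkg_version("transformers")
    except Exception:  # transformers not installed at all
        return False
    return parse_version(installed) >= parse_version(min_version)
```

If the check fails, upgrading with `pip install -U transformers` and re-running it should resolve the error.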
## Citation

If you find our work helpful, feel free to give us a cite.
```
@article{Qwen2-VL,
  title={Qwen2-VL: Enhancing Vision-Language Model's Perception of the World at Any Resolution},
  author={Peng Wang and Shuai Bai and Sinan Tan and Shijie Wang and Zhihao Fan and Jinze Bai and Keqin Chen and Xuejing Liu and Jialin Wang and Wenbin Ge and Yang Fan and Kai Dang and Mengfei Du and Xuancheng Ren and Rui Men and Dayiheng Liu and Chang Zhou and Jingren Zhou and Junyang Lin},
  journal={arXiv preprint arXiv:2409.12191},
  year={2024}
}
```