Update README.md
README.md (CHANGED)

```diff
@@ -2,14 +2,16 @@
 license: apache-2.0
 pipeline_tag: text-generation
 tags:
-
-
-
-
-
-
-
+- ONNX
+- DML
+- DirectML
+- ONNXRuntime
+- mistral
+- conversational
+- custom_code
 inference: false
+language:
+- en
 ---

 # Mistral-7B-Instruct-v0.3 ONNX
@@ -90,4 +92,4 @@ python phi3-qa.py -m .\mistral-7b-instruct-v0.3
 - **Model type:** ONNX
 - **Language(s) (NLP):** Python, C, C++
 - **License:** Apache License Version 2.0
-- **Model Description:** This model is a conversion of the Mistral-7B-Instruct-v0.3 for ONNX Runtime inference, optimized for CPU and DirectML.
+- **Model Description:** This model is a conversion of the Mistral-7B-Instruct-v0.3 for ONNX Runtime inference, optimized for CPU and DirectML.
```
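For reference, after this commit the model card's YAML frontmatter should read as follows. This is a sketch reconstructed from the hunk above only: the surrounding `---` delimiters and any frontmatter lines outside the shown diff context are assumed, not confirmed by the diff.

```yaml
---
license: apache-2.0
pipeline_tag: text-generation
tags:
- ONNX
- DML
- DirectML
- ONNXRuntime
- mistral
- conversational
- custom_code
inference: false
language:
- en
---
```

The added `tags` entries surface the model under the ONNX/DirectML filters on the Hub, and `language: en` declares the model's natural language; `inference: false` is unchanged and keeps the hosted inference widget disabled.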