Felix Marty committed • Commit 171b6b3 • 1 Parent(s): c20abfa

style

onnx_export.py CHANGED (+3 −6)
@@ -116,14 +116,11 @@ def convert(api: "HfApi", model_id: str, task: str, force: bool = False) -> Tupl
     operations = convert_onnx(model_id, task, folder)
 
     commit_description = f"""
-Beep boop I am the [ONNX export bot 🤖🏎️]({SPACES_URL}). On behalf of [{requesting_user}](https://huggingface.co/{requesting_user}), I would like to
-add to this repository the model converted to ONNX.
+Beep boop I am the [ONNX export bot 🤖🏎️]({SPACES_URL}). On behalf of [{requesting_user}](https://huggingface.co/{requesting_user}), I would like to add to this repository the model converted to ONNX.
 
-What is ONNX? It stands for "Open Neural Network Exchange", and is the most commonly used open standard for machine learning interoperability.
-You can find out more at [onnx.ai](https://onnx.ai/)!
+What is ONNX? It stands for "Open Neural Network Exchange", and is the most commonly used open standard for machine learning interoperability. You can find out more at [onnx.ai](https://onnx.ai/)!
 
-The exported ONNX model can be then be consumed by various backends as TensorRT or TVM, or simply be used in a few lines
-with 🤗 Optimum through ONNX Runtime, check out how [here](https://huggingface.co/docs/optimum/main/en/onnxruntime/usage_guides/models)!
+The exported ONNX model can be then be consumed by various backends as TensorRT or TVM, or simply be used in a few lines with 🤗 Optimum through ONNX Runtime, check out how [here](https://huggingface.co/docs/optimum/main/en/onnxruntime/usage_guides/models)!
 """
     new_pr = api.create_commit(
         repo_id=model_id,
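For context, a minimal sketch of how the rewritten `commit_description` f-string renders. The `SPACES_URL` and `requesting_user` values below are placeholders for illustration, not the Space's actual configuration:

```python
# Placeholder values standing in for the Space's real variables.
SPACES_URL = "https://huggingface.co/spaces/example/onnx-export"
requesting_user = "some-user"

# The f-string as it reads after the "style" commit: each paragraph on a
# single source line, separated by blank lines, so the Markdown renders as
# three paragraphs.
commit_description = f"""
Beep boop I am the [ONNX export bot 🤖🏎️]({SPACES_URL}). On behalf of [{requesting_user}](https://huggingface.co/{requesting_user}), I would like to add to this repository the model converted to ONNX.

What is ONNX? It stands for "Open Neural Network Exchange", and is the most commonly used open standard for machine learning interoperability. You can find out more at [onnx.ai](https://onnx.ai/)!

The exported ONNX model can be then be consumed by various backends as TensorRT or TVM, or simply be used in a few lines with 🤗 Optimum through ONNX Runtime, check out how [here](https://huggingface.co/docs/optimum/main/en/onnxruntime/usage_guides/models)!
"""

# Non-empty lines are exactly the three paragraphs of the PR description.
paragraphs = [p for p in commit_description.split("\n") if p]
print(len(paragraphs))  # → 3
```

In the surrounding `convert()` function, this string is then passed as the `commit_description` argument of `HfApi.create_commit`, which (when called with `create_pr=True`) opens a pull request on the target repo rather than committing directly.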