Commit cc96ba8 (verified) · Parent: ac96134
nehcgs committed: "Upload folder using huggingface_hub"
Arch-Function-7B-Q2_K.gguf CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:b7d99fba5587d6ca3fc4b64dee92394318e3b9bdaed95cee9320218d94514c00
+oid sha256:f5ea9215f0404f71e7b33979aa4d8cef69826e25474760295496457db13368f8
 size 3015940032
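Each changed entry above is a Git LFS v1 pointer file: three `key value` lines (`version`, `oid`, `size`), with only the `oid` digest differing between the two revisions. A minimal sketch of reading one such pointer (the helper name is illustrative, not part of any library):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split a Git LFS v1 pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The new Q2_K pointer from this commit:
pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:f5ea9215f0404f71e7b33979aa4d8cef69826e25474760295496457db13368f8
size 3015940032
"""
info = parse_lfs_pointer(pointer)
print(info["oid"])        # sha256:f5ea9215...
print(int(info["size"]))  # 3015940032
```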
Arch-Function-7B-Q3_K_L.gguf CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:765fa9f52b43fd02c99cdcf5c712e3dd68142586e494759f892d30bd3f34b61c
+oid sha256:c94b7bc3eca39d27c43fa42b58ff633e8972fa5aaa127a55e7326d387227e9e7
 size 4088459200
Arch-Function-7B-Q3_K_M.gguf CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:e82c403fd77a776dfddf70037dbab2c8e508020a16787680c3a9c59ebc0762d5
+oid sha256:f218e1b088ed83897b9d1388beb91e28c7f16b164d17cae624ef2a80e625ebbf
 size 3808391104
Arch-Function-7B-Q3_K_S.gguf CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:f45212daa729247becd59fadb7d833837e61ad80bbfe98ef355c6042b72e1c70
+oid sha256:f600be324a7e4e7cc72367ec55f811eded20c5e3814589cf5624df377257e783
 size 3492368320
Arch-Function-7B-Q4_K_M.gguf CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:5ee1ca0e51f461d671d60d97f262de1eefd613f4fc3a140134c6c256abb79840
+oid sha256:3e65120a99b4b668f367403a8e557b03c8764525b874293478eb3bde27808d9f
 size 4683073472
Arch-Function-7B-Q4_K_S.gguf CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:1828ec662224307bfa145bd0a098889ff45feb7123fb9b117befdecc42a711b8
+oid sha256:dbcc05e2ca5b9c1dcc3373dd065862f0ef1773f28a85e12f04a30626192bb126
 size 4457768896
Arch-Function-7B-Q5_K_M.gguf CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:9f5d61c2d30cbd8f157a5ca42f4b5941c9d1434f99bacda0b71c4ef78304e868
+oid sha256:3f7e356856e0e408467a40a2c4fbffe82485fdb9bce358ddeac92bda36b252f0
 size 5444831168
Arch-Function-7B-Q5_K_S.gguf CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:d48953bddd573b2b4aade027c36c2268f8da6f337ab26b5863cee24edeb02819
+oid sha256:7626c5ac58d2f63d5b791dbdd371edc12d78eeb288582ce5ccce82748d366347
 size 5315176384
Arch-Function-7B-Q6_K.gguf CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:147cf5039702222c5a25676907efbd6718a4214dc68a7dace16d400b09013ec9
+oid sha256:223ed55a7b1113b7a1c097dfea61ddeee88ba8dd4e4b97c72e3841a019433594
 size 6254198720
Arch-Function-7B.gguf CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:1a164d8195e8dd3605d94a986bb74be2806333547c767dd2603394417b59b743
+oid sha256:4ec84be2a8528b0507508e8cf36f6b76228262122e3f7e2fcc5af2b86fb3ba8c
 size 15237853120
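Every hunk above swaps only the `oid sha256:` digest while the `size` field is unchanged, i.e. each quant was re-uploaded with the same byte length but different contents. After downloading one of the files, the digest can be re-derived locally and compared against the updated pointer; a minimal standard-library sketch (file name and expected digest taken from the Q4_K_M hunk above; the helper name is illustrative):

```python
import hashlib

def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 in 1 MiB chunks (the quants are multi-GB)."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# The `oid sha256:` value from the updated Q4_K_M pointer:
expected = "3e65120a99b4b668f367403a8e557b03c8764525b874293478eb3bde27808d9f"
# Uncomment after downloading the file:
# assert sha256_of_file("Arch-Function-7B-Q4_K_M.gguf") == expected
```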
README.md CHANGED
@@ -1,16 +1,17 @@
 ---
 license: other
 license_name: katanemo-research
-license_link: https://huggingface.co/katanemolabs/Arch-Function-7B.gguf/blob/main/LICENSE
 base_model:
-- katanemo/Arch-Function-7B
 language:
 - en
 pipeline_tag: text-generation
 library_name: transformers
 ---
 
-# katanemo/Arch-Function-7B
 
 ## Overview
 The Katanemo Arch-Function collection of large language models (LLMs) is a collection of state-of-the-art (SOTA) LLMs specifically designed for **function calling** tasks. The models are designed to understand complex function signatures, identify required parameters, and produce accurate function call outputs based on natural language prompts. Achieving performance on par with GPT-4, these models set a new benchmark in the domain of function-oriented tasks, making them suitable for scenarios where automated API interaction and function execution are crucial.
@@ -53,7 +54,7 @@ Katanemo Arch-Function collection is built on top of the [Qwen 2.5](https://hugg
 
 
 ## Performance Benchmarks
-We evaluate Katanemo Arch-Function series on the [Berkeley Function-Calling Leaderboard (BFCL)](https://gorilla.cs.berkeley.edu/leaderboard.html#leaderboard). For each model family, we select the one with the highest rank. The results are shown below:
 
 <table>
 <tr style="text-align: center; vertical-align: middle; font-weight: bold;">
@@ -74,109 +75,186 @@ We evaluate Katanemo Arch-Function series on the [Berkeley Function-Calling Lead
 </tr>
 <tr style="text-align: center; vertical-align: middle;">
 <td>1</td>
-<td>GPT-4-turbo-2024-04-09</td>
-<td>59.49%</td>
-<td>82.65%</td>
-<td>83.80%</td>
-<td>73.39%</td>
-<td>21.62%</td>
 <td>70.73%</td>
-<td>79.79%</td>
 </tr>
 <tr style="text-align: center; vertical-align: middle;">
-<td>3</td>
-<td>xLAM-8x22b-r</td>
-<td>59.13%</td>
-<td>89.75%</td>
-<td>89.32%</td>
-<td>72.81%</td>
-<td>15.62%</td>
-<td>97.56%</td>
-<td>75.23%</td>
 </tr>
 <tr style="text-align: center; vertical-align: middle; font-weight: bold;">
 <td> </td>
 <td>Arch-Function-7B</td>
-<td>57.48%</td>
-<td>87.50%</td>
-<td>86.80%</td>
-<td>72.19%</td>
-<td>13.75%</td>
-<td>82.93%</td>
-<td>79.54%</td>
 </tr>
 <tr style="text-align: center; vertical-align: middle; font-weight: bold;">
 <td> </td>
 <td>Arch-Function-3B</td>
-<td>56.23%</td>
-<td>85.10%</td>
-<td>89.16%</td>
-<td>70.72%</td>
-<td>12.28%</td>
-<td>90.24%</td>
-<td>73.98%</td>
-</tr>
-<tr style="text-align: center; vertical-align: middle;">
-<td>7</td>
-<td>mistral-large-2407</td>
-<td>55.82%</td>
-<td>84.12%</td>
-<td>83.09%</td>
-<td>67.17%</td>
-<td>20.50%</td>
-<td>78.05%</td>
-<td>48.93%</td>
-</tr>
-<tr style="text-align: center; vertical-align: middle;">
-<td>9</td>
-<td>Claude-3.5-Sonnet-20240620</td>
-<td>54.83%</td>
-<td>70.35%</td>
-<td>66.34%</td>
-<td>71.39%</td>
-<td>23.5%</td>
-<td>63.41%</td>
-<td>75.91%</td>
 </tr>
 </tr>
 <tr style="text-align: center; vertical-align: middle; font-weight: bold;">
 <td> </td>
 <td>Arch-Function-1.5B</td>
-<td>53.61%</td>
-<td>82.60%</td>
-<td>87.36%</td>
-<td>68.19%</td>
-<td>8.62%</td>
-<td>87.80%</td>
-<td>75.90%</td>
 </tr>
-<tr style="text-align: center; vertical-align: middle;">
-<td>11</td>
-<td>o1-mini-2024-09-12</td>
-<td>53.43%</td>
-<td>75.48%</td>
-<td>76.86%</td>
-<td>71.17%</td>
-<td>11.00%</td>
-<td>46.34%</td>
-<td>88.07%</td>
 </tr>
-<tr style="text-align: center; vertical-align: middle;">
-<td>12</td>
-<td>Gemini-1.5-Flash-Preview-0514</td>
-<td>53.01%</td>
-<td>77.10%</td>
-<td>71.23%</td>
-<td>71.17%</td>
-<td>13.12%</td>
-<td>60.98%</td>
-<td>76.15%</td>
 </tr>
 </table>
 
 
 # Requirements
-The code of Arch-Function-7B is available in the Hugging Face `transformers` library, and we advise you to install the latest version:
 ```bash
 pip install transformers>=4.37.0
 ```
@@ -192,7 +270,7 @@ import json
 from typing import Any, Dict, List
 from transformers import AutoModelForCausalLM, AutoTokenizer
 
-model_name = "katanemo/Arch-Function-7B"
 model = AutoModelForCausalLM.from_pretrained(
     model_name, device_map="auto", torch_dtype="auto", trust_remote_code=True
 )
@@ -331,4 +409,4 @@ The current temperature in Seattle is 62 degrees in Fahrenheit.
 
 
 # License
-The Katanemo Arch-Function collection is distributed under the [Katanemo license](https://huggingface.co/katanemolabs/Arch-Function-7B.gguf/blob/main/LICENSE).
 
 ---
 license: other
 license_name: katanemo-research
+license_link: >-
+  https://huggingface.co/katanemolabs/Arch-Function-1.5B/blob/main/LICENSE
 base_model:
+- Qwen/Qwen2.5-1.5B-Instruct
 language:
 - en
 pipeline_tag: text-generation
 library_name: transformers
 ---
 
+# katanemo/Arch-Function-1.5B
 
 ## Overview
 The Katanemo Arch-Function collection of large language models (LLMs) is a collection of state-of-the-art (SOTA) LLMs specifically designed for **function calling** tasks. The models are designed to understand complex function signatures, identify required parameters, and produce accurate function call outputs based on natural language prompts. Achieving performance on par with GPT-4, these models set a new benchmark in the domain of function-oriented tasks, making them suitable for scenarios where automated API interaction and function execution are crucial.

 
 
 ## Performance Benchmarks
+We evaluate Katanemo Arch-Function series on the [Berkeley Function-Calling Leaderboard (BFCL)](https://gorilla.cs.berkeley.edu/leaderboard.html#leaderboard). For each model family, we select the one with the highest rank. The results (as of Oct 21st, 2024) are shown below:
 
 <table>
 <tr style="text-align: center; vertical-align: middle; font-weight: bold;">

 </tr>
 <tr style="text-align: center; vertical-align: middle;">
 <td>1</td>
+<td>GPT-4o-2024-08-06 (FC)</td>
+<td>62.19%</td>
+<td>85.90%</td>
+<td>85.64%</td>
+<td>75.43%</td>
+<td>25.00%</td>
+<td>63.41%</td>
+<td>82.93%</td>
+</tr>
+<tr style="text-align: center; vertical-align: middle;">
+<td>2</td>
+<td>Functionary-Medium-v3.1 (FC)</td>
+<td>62.02%</td>
+<td>89.52%</td>
+<td>89.77%</td>
+<td>73.48%</td>
+<td>23.50%</td>
 <td>70.73%</td>
+<td>73.32%</td>
 </tr>
 <tr style="text-align: center; vertical-align: middle;">
+<td>5</td>
+<td>ToolACE-8B (FC)</td>
+<td>60.44%</td>
+<td>87.06%</td>
+<td>89.52%</td>
+<td>74.99%</td>
+<td>17.38%</td>
+<td>80.49%</td>
+<td>85.71%</td>
+</tr>
+<tr style="text-align: center; vertical-align: middle;">
+<td>6</td>
+<td>o1-preview-2024-09-12 (Prompt)</td>
+<td>59.27%</td>
+<td>86.42%</td>
+<td>88.88%</td>
+<td>73.08%</td>
+<td>17.62%</td>
+<td>73.17%</td>
+<td>74.60%</td>
 </tr>
 <tr style="text-align: center; vertical-align: middle; font-weight: bold;">
 <td> </td>
 <td>Arch-Function-7B</td>
+<td>58.44%</td>
+<td>85.58%</td>
+<td>88.14%</td>
+<td>69.08%</td>
+<td>20.50%</td>
+<td>92.68%</td>
+<td>74.05%</td>
+</tr>
+<tr style="text-align: center; vertical-align: middle; ">
+<td>8</td>
+<td>xLAM-8x22b-r (FC)</td>
+<td>57.99%</td>
+<td>88.15%</td>
+<td>90.11%</td>
+<td>71.97%</td>
+<td>14.50%</td>
+<td>85.37%</td>
+<td>67.29%</td>
+</tr>
+<tr style="text-align: center; vertical-align: middle; ">
+<td>9</td>
+<td>Gemini-1.5-Flash-002 (Prompt)</td>
+<td>57.92%</td>
+<td>86.58%</td>
+<td>89.48%</td>
+<td>76.28%</td>
+<td>9.88%</td>
+<td>85.37%</td>
+<td>78.54%</td>
+</tr>
+<tr style="text-align: center; vertical-align: middle; ">
+<td>10</td>
+<td>Hammer2.0-7b (FC)</td>
+<td>57.69%</td>
+<td>90.27%</td>
+<td>89.25%</td>
+<td>69.79%</td>
+<td>14.75%</td>
+<td>95.12%</td>
+<td>68.46%</td>
+</tr>
+<tr style="text-align: center; vertical-align: middle; ">
+<td>12</td>
+<td>Claude-3.5-Sonnet-20240620 (FC)</td>
+<td>57.42%</td>
+<td>70.04%</td>
+<td>66.27%</td>
+<td>74.68%</td>
+<td>28.38%</td>
+<td>68.29%</td>
+<td>74.58%</td>
+</tr>
+<tr style="text-align: center; vertical-align: middle; ">
+<td>13</td>
+<td>mistral-large-2407 (FC)</td>
+<td>56.80%</td>
+<td>86.62%</td>
+<td>84.57%</td>
+<td>68.37%</td>
+<td>20.62%</td>
+<td>75.61%</td>
+<td>49.44%</td>
 </tr>
 <tr style="text-align: center; vertical-align: middle; font-weight: bold;">
 <td> </td>
 <td>Arch-Function-3B</td>
+<td>56.57%</td>
+<td>83.62%</td>
+<td>85.36%</td>
+<td>66.90%</td>
+<td>19.50%</td>
+<td>97.56%</td>
+<td>70.99%</td>
 </tr>
 </tr>
 <tr style="text-align: center; vertical-align: middle; font-weight: bold;">
 <td> </td>
 <td>Arch-Function-1.5B</td>
+<td>54.52%</td>
+<td>80.31%</td>
+<td>82.04%</td>
+<td>66.19%</td>
+<td>17.25%</td>
+<td>97.56%</td>
+<td>69.95%</td>
 </tr>
+<tr style="text-align: center; vertical-align: middle; ">
+<td>19</td>
+<td>xLAM-7b-r (FC)</td>
+<td>54.41%</td>
+<td>81.40%</td>
+<td>83.46%</td>
+<td>67.88%</td>
+<td>14.50%</td>
+<td>97.56%</td>
+<td>64.05%</td>
 </tr>
+<tr style="text-align: center; vertical-align: middle; ">
+<td>20</td>
+<td>Qwen2.5-7B-Instruct (Prompt)</td>
+<td>54.27%</td>
+<td>85.79%</td>
+<td>88.13%</td>
+<td>65.97%</td>
+<td>11.25%</td>
+<td>92.68%</td>
+<td>64.95%</td>
+</tr>
+<tr style="text-align: center; vertical-align: middle; ">
+<td>21</td>
+<td>Llama-3.1-70B-Instruct (Prompt)</td>
+<td>53.67%</td>
+<td>88.90%</td>
+<td>89.34%</td>
+<td>61.13%</td>
+<td>12.38%</td>
+<td>92.68%</td>
+<td>58.38%</td>
+</tr>
+<tr style="text-align: center; vertical-align: middle; ">
+<td>22</td>
+<td>Gemma-2-27b-it (Prompt)</td>
+<td>53.66%</td>
+<td>88.52%</td>
+<td>87.89%</td>
+<td>69.48%</td>
+<td>4.12%</td>
+<td>87.8%</td>
+<td>68.76%</td>
 </tr>
 </table>
 
 
 # Requirements
+The code of Arch-Function-1.5B is available in the Hugging Face `transformers` library, and we advise you to install the latest version:
 ```bash
 pip install transformers>=4.37.0
 ```

 from typing import Any, Dict, List
 from transformers import AutoModelForCausalLM, AutoTokenizer
 
+model_name = "katanemo/Arch-Function-1.5B"
 model = AutoModelForCausalLM.from_pretrained(
     model_name, device_map="auto", torch_dtype="auto", trust_remote_code=True
 )

 
 
 # License
+The Katanemo Arch-Function collection is distributed under the [Katanemo license](https://huggingface.co/katanemolabs/Arch-Function-1.5B/blob/main/LICENSE).