Does this Phi3 onnx model support function calling?
#16
by soulyet - opened
I use Phi-3-mini-4k-instruct-onnx (DirectML) as my local LLM provider, together with Semantic Kernel for function calling, but it looks like no function is ever called. I want to know whether this is model-related or caused by something else.
It may be an issue with how the ONNX model is integrated into Semantic Kernel. You can raise a GitHub issue in the Semantic Kernel repo for more assistance.
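For context, "function calling" only works when the model itself emits a structured tool call that the host framework can parse and dispatch; if the model was not trained to produce that structure, the framework silently falls back to plain text and no function runs. Below is a minimal, framework-agnostic sketch of that dispatch loop. All names here (`get_weather`, `dispatch_tool_call`, the JSON shape) are illustrative assumptions, not Semantic Kernel's actual API.

```python
import json

def get_weather(city: str) -> str:
    """Example tool the application registers for the model to call (hypothetical)."""
    return f"Sunny in {city}"

# Registry mapping tool names to callables.
TOOLS = {"get_weather": get_weather}

def dispatch_tool_call(model_output: str) -> str:
    """Parse a JSON tool call emitted by the model and invoke the matching tool.

    If the model never emits this structure (e.g. it was not fine-tuned
    for tool use), the output is treated as a plain-text answer and no
    function is ever called -- which matches the symptom described above.
    """
    try:
        call = json.loads(model_output)
        fn = TOOLS[call["name"]]
        return fn(**call["arguments"])
    except (json.JSONDecodeError, KeyError, TypeError):
        # Plain prose: the model answered directly instead of calling a tool.
        return model_output

# A model with function-calling support might emit structured JSON:
print(dispatch_tool_call('{"name": "get_weather", "arguments": {"city": "Paris"}}'))
# A model without it returns prose, so nothing is dispatched:
print(dispatch_tool_call("The weather in Paris is sunny."))
```

So even with Semantic Kernel configured correctly, the question of whether the model was trained to emit tool calls is a separate, model-side concern.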
kvaishnavi changed discussion status to closed