Hi,
I have been investigating using Ollama to work with an LLM locally. I have created an ASP.NET Core Web API app that exposes plugin methods, and I use ImportPluginFromOpenApiAsync to interact with the plugins.

Everything works as expected with AzureOpenAIChatCompletion. With AddOllamaChatCompletion, however, the plugins are not hit when they are loaded via ImportPluginFromOpenApiAsync. If I load the plugin with kernel.Plugins.AddFromType, the plugin method is hit as expected.

Note: the Ollama chat completion is added as below:

builder.AddOllamaChatCompletion("qwen2", endpoint: new Uri("http://localhost:11434"));

Is ImportPluginFromOpenApiAsync supported when using Ollama?
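For reference, a minimal sketch of the setup described above. The OpenAPI spec URL, plugin name, and prompt are illustrative placeholders, and the FunctionChoiceBehavior settings shown are an assumption about how function calling is enabled in this scenario:

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Plugins.OpenApi;

var builder = Kernel.CreateBuilder();

// Ollama connector, exactly as in the question.
builder.AddOllamaChatCompletion("qwen2", endpoint: new Uri("http://localhost:11434"));

var kernel = builder.Build();

// This path works: plugin registered directly from a type is invoked.
// kernel.Plugins.AddFromType<MyPlugin>();

// This path is not hit with Ollama: plugin imported from an OpenAPI spec
// (spec URL below is a placeholder for the Web API app's swagger endpoint).
await kernel.ImportPluginFromOpenApiAsync(
    pluginName: "myApi",
    uri: new Uri("http://localhost:5000/swagger/v1/swagger.json"));

// Function calling enabled via execution settings (illustrative prompt).
var settings = new PromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};
var result = await kernel.InvokePromptAsync(
    "Call the plugin to answer this question.",
    new KernelArguments(settings));
```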
Thanks,
Euan