How to use Semantic Kernel agents with Azure AI Foundry connected resource OpenAI models (not base models) #12951
SamAshmori started this conversation in General
In my Foundry project, I have Azure OpenAI models added as connected resources, not deployed as "base models" in the Foundry Models tab.
When using Foundry base models, I can add them to SK agents via
AddAzureAIInferenceChatCompletion
and the Foundry project endpoint (…/models
), and that works fine. However, for connected Azure OpenAI resources, Foundry provides a method (snippet omitted above) that returns an Azure.AI.OpenAI.ChatClient. This client does go through Foundry (so I still get analytics), but I can't find a straightforward way to register it in SK's agent architecture as an IChatCompletionService.
Is there a way to have chat completions via Foundry connected resources in Semantic Kernel while keeping analytics?
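One direction I have been considering, but have not verified end to end, is the Microsoft.Extensions.AI adapter chain: the `AsIChatClient()` extension (from Microsoft.Extensions.AI.OpenAI) should turn an OpenAI `ChatClient` into an `IChatClient`, and SK's `AsChatCompletionService()` extension should turn that into an `IChatCompletionService` that can be registered on the kernel. A minimal sketch, assuming those adapters accept the client Foundry hands back:

```csharp
using Microsoft.Extensions.AI;                    // AsIChatClient()
using Microsoft.Extensions.DependencyInjection;   // AddSingleton
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;    // AsChatCompletionService()

static Kernel BuildKernel(OpenAI.Chat.ChatClient chatClient)
{
    // chatClient is the client obtained from the Foundry connected resource.
    // Step 1: adapt the OpenAI client to Microsoft.Extensions.AI's IChatClient.
    IChatClient aiChatClient = chatClient.AsIChatClient();

    // Step 2: adapt IChatClient to SK's IChatCompletionService and register it,
    // so agents resolve it like any other chat completion service.
    IKernelBuilder builder = Kernel.CreateBuilder();
    builder.Services.AddSingleton<IChatCompletionService>(
        aiChatClient.AsChatCompletionService());
    return builder.Build();
}
```

Since the underlying `ChatClient` still calls through Foundry, I would expect analytics to be preserved with this approach, but I have not confirmed that.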