Describe the bug
When calling agent.InvokeAsync against the o1 model (version 2024-12-17) in the Azure OpenAI service, you get the error:
"HTTP 400 (BadRequest)\r\n\r\nModel o1 is enabled only for api versions 2024-12-01-preview and later"
To Reproduce
Steps to reproduce the behavior:
- Create a kernel chat completion service that targets o1 as the deployment model
- Create an agent (with instructions, temperature, and structured output)
- Call agent.InvokeAsync
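The steps above can be sketched roughly as follows. This is a minimal illustration, not the exact failing code: the endpoint, key, agent name, prompt, and the `IncidentReport` type are placeholders, and it assumes the Semantic Kernel `ChatCompletionAgent` API as of the 1.33.0 package.

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Agents;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// 1. Kernel chat completion service targeting the o1 deployment
var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion(
        deploymentName: "o1",                        // o1, model version 2024-12-17
        endpoint: "https://<resource>.openai.azure.com/",  // placeholder
        apiKey: "<api-key>")                         // placeholder
    .Build();

// 2. Agent with instructions, temperature, and structured output
ChatCompletionAgent agent = new()
{
    Name = "ReportAgent",                            // placeholder
    Instructions = "Answer with the requested JSON report.",
    Kernel = kernel,
    Arguments = new KernelArguments(new OpenAIPromptExecutionSettings
    {
        Temperature = 1,
        ResponseFormat = typeof(IncidentReport),     // structured output
    }),
};

// 3. Invoke the agent; this is where the HTTP 400
// "Model o1 is enabled only for api versions 2024-12-01-preview and later" surfaces
var chat = new ChatHistory();
chat.AddUserMessage("Summarize the incident as JSON.");
await foreach (var message in agent.InvokeAsync(chat))
{
    Console.WriteLine(message.Content);
}

// Hypothetical structured-output type used only for this sketch
public sealed class IncidentReport
{
    public string Summary { get; set; } = string.Empty;
}
```

Running this requires a live Azure OpenAI resource with an o1 deployment, so it is only a sketch of the call pattern that triggers the error.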
Expected behavior
A normal LLM response (in this case JSON, due to structured output)
Platform
- OS: Windows
- IDE: Visual Studio
- Language: C#
- Source: NuGet package version 1.33.0
Additional context
The issue is probably the transitive use of the Azure.AI.OpenAI beta2 package. I tried manually bumping it to the latest version, but that resulted in a feature-not-implemented exception.