
Commit 5fb4b3b (1 parent: 0f700a6)
Authored by kunalk16 and Copilot

feat(provider): add support for azure openai provider (#1422)

* Add support for azure openai provider
* Add checks for deployment model name
* Apply suggestion from @Copilot
* Addressing @Copilot suggestion to remove the init() function which seemed redundant
* Fix readme
* Fix linting checks

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

File tree: 16 files changed (+1446, -323 lines)

README.fr.md (1 addition, 0 deletions)

@@ -991,6 +991,7 @@ Cette conception permet également le **support multi-agent** avec une sélectio
 | **BytePlus** | `byteplus/` | `https://ark.ap-southeast.bytepluses.com/api/v3` | OpenAI | [Obtenir Clé](https://www.byteplus.com/) |
 | **LongCat** | `longcat/` | `https://api.longcat.chat/openai` | OpenAI | [Obtenir une clé](https://longcat.chat/platform) |
 | **ModelScope (魔搭)**| `modelscope/` | `https://api-inference.modelscope.cn/v1` | OpenAI | [Obtenir un Token](https://modelscope.cn/my/tokens) |
+| **Azure OpenAI** | `azure/` | `https://{resource}.openai.azure.com` | Azure | [Obtenir Clé](https://portal.azure.com) |
 | **Antigravity** | `antigravity/` | Google Cloud | Custom | OAuth uniquement |
 | **GitHub Copilot** | `github-copilot/` | `localhost:4321` | gRPC | - |

README.ja.md (1 addition, 0 deletions)

@@ -935,6 +935,7 @@ HEARTBEAT_OK 応答 ユーザーが直接結果を受け取る
 | **BytePlus** | `byteplus/` | `https://ark.ap-southeast.bytepluses.com/api/v3` | OpenAI | [キーを取得](https://www.byteplus.com) |
 | **LongCat** | `longcat/` | `https://api.longcat.chat/openai` | OpenAI | [キーを取得](https://longcat.chat/platform) |
 | **ModelScope (魔搭)**| `modelscope/` | `https://api-inference.modelscope.cn/v1` | OpenAI | [トークンを取得](https://modelscope.cn/my/tokens) |
+| **Azure OpenAI** | `azure/` | `https://{resource}.openai.azure.com` | Azure | [キーを取得](https://portal.azure.com) |
 | **Antigravity** | `antigravity/` | Google Cloud | カスタム | OAuthのみ |
 | **GitHub Copilot** | `github-copilot/` | `localhost:4321` | gRPC | - |

README.md (2 additions, 0 deletions)

@@ -1006,6 +1006,7 @@ The subagent has access to tools (message, web_search, etc.) and can communicate
 | `groq` | LLM + **Voice transcription** (Whisper) | [console.groq.com](https://console.groq.com) |
 | `cerebras` | LLM (Cerebras direct) | [cerebras.ai](https://cerebras.ai) |
 | `vivgrid` | LLM (Vivgrid direct) | [vivgrid.com](https://vivgrid.com) |
+| `azure` | LLM (Azure OpenAI) | [portal.azure.com](https://portal.azure.com) |

 ### Model Configuration (model_list)

@@ -1042,6 +1043,7 @@ This design also enables **multi-agent support** with flexible provider selectio
 | **Vivgrid** | `vivgrid/` | `https://api.vivgrid.com/v1` | OpenAI | [Get Key](https://vivgrid.com) |
 | **LongCat** | `longcat/` | `https://api.longcat.chat/openai` | OpenAI | [Get Key](https://longcat.chat/platform) |
 | **ModelScope (魔搭)**| `modelscope/` | `https://api-inference.modelscope.cn/v1` | OpenAI | [Get Token](https://modelscope.cn/my/tokens) |
+| **Azure OpenAI** | `azure/` | `https://{resource}.openai.azure.com` | Azure | [Get Key](https://portal.azure.com) |
 | **Antigravity** | `antigravity/` | Google Cloud | Custom | OAuth only |
 | **GitHub Copilot** | `github-copilot/` | `localhost:4321` | gRPC | - |

README.pt-br.md (1 addition, 0 deletions)

@@ -987,6 +987,7 @@ Este design também possibilita o **suporte multi-agent** com seleção flexíve
 | **BytePlus** | `byteplus/` | `https://ark.ap-southeast.bytepluses.com/api/v3` | OpenAI | [Obter Chave](https://www.byteplus.com) |
 | **LongCat** | `longcat/` | `https://api.longcat.chat/openai` | OpenAI | [Obter Chave](https://longcat.chat/platform) |
 | **ModelScope (魔搭)**| `modelscope/` | `https://api-inference.modelscope.cn/v1` | OpenAI | [Obter Token](https://modelscope.cn/my/tokens) |
+| **Azure OpenAI** | `azure/` | `https://{resource}.openai.azure.com` | Azure | [Obter Chave](https://portal.azure.com) |
 | **Antigravity** | `antigravity/` | Google Cloud | Custom | Apenas OAuth |
 | **GitHub Copilot** | `github-copilot/` | `localhost:4321` | gRPC | - |

README.vi.md (1 addition, 0 deletions)

@@ -956,6 +956,7 @@ Thiết kế này cũng cho phép **hỗ trợ đa tác nhân** với lựa ch
 | **BytePlus** | `byteplus/` | `https://ark.ap-southeast.bytepluses.com/api/v3` | OpenAI | [Lấy Khóa](https://www.byteplus.com) |
 | **LongCat** | `longcat/` | `https://api.longcat.chat/openai` | OpenAI | [Lấy Key](https://longcat.chat/platform) |
 | **ModelScope (魔搭)**| `modelscope/` | `https://api-inference.modelscope.cn/v1` | OpenAI | [Lấy Token](https://modelscope.cn/my/tokens) |
+| **Azure OpenAI** | `azure/` | `https://{resource}.openai.azure.com` | Azure | [Lấy Khóa](https://portal.azure.com) |
 | **Antigravity** | `antigravity/` | Google Cloud | Tùy chỉnh | Chỉ OAuth |
 | **GitHub Copilot** | `github-copilot/` | `localhost:4321` | gRPC | - |

README.zh.md (1 addition, 0 deletions)

@@ -528,6 +528,7 @@ Agent 读取 HEARTBEAT.md
 | **BytePlus** | `byteplus/` | `https://ark.ap-southeast.bytepluses.com/api/v3` | OpenAI | [获取密钥](https://www.byteplus.com) |
 | **LongCat** | `longcat/` | `https://api.longcat.chat/openai` | OpenAI | [获取密钥](https://longcat.chat/platform) |
 | **ModelScope (魔搭)**| `modelscope/` | `https://api-inference.modelscope.cn/v1` | OpenAI | [获取 Token](https://modelscope.cn/my/tokens) |
+| **Azure OpenAI** | `azure/` | `https://{resource}.openai.azure.com` | Azure | [获取密钥](https://portal.azure.com) |
 | **Antigravity** | `antigravity/` | Google Cloud | 自定义 | 仅 OAuth |
 | **GitHub Copilot** | `github-copilot/` | `localhost:4321` | gRPC | - |

config/config.example.json (6 additions, 0 deletions)

@@ -53,6 +53,12 @@
       "api_key": "your-modelscope-access-token",
       "api_base": "https://api-inference.modelscope.cn/v1"
     },
+    {
+      "model_name": "azure-gpt5",
+      "model": "azure/my-gpt5-deployment",
+      "api_key": "your-azure-api-key",
+      "api_base": "https://your-resource.openai.azure.com"
+    },
     {
       "model_name": "loadbalanced-gpt-5.4",
       "model": "openai/gpt-5.4",
pkg/config/defaults.go (9 additions, 0 deletions)

@@ -384,6 +384,15 @@ func DefaultConfig() *Config {
 			APIBase: "http://localhost:8000/v1",
 			APIKey:  "",
 		},
+
+		// Azure OpenAI - https://portal.azure.com
+		// model_name is a user-friendly alias; the model field's path after "azure/" is your deployment name
+		{
+			ModelName: "azure-gpt5",
+			Model:     "azure/my-gpt5-deployment",
+			APIBase:   "https://your-resource.openai.azure.com",
+			APIKey:    "",
+		},
 	},
 	Gateway: GatewayConfig{
 		Host: "127.0.0.1",

pkg/providers/azure/provider.go (150 additions, 0 deletions; new file)

package azure

import (
	"bytes"
	"context"
	"encoding/json"
	"fmt"
	"net/http"
	"net/url"
	"strings"
	"time"

	"github.com/sipeed/picoclaw/pkg/providers/common"
	"github.com/sipeed/picoclaw/pkg/providers/protocoltypes"
)

type (
	LLMResponse    = protocoltypes.LLMResponse
	Message        = protocoltypes.Message
	ToolDefinition = protocoltypes.ToolDefinition
)

const (
	// azureAPIVersion is the Azure OpenAI API version used for all requests.
	azureAPIVersion       = "2024-10-21"
	defaultRequestTimeout = common.DefaultRequestTimeout
)

// Provider implements the LLM provider interface for Azure OpenAI endpoints.
// It handles Azure-specific authentication (api-key header), URL construction
// (deployment-based), and request body formatting (max_completion_tokens, no model field).
type Provider struct {
	apiKey     string
	apiBase    string
	httpClient *http.Client
}

// Option configures the Azure Provider.
type Option func(*Provider)

// WithRequestTimeout sets the HTTP request timeout.
func WithRequestTimeout(timeout time.Duration) Option {
	return func(p *Provider) {
		if timeout > 0 {
			p.httpClient.Timeout = timeout
		}
	}
}

// NewProvider creates a new Azure OpenAI provider.
func NewProvider(apiKey, apiBase, proxy string, opts ...Option) *Provider {
	p := &Provider{
		apiKey:     apiKey,
		apiBase:    strings.TrimRight(apiBase, "/"),
		httpClient: common.NewHTTPClient(proxy),
	}

	for _, opt := range opts {
		if opt != nil {
			opt(p)
		}
	}

	return p
}

// NewProviderWithTimeout creates a new Azure OpenAI provider with a custom request timeout in seconds.
func NewProviderWithTimeout(apiKey, apiBase, proxy string, requestTimeoutSeconds int) *Provider {
	return NewProvider(
		apiKey, apiBase, proxy,
		WithRequestTimeout(time.Duration(requestTimeoutSeconds)*time.Second),
	)
}

// Chat sends a chat completion request to the Azure OpenAI endpoint.
// The model parameter is used as the Azure deployment name in the URL.
func (p *Provider) Chat(
	ctx context.Context,
	messages []Message,
	tools []ToolDefinition,
	model string,
	options map[string]any,
) (*LLMResponse, error) {
	if p.apiBase == "" {
		return nil, fmt.Errorf("Azure API base not configured")
	}

	// model is the deployment name for Azure OpenAI
	deployment := model

	// Build Azure-specific URL safely using url.JoinPath and query encoding
	// to prevent path traversal or query injection via deployment names.
	base, err := url.JoinPath(p.apiBase, "openai/deployments", deployment, "chat/completions")
	if err != nil {
		return nil, fmt.Errorf("failed to build Azure request URL: %w", err)
	}
	requestURL := base + "?api-version=" + azureAPIVersion

	// Build request body — no "model" field (Azure infers from deployment URL)
	requestBody := map[string]any{
		"messages": common.SerializeMessages(messages),
	}

	if len(tools) > 0 {
		requestBody["tools"] = tools
		requestBody["tool_choice"] = "auto"
	}

	// Azure OpenAI always uses max_completion_tokens
	if maxTokens, ok := common.AsInt(options["max_tokens"]); ok {
		requestBody["max_completion_tokens"] = maxTokens
	}

	if temperature, ok := common.AsFloat(options["temperature"]); ok {
		requestBody["temperature"] = temperature
	}

	jsonData, err := json.Marshal(requestBody)
	if err != nil {
		return nil, fmt.Errorf("failed to marshal request: %w", err)
	}

	req, err := http.NewRequestWithContext(ctx, "POST", requestURL, bytes.NewReader(jsonData))
	if err != nil {
		return nil, fmt.Errorf("failed to create request: %w", err)
	}

	// Azure uses api-key header instead of Authorization: Bearer
	req.Header.Set("Content-Type", "application/json")
	if p.apiKey != "" {
		req.Header.Set("api-key", p.apiKey)
	}

	resp, err := p.httpClient.Do(req)
	if err != nil {
		return nil, fmt.Errorf("failed to send request: %w", err)
	}
	defer resp.Body.Close()

	if resp.StatusCode != http.StatusOK {
		return nil, common.HandleErrorResponse(resp, p.apiBase)
	}

	return common.ReadAndParseResponse(resp, p.apiBase)
}

// GetDefaultModel returns an empty string as Azure deployments are user-configured.
func (p *Provider) GetDefaultModel() string {
	return ""
}
