
fix: use static anthropic oauth models #1000

Merged
zerob13 merged 1 commit into dev from codex/update-anthropicprovider-to-return-static-models
Oct 14, 2025

Conversation

zerob13 (Collaborator) commented Oct 14, 2025

Summary

  • avoid calling the Anthropic models API when OAuth authentication is active
  • return a predefined model list that mirrors the official response structure in OAuth mode
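The behavior described above can be sketched as follows. This is a minimal illustration, not the actual DeepChat implementation: `fetchProviderModels`, `OAUTH_MODEL_LIST`, and the `/v1/models` endpoint come from this PR, while the `listViaSdk` parameter and the exact function signature are assumptions made for a self-contained example (the abbreviated list shows one entry; the full list appears in the diff further down).

```typescript
// Sketch of the OAuth-vs-API-key branching this PR introduces.
// Names mirror the PR where shown; listViaSdk is a hypothetical stand-in
// for the Anthropic SDK model-listing call used in API key mode.
interface ModelEntry {
  id: string
  display_name: string
  created_at: string
  type: 'model'
}

// Abbreviated static list mirroring the official response entry structure
const OAUTH_MODEL_LIST: ModelEntry[] = [
  {
    id: 'claude-sonnet-4-5-20250929',
    display_name: 'Claude Sonnet 4.5',
    created_at: '2025-09-29T00:00:00Z',
    type: 'model'
  }
]

async function fetchProviderModels(
  isOAuthMode: boolean,
  listViaSdk: () => Promise<ModelEntry[]>
): Promise<ModelEntry[]> {
  if (isOAuthMode) {
    // OAuth mode: return the static list, no network call to /v1/models
    return OAUTH_MODEL_LIST
  }
  // API key mode: unchanged, still lists models via the SDK
  return listViaSdk()
}
```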

Testing

  • pnpm run typecheck:node
  • pnpm run typecheck:web

https://chatgpt.com/codex/tasks/task_e_68ee0889ae54832c8d3922a8b29c0953

Summary by CodeRabbit

  • Refactor

    • OAuth-based model listing for the Anthropic provider now uses a predefined set of supported Claude models, resulting in faster and more consistent model availability.
    • API key–based model listing remains unchanged.
  • Chores

    • Streamlined the OAuth model retrieval path to reduce variability and improve reliability in model selection.

coderabbitai bot (Contributor) commented Oct 14, 2025

Walkthrough

Replaces OAuth model retrieval in anthropicProvider.ts with a predefined in-code model list, bypassing the previous live /v1/models call. API Key mode behavior remains unchanged and continues to list models through the Anthropic SDK.

Changes

Cohort: Anthropic OAuth model retrieval
File(s): src/main/presenter/llmProviderPresenter/providers/anthropicProvider.ts
Summary: Added OAUTH_MODEL_LIST with Claude models and metadata. Modified fetchProviderModels to return this list in OAuth mode instead of calling /v1/models. API Key flow unchanged (still lists via the SDK).

Sequence Diagram(s)

```mermaid
sequenceDiagram
  autonumber
  participant UI as UI
  participant Presenter as LLMProviderPresenter
  participant Provider as AnthropicProvider
  participant SDK as Anthropic SDK
  participant OAuthAPI as OAuth /v1/models (previous)

  UI->>Presenter: Request provider models
  Presenter->>Provider: fetchProviderModels(authMode)

  alt OAuth mode
    note over Provider: New behavior
    Provider-->>Presenter: Return OAUTH_MODEL_LIST (hardcoded)
    note right of Presenter: No network call
  else API Key mode
    note over Provider: Unchanged behavior
    Provider->>SDK: listModels()
    SDK-->>Provider: Models
    Provider-->>Presenter: Models
  end

  rect rgba(255,235,205,0.4)
  note over OAuthAPI: Previous OAuth flow (removed)
  Provider -x OAuthAPI: (was) GET /v1/models
  end

  Presenter-->>UI: Models
```

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

Poem

A bunny taps keys with a caffeinated grin,
Swaps live calls for a basket built-in.
OAuth now nibbles a curated list,
API keys hop as they always did persist.
Thump-thump—models fetched without a sprint! 🐇✨

Pre-merge checks and finishing touches

✅ Passed checks (3 passed)
  • Description Check: ✅ Passed. Check skipped; CodeRabbit's high-level summary is enabled.
  • Title Check: ✅ Passed. The title succinctly describes the core change of using a static list for Anthropic OAuth models and follows the conventional commit style, keeping it focused on the primary fix without unnecessary detail.
  • Docstring Coverage: ✅ Passed. No functions found in the changes; docstring coverage check skipped.
✨ Finishing touches
  • 📝 Generate docstrings
  • 🧪 Generate unit tests (beta)
    • Create PR with unit tests
    • Post copyable unit tests in a comment
    • Commit unit tests in branch codex/update-anthropicprovider-to-return-static-models


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🧹 Nitpick comments (1)
src/main/presenter/llmProviderPresenter/providers/anthropicProvider.ts (1)

173-175: LGTM! Consider adding a comment explaining the OAuth behavior.

The change correctly implements the PR objective to avoid calling the Anthropic models API in OAuth mode by using the predefined static list.

Optionally, add a brief comment to explain the rationale:

```diff
     if (this.isOAuthMode) {
-      // OAuth mode: use predefined model list to avoid API calls
+      // OAuth mode: use predefined model list
+      // OAuth tokens may lack permission to access /v1/models endpoint
       models = OAUTH_MODEL_LIST
```
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 84d1a35 and 9f6822b.

📒 Files selected for processing (1)
  • src/main/presenter/llmProviderPresenter/providers/anthropicProvider.ts (2 hunks)
🧰 Additional context used
📓 Path-based instructions (11)
**/*.{js,jsx,ts,tsx}

📄 CodeRabbit inference engine (.cursor/rules/development-setup.mdc)

**/*.{js,jsx,ts,tsx}: Use OxLint for code linting
Write logs and comments in English

Files:

  • src/main/presenter/llmProviderPresenter/providers/anthropicProvider.ts
src/{main,renderer}/**/*.ts

📄 CodeRabbit inference engine (.cursor/rules/electron-best-practices.mdc)

src/{main,renderer}/**/*.ts: Use context isolation for improved security
Implement proper inter-process communication (IPC) patterns
Optimize application startup time with lazy loading
Implement proper error handling and logging for debugging

Files:

  • src/main/presenter/llmProviderPresenter/providers/anthropicProvider.ts
src/main/**/*.ts

📄 CodeRabbit inference engine (.cursor/rules/electron-best-practices.mdc)

Use Electron's built-in APIs for file system and native dialogs

Files:

  • src/main/presenter/llmProviderPresenter/providers/anthropicProvider.ts
**/*.{ts,tsx}

📄 CodeRabbit inference engine (.cursor/rules/error-logging.mdc)

**/*.{ts,tsx}: Always handle potential errors with try-catch
Provide meaningful error messages
Record detailed error logs
Degrade gracefully
Logs should include timestamp, log level, error code, error description, stack trace (where applicable), and relevant context
Log levels should include ERROR, WARN, INFO, DEBUG
Do not swallow errors
Provide user-friendly error messages
Implement an error retry mechanism
Avoid logging sensitive information
Use structured logging
Set appropriate log levels
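The logging guidelines above can be sketched with a small example. This is a hypothetical helper for illustration only, not code from this repository; the `log` and `fetchWithRetry` names and the plain-console transport are assumptions.

```typescript
// Illustrative structured logger: timestamp, level, error code, message,
// and context fields, emitted as one JSON line (no sensitive data logged).
type LogLevel = 'ERROR' | 'WARN' | 'INFO' | 'DEBUG'

function log(
  level: LogLevel,
  code: string,
  message: string,
  context: Record<string, unknown> = {}
): string {
  const entry = { timestamp: new Date().toISOString(), level, code, message, ...context }
  const line = JSON.stringify(entry)
  console.log(line)
  return line
}

// Retry wrapper that logs each failure with context instead of swallowing it,
// and rethrows once retries are exhausted.
async function fetchWithRetry<T>(fn: () => Promise<T>, retries = 2): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn()
    } catch (err) {
      log('WARN', 'FETCH_RETRY', 'attempt failed', { attempt, error: String(err) })
      if (attempt >= retries) throw err
    }
  }
}
```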

Files:

  • src/main/presenter/llmProviderPresenter/providers/anthropicProvider.ts
src/main/presenter/llmProviderPresenter/providers/*.ts

📄 CodeRabbit inference engine (.cursor/rules/llm-agent-loop.mdc)

src/main/presenter/llmProviderPresenter/providers/*.ts: Each file in src/main/presenter/llmProviderPresenter/providers/*.ts should handle interaction with a specific LLM API, including request/response formatting, tool definition conversion, native/non-native tool call management, and standardizing output streams to a common event format.
Provider implementations must use a coreStream method that yields standardized stream events to decouple the main loop from provider-specific details.
The coreStream method in each Provider must perform a single streaming API request per conversation round and must not contain multi-round tool call loop logic.
Provider files should implement helper methods such as formatMessages, convertToProviderTools, parseFunctionCalls, and prepareFunctionCallPrompt as needed for provider-specific logic.
All provider implementations must parse provider-specific data chunks and yield standardized events for text, reasoning, tool calls, usage, errors, stop reasons, and image data.
When a provider does not support native function calling, it must prepare messages using prompt wrapping (e.g., prepareFunctionCallPrompt) before making the API call.
When a provider supports native function calling, MCP tools must be converted to the provider's format (e.g., using convertToProviderTools) and included in the API request.
Provider implementations should aggregate and yield usage events as part of the standardized stream.
Provider implementations should yield image data events in the standardized format when applicable.
Provider implementations should yield reasoning events in the standardized format when applicable.
Provider implementations should yield tool call events (tool_call_start, tool_call_chunk, tool_call_end) in the standardized format.
Provider implementations should yield stop events with appropriate stop_reason in the standardized format.
Provider implementations should yield error events in the standardized format...

Files:

  • src/main/presenter/llmProviderPresenter/providers/anthropicProvider.ts
src/main/**/*.{ts,js,tsx,jsx}

📄 CodeRabbit inference engine (.cursor/rules/project-structure.mdc)

Main-process code goes in src/main

Files:

  • src/main/presenter/llmProviderPresenter/providers/anthropicProvider.ts
**/*.{ts,tsx,js,vue}

📄 CodeRabbit inference engine (CLAUDE.md)

Use English for all logs and comments

Files:

  • src/main/presenter/llmProviderPresenter/providers/anthropicProvider.ts
**/*.{ts,tsx,vue}

📄 CodeRabbit inference engine (CLAUDE.md)

Enable and adhere to strict TypeScript typing (avoid implicit any, prefer precise types)

Use PascalCase for TypeScript types and classes

Files:

  • src/main/presenter/llmProviderPresenter/providers/anthropicProvider.ts
src/main/presenter/**/*.ts

📄 CodeRabbit inference engine (AGENTS.md)

Place Electron main-process presenters under src/main/presenter/ (Window, Tab, Thread, Mcp, Config, LLMProvider)

Files:

  • src/main/presenter/llmProviderPresenter/providers/anthropicProvider.ts
**/*.{ts,tsx,js,jsx,vue,css,scss,md,json,yml,yaml}

📄 CodeRabbit inference engine (AGENTS.md)

Prettier style: single quotes, no semicolons, print width 100; run pnpm run format

Files:

  • src/main/presenter/llmProviderPresenter/providers/anthropicProvider.ts
**/*.{ts,tsx,js,jsx,vue}

📄 CodeRabbit inference engine (AGENTS.md)

**/*.{ts,tsx,js,jsx,vue}: Use OxLint for JS/TS code; keep lint clean
Use camelCase for variables and functions
Use SCREAMING_SNAKE_CASE for constants

Files:

  • src/main/presenter/llmProviderPresenter/providers/anthropicProvider.ts
🧬 Code graph analysis (1)
src/main/presenter/llmProviderPresenter/providers/anthropicProvider.ts (1)
scripts/fetch-provider-db.mjs (1)
  • models (37-37)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (1)
  • GitHub Check: build-check (x64)

Comment on lines +19 to +67
```typescript
const OAUTH_MODEL_LIST = {
  data: [
    {
      created_at: '2025-09-29T00:00:00Z',
      display_name: 'Claude Sonnet 4.5',
      id: 'claude-sonnet-4-5-20250929',
      type: 'model'
    },
    {
      created_at: '2025-05-14T00:00:00Z',
      display_name: 'Claude Sonnet 4',
      id: 'claude-sonnet-4-20250514',
      type: 'model'
    },
    {
      created_at: '2025-05-14T00:00:00Z',
      display_name: 'Claude Opus 4',
      id: 'claude-opus-4-20250514',
      type: 'model'
    },
    {
      created_at: '2025-02-19T00:00:00Z',
      display_name: 'Claude 3.7 Sonnet',
      id: 'claude-3-7-sonnet-20250219',
      type: 'model'
    },
    {
      created_at: '2024-10-22T00:00:00Z',
      display_name: 'Claude 3.5 Sonnet',
      id: 'claude-3-5-sonnet-20241022',
      type: 'model'
    },
    {
      created_at: '2024-10-22T00:00:00Z',
      display_name: 'Claude 3.5 Haiku',
      id: 'claude-3-5-haiku-20241022',
      type: 'model'
    },
    {
      created_at: '2024-06-20T00:00:00Z',
      display_name: 'Claude 3.5 Sonnet (Legacy)',
      id: 'claude-3-5-sonnet-20240620',
      type: 'model'
    }
  ],
  first_id: 'claude-sonnet-4-5-20250929',
  has_more: false,
  last_id: 'claude-3-5-sonnet-20240620'
}
```

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

Add documentation and consider staleness of hardcoded model list.

The static OAUTH_MODEL_LIST introduces a maintenance burden as it requires manual updates when Anthropic releases new models or deprecates old ones. Consider:

  1. Add a comment explaining why OAuth mode uses a static list (e.g., API limitations, OAuth token restrictions)
  2. Add a comment with the last update date
  3. Document the process for keeping this list current

Apply this diff to add documentation:

```diff
+// Static model list for OAuth mode to avoid calling /v1/models API
+// OAuth tokens may have restricted access to the models endpoint
+// Last updated: 2025-10-14
+// TODO: Establish a process to keep this list synchronized with Anthropic's latest models
 const OAUTH_MODEL_LIST = {
   data: [
```

To verify the current model list is accurate and complete, please run:


🌐 Web query:

What are the latest Claude models available from Anthropic as of October 2025? Include model IDs and release dates.

💡 Result:

Here are Anthropic’s latest Claude models available as of October 14, 2025 (model ID — release date). Sources: Anthropic docs and Anthropic announcements. [1][2]

  • Claude Sonnet 4.5 — claude-sonnet-4-5-20250929 — September 29, 2025. [1][2]
  • Claude Opus 4.1 — claude-opus-4-1-20250805 — August 5, 2025. [1]
  • Claude Sonnet 4 — claude-sonnet-4-20250514 — May 14, 2025. [1]
  • Claude Opus 4 — claude-opus-4-20250514 — May 14, 2025. [1]
  • Claude Sonnet 3.7 — claude-3-7-sonnet-20250219 — February 19, 2025. [1]
  • Claude Haiku 3.5 — claude-3-5-haiku-20241022 — October 22, 2024. [1]

If you’d like, I can also list which of these are active vs. deprecated/retired per Anthropic’s deprecation schedule. [1][2]


Update static model list to reflect current Anthropic offerings.

  • Add claude-opus-4-1-20250805 /* Claude Opus 4.1 — Aug 5, 2025 */
  • Remove deprecated claude-3-5-sonnet-20240620 /* Legacy */
  • Add comment:
    // Static OAuth model list – last updated 2025-10-14
    // TODO: establish process to sync with Anthropic’s /v1/models endpoint
🤖 Prompt for AI Agents
In src/main/presenter/llmProviderPresenter/providers/anthropicProvider.ts around
lines 19 to 67, update the static OAUTH_MODEL_LIST: add a new model entry for
claude-opus-4-1-20250805 with created_at '2025-08-05T00:00:00Z', display_name
'Claude Opus 4.1' and type 'model'; remove the deprecated legacy entry with id
'claude-3-5-sonnet-20240620'; update first_id to the newest model id (e.g.,
'claude-sonnet-4-5-20250929' remains or set to the actual newest if you insert a
newer one) and set last_id to the new last model id (e.g.,
'claude-3-5-haiku-20241022'); and add the two comment lines at the top of the
block: "// Static OAuth model list – last updated 2025-10-14" and "// TODO:
establish process to sync with Anthropic’s /v1/models endpoint".
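The update the review suggests can be sketched as follows. This is an illustrative reconstruction only: the model IDs and dates come from the review's web-query result and the AI-agent prompt above, and whether the project adopted this exact list is not shown in this thread.

```typescript
// Static OAuth model list – last updated 2025-10-14 (per the review suggestion)
// TODO: establish process to sync with Anthropic's /v1/models endpoint
// Adds Claude Opus 4.1, drops the deprecated legacy 3.5 Sonnet, updates last_id.
const UPDATED_OAUTH_MODEL_LIST = {
  data: [
    { created_at: '2025-09-29T00:00:00Z', display_name: 'Claude Sonnet 4.5', id: 'claude-sonnet-4-5-20250929', type: 'model' },
    { created_at: '2025-08-05T00:00:00Z', display_name: 'Claude Opus 4.1', id: 'claude-opus-4-1-20250805', type: 'model' },
    { created_at: '2025-05-14T00:00:00Z', display_name: 'Claude Sonnet 4', id: 'claude-sonnet-4-20250514', type: 'model' },
    { created_at: '2025-05-14T00:00:00Z', display_name: 'Claude Opus 4', id: 'claude-opus-4-20250514', type: 'model' },
    { created_at: '2025-02-19T00:00:00Z', display_name: 'Claude 3.7 Sonnet', id: 'claude-3-7-sonnet-20250219', type: 'model' },
    { created_at: '2024-10-22T00:00:00Z', display_name: 'Claude 3.5 Sonnet', id: 'claude-3-5-sonnet-20241022', type: 'model' },
    { created_at: '2024-10-22T00:00:00Z', display_name: 'Claude 3.5 Haiku', id: 'claude-3-5-haiku-20241022', type: 'model' }
  ],
  first_id: 'claude-sonnet-4-5-20250929',
  has_more: false,
  last_id: 'claude-3-5-haiku-20241022'
}
```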

@zerob13 zerob13 merged commit 2cc1463 into dev Oct 14, 2025
2 checks passed
@zerob13 zerob13 deleted the codex/update-anthropicprovider-to-return-static-models branch January 6, 2026 12:18