
fix: increase Vercel maxDuration to 300s for AI chat operations#1062

Merged

hotlong merged 1 commit into main from claude/fix-vercel-runtime-timeout on Apr 2, 2026

Conversation

Contributor

@Claude Claude AI commented Apr 2, 2026

The AI chat endpoint was timing out after 60 seconds on the Vercel deployment (play.objectstack.ai/api/v1/ai/chat).

Changes

  • apps/studio/vercel.json: Increased maxDuration from 60s to 300s (5 minutes)
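
For reference, the resulting `apps/studio/vercel.json` fragment would look roughly like this. This is a sketch based on the `api/**/*.js` function glob mentioned in the review summary; the actual file may contain additional configuration:

```json
{
  "functions": {
    "api/**/*.js": {
      "maxDuration": 300
    }
  }
}
```

Vercel applies the `maxDuration` limit per matching Serverless Function, so every route under `api/` gets the extended 300s window, not just the chat endpoint.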

Context

AI chat operations with tool calling frequently exceed 60s due to:

  • Multiple LLM round trips for tool execution
  • Agent planning iterations (maxIterations)
  • Tool registry execution overhead
  • LLM provider latency

300s provides sufficient headroom while staying within Vercel Enterprise tier limits (900s max).


vercel bot commented Apr 2, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

| Project | Deployment | Updated (UTC) |
| --- | --- | --- |
| objectstack-play | Building (Preview) | Apr 2, 2026 9:42am |
| spec | Building (Preview) | Apr 2, 2026 9:42am |

Request Review

Copilot AI review requested due to automatic review settings April 2, 2026 09:42
@hotlong hotlong merged commit 2477741 into main Apr 2, 2026
7 of 9 checks passed
Contributor

Copilot AI left a comment


Pull request overview

Increases the Vercel Serverless Function timeout for the Studio API to prevent play.objectstack.ai/api/v1/ai/chat from timing out during longer AI chat sessions with tool-calling and agent iteration.

Changes:

  • Increase Vercel maxDuration for api/**/*.js from 60s to 300s.



3 participants