minikomi/chatgpt-codex-proxy

ChatGPT Codex Proxy

Use your ChatGPT Plus/Pro subscription with OpenAI API clients (like gptel) via the Codex API.

Quick Start

  1. Build:

    go build -o chatgpt-proxy
  2. Authenticate (one-time):

    ./chatgpt-proxy auth

    Open the URL in your browser and log in with your ChatGPT account.

  3. Start the proxy:

    ./chatgpt-proxy serve
  4. Configure gptel:

    (gptel-make-openai "ChatGPT"
      :host "localhost:8080"
      :protocol "http"
      :key "dummy"
      :models '(gpt-5.3-codex)
      :stream t)
    
    (setq gptel-backend (gptel-get-backend "ChatGPT")
          gptel-model 'gpt-5.3-codex)

Usage

# Authenticate with ChatGPT
./chatgpt-proxy auth

# Start proxy on default port (8080)
./chatgpt-proxy serve

# Start proxy on custom port
./chatgpt-proxy serve :9090

# Enable debug logging
LOG_LEVEL=debug ./chatgpt-proxy serve

Features

  • Transforms OpenAI API format to ChatGPT Codex format
  • Streaming responses
  • Tool/function calling with parallel tool calls
  • Automatic OAuth token refresh
  • Tokens stored in ~/.config/chatgpt-proxy/tokens.json
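
Because the proxy speaks the OpenAI chat-completions format, any OpenAI-style client can talk to it, not just gptel. As a smoke test, here is a hedged curl example; it assumes the proxy is running on localhost:8080 and serves the standard /v1/chat/completions path (an assumption based on gptel's OpenAI backend defaults, which this README configures):

```shell
# Hypothetical smoke test: assumes the proxy is running on localhost:8080
# and exposes an OpenAI-compatible /v1/chat/completions endpoint.
# The API key is ignored by the proxy ("dummy" works, as in the gptel config).
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer dummy" \
  -d '{
    "model": "gpt-5.3-codex",
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```

If the proxy is up and authenticated, this should return an OpenAI-format chat completion object.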

Install as Service

sudo cp chatgpt-proxy /usr/local/bin/
sudo cp chatgpt-proxy.service /etc/systemd/system/
sudo systemctl daemon-reload
sudo systemctl enable --now chatgpt-proxy

Before enabling the service, edit chatgpt-proxy.service to set your username.
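
The chatgpt-proxy.service file shipped in the repository is authoritative; as a rough sketch of what such a unit typically contains, the line to adjust is User=, so the proxy runs as your user and can read tokens from that user's ~/.config/chatgpt-proxy/:

```ini
# Hypothetical sketch of a systemd unit for the proxy; the repository's
# chatgpt-proxy.service is the real one. Replace "youruser" with your username.
[Unit]
Description=ChatGPT Codex Proxy
After=network-online.target

[Service]
User=youruser
ExecStart=/usr/local/bin/chatgpt-proxy serve
Restart=on-failure

[Install]
WantedBy=multi-user.target
```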

gptel-codex.el (Direct Backend)

gptel-codex.el is a pure Emacs Lisp backend that talks to the Codex API directly, without running the proxy. You still need to run chatgpt-proxy auth once to obtain tokens.

(add-to-list 'load-path "/path/to/chatgpt-proxy")
(require 'gptel-codex)

;; Create the backend
(gptel-make-codex "Codex"
  :models '(gpt-5.3-codex)
  :stream t)

;; Use it
(setq gptel-backend (gptel-get-backend "Codex")
      gptel-model 'gpt-5.3-codex)

With Tools

Both the proxy and direct backend support gptel's tool calling:

(setq gptel-tools
  (list
   (gptel-make-tool
    :name "get_weather"
    :description "Get weather for a location"
    :args '((:name "location" :type "string"))
    :function (lambda (loc) (format "Weather in %s: Sunny, 22C" loc)))))

(setq gptel-use-tools t)

;; Now gptel-send will use tools automatically

Available Models

Model availability through the Codex API depends on your ChatGPT subscription. Use the full model name with the -codex suffix:

  • gpt-5.3-codex (ChatGPT Plus/Pro)

License

MIT
