Use your ChatGPT Plus/Pro subscription with OpenAI API clients (like gptel) via the Codex API.

- Build:

  ```sh
  go build -o chatgpt-proxy
  ```

- Authenticate (one-time):

  ```sh
  ./chatgpt-proxy auth
  ```

  Open the URL in your browser and log in with your ChatGPT account.

- Start the proxy:

  ```sh
  ./chatgpt-proxy serve
  ```

- Configure gptel:

  ```elisp
  (gptel-make-openai "ChatGPT"
    :host "localhost:8080"
    :protocol "http"
    :key "dummy"
    :models '(gpt-5.3-codex)
    :stream t)

  (setq gptel-backend (gptel-get-backend "ChatGPT")
        gptel-model 'gpt-5.3-codex)
  ```
```sh
# Authenticate with ChatGPT
./chatgpt-proxy auth

# Start the proxy on the default port (8080)
./chatgpt-proxy serve

# Start the proxy on a custom port
./chatgpt-proxy serve :9090

# Enable debug logging
LOG_LEVEL=debug ./chatgpt-proxy serve
```

- Transforms OpenAI API format to ChatGPT Codex format
- Streaming responses
- Tool/function calling with parallel tool calls
- Automatic OAuth token refresh
- Tokens stored in `~/.config/chatgpt-proxy/tokens.json`
Edit the service file first to set your username, then:

```sh
sudo cp chatgpt-proxy /usr/local/bin/
sudo cp chatgpt-proxy.service /etc/systemd/system/
sudo systemctl daemon-reload
sudo systemctl enable --now chatgpt-proxy
```
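The repository's `chatgpt-proxy.service` is not reproduced here; a hypothetical unit along these lines shows which parts need editing (the `User=` value and the binary path are the ones to adjust):

```ini
[Unit]
Description=ChatGPT Codex proxy
After=network-online.target

[Service]
# Set this to your own username before installing the unit.
User=youruser
ExecStart=/usr/local/bin/chatgpt-proxy serve
Restart=on-failure

[Install]
WantedBy=multi-user.target
```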
gptel-codex.el is a pure Emacs Lisp backend that talks directly to the Codex API without the proxy. It still requires running `chatgpt-proxy auth` once to obtain tokens.
```elisp
(add-to-list 'load-path "/path/to/chatgpt-proxy")
(require 'gptel-codex)

;; Create the backend
(gptel-make-codex "Codex"
  :models '(gpt-5.3-codex)
  :stream t)

;; Use it
(setq gptel-backend (gptel-get-backend "Codex")
      gptel-model 'gpt-5.3-codex)
```

Both the proxy and the direct backend support gptel's tool calling:
```elisp
(setq gptel-tools
      (list
       (gptel-make-tool
        :name "get_weather"
        :description "Get weather for a location"
        :args '((:name "location" :type "string"))
        :function (lambda (loc) (format "Weather in %s: Sunny, 22C" loc)))))

(setq gptel-use-tools t)
;; Now gptel-send will use tools automatically
```

Model availability on the Codex API depends on your ChatGPT subscription. Use the full model name with the `-codex` suffix:
- `gpt-5.3-codex` (ChatGPT Plus/Pro)
MIT