chore(v0.2.2): wrap up — README, CHANGELOG, version, gitignore, CLAUDE.md
- README: full rewrite for v0.2.2 — supercar-themed lead paragraph, local LLM
section with igllama usage, session summary example, tests badge (195/195),
Showcase link in docs section, igllama in inspiration table
- CHANGELOG: add [0.2.2] entry covering all changes in this release
- build.zig.zon: bump version 0.2.1 → 0.2.2
- src/main.zig: bump VERSION constant and test 0.2.1 → 0.2.2
- .gitignore: add .powerglide/ (runtime session dir written by powerglide run)
- CLAUDE.md: add igllama section, current version note, update roadmap to
include items 8-10 (igllama, session summary, showcase)
195/195 tests passing.
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
CHANGELOG.md — 14 additions, 0 deletions
All notable changes to the **powerglide** project will be documented in this file.

## [0.2.2] - 2026-03-04

### Added

- **Session summary output** — `powerglide run` now emits a structured completion block showing steps taken, elapsed time, agent/model name, and the `<POWERGLIDE_DONE>` or `<POWERGLIDE_ERROR>` terminal signal
- **igllama port scanning** — `powerglide doctor` scans `:8090`–`:8099` instead of the hardcoded `:8090` and reports all running igllama instances
- **`json_mode` on `OpenAIClient`** — sets `response_format: {"type": "json_object"}` for constrained JSON generation on igllama and other local endpoints
- **Showcase nav link** — added to both `page.shtml` and `home.shtml` nav bars with amber highlight
- **Tests badge** in README (195/195)

### Fixed

- **`Loop step count increments` test** — now uses `/tmp/test_loop_step_count.json` instead of the default `.powerglide/session.json`; it previously picked up real session state from dogfooding runs and failed
- **README** — updated to v0.2.2 with local LLM section, session summary example, igllama integration, Showcase link, and tests badge
README.md

**powerglide** is a high-performance CLI coding agent runtime built in [Zig 0.15.2](https://ziglang.org/). It provides a robust, fault-tolerant execution layer for LLM swarms, designed for autonomous coding tasks that require high throughput and verifiable correctness.

Like a finely tuned transmission at full throttle, **powerglide** swerves through your codebase with precision, force, and grace. It is the layer between you and a swarm of LLM-driven engineers working in parallel — compiled to a single static binary with zero runtime dependencies, and built around one non-negotiable constraint: the agent loop must be reliable enough to run unattended.

The foundation is the **Ralph Loop** — an explicit 11-state machine that sequences every agent action from task intake through tool execution to verified completion. No implicit flow, no silent exits. Every session terminates with `<POWERGLIDE_DONE>` or `<POWERGLIDE_ERROR>`. The loop drives the model; the model does not drive the loop.

```bash
$ powerglide run --agent hephaestus --velocity 2.0 "refactor the auth module to use the new session manager"
```
## Core Pillars

- **The Ralph Loop** 🔄 — Explicit 11-state machine: `idle → load_tasks → pick_task → thinking → tool_call → executing → observing → verify → commit → done`. Every step is auditable; every session ends with a deterministic terminal signal.
- **Velocity Control** 🚀 — Precision control over agent throughput: `delay_ms = 1000 / velocity`. Speed up (`--velocity 2.0` = 500ms/step) or slow down (`--velocity 0.5` = 2000ms/step) without restarting the session. Agents can self-throttle mid-run.
- **Reliable PTYs** 💻 — Every tool runs in a real pseudoterminal. Exit codes are captured via `waitpid` with `WNOHANG` polling and a `/proc/<pid>/status` fallback — so `zig build`, `pytest`, and `bash` all deliver trustable results to the VERIFY state.
- **Rogue Agent Prevention** 🛡️ — Step limits, heartbeat monitoring (30s), circuit breakers for repeated tool calls, and budget tracking. Stuck agents are killed before they accumulate diverged work.
- **Multi-Model Routing** 🤖 — Anthropic (Claude), OpenAI, and any OpenAI-compatible endpoint (Ollama, [igllama](https://github.com/bkataru/igllama), NVIDIA NIM, Together AI). Fallback chains keep sessions alive through provider outages.
- **MCP Integration** 🔌 — Run as an MCP server (`powerglide mcp`) or connect to external MCP servers as a client. External tools get prefixed names and become first-class tools in the registry.
- **Local LLM Support** 🏠 — Pre-configured `local` and `local4b` agents route to igllama on `:8090`/`:8091`. No API keys required for exploration and triage workflows.
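The velocity formula is simple enough to sanity-check. A minimal Python sketch (the function name is mine; the formula and example values come from the Velocity Control bullet above):

```python
def step_delay_ms(velocity: float) -> float:
    """powerglide's per-step delay: delay_ms = 1000 / velocity (f64 math)."""
    return 1000.0 / velocity

print(step_delay_ms(1.0))  # 1000.0, the default
print(step_delay_ms(2.0))  # 500.0, i.e. --velocity 2.0
print(step_delay_ms(0.5))  # 2000.0, i.e. --velocity 0.5
```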
---

### The Ralph Loop

*(state machine diagram)*
---
## Quick Start
### Prerequisites

- [Zig 0.15.2](https://ziglang.org/download/) — `mise install zig@0.15.2` or official binaries
- An API key for your provider (`ANTHROPIC_API_KEY` or `OPENAI_API_KEY`), **or** run [igllama](https://github.com/bkataru/igllama) locally for a fully offline stack
| `tools` | List and test available tools |
| `mcp` | Start powerglide as an MCP server |
| `tui` | Launch the multi-panel vxfw dashboard |
| `doctor` | Run system health checks |
| `version` | Show version |
---
## MCP Integration

powerglide speaks [Model Context Protocol](https://modelcontextprotocol.io/) natively — as both server and client.
### As an MCP Server

```bash
powerglide mcp
```

Exposes all registered tools via JSON-RPC 2.0 over stdin/stdout. Any MCP-compatible client (Claude Desktop, another powerglide instance) can call powerglide tools.
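The wire format is plain JSON-RPC 2.0, one message per line. A sketch of the client side of the handshake, following the sequence the server implements (`initialize` with `protocolVersion: "2024-11-05"`, then `tools/list`, then `tools/call`); the tool name and arguments are illustrative, not powerglide's actual registry:

```python
import json

# The three requests an MCP client writes to powerglide's stdin, one per line.
initialize = {
    "jsonrpc": "2.0", "id": 1, "method": "initialize",
    "params": {"protocolVersion": "2024-11-05", "capabilities": {}},
}
list_tools = {"jsonrpc": "2.0", "id": 2, "method": "tools/list"}
call_tool = {
    "jsonrpc": "2.0", "id": 3, "method": "tools/call",
    # Tool name and arguments below are made up for illustration.
    "params": {"name": "read_file", "arguments": {"path": "README.md"}},
}

for msg in (initialize, list_tools, call_tool):
    print(json.dumps(msg))
```

The server answers `tools/call` with a `content` array of `{type: "text", text: "..."}` blocks, per the MCP spec.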
### As an MCP Client

Add `mcp_servers` to `~/.config/powerglide/config.json`:
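A minimal sketch of what such an entry might look like. The `command`/`args` shape follows the convention other MCP clients use; the exact field names here are assumptions, not taken from powerglide's documentation:

```json
{
  "mcp_servers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    }
  }
}
```

The server's key (`filesystem`) would then appear as the prefix on its tools.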

External tools register with prefixed names (`mcp_filesystem_read_file`, `mcp_github_search_repositories`) and are indistinguishable from built-in tools to the agent loop.
---

## Local LLM Dogfooding

powerglide ships with igllama integration for fully local inference. See the **[Showcase](https://bkataru.github.io/powerglide/showcase/)** for case studies with Qwen3.5 0.8B and 4B models — including tool-calling triage, session summary analysis, and the honest performance table.

```bash
# Start igllama (Zig-based Ollama alternative)
igllama api Qwen3.5-0.8B-Q8_0.gguf --port 8090 --no-think &
igllama api Qwen3.5-4B-Q8_0.gguf --port 8091 --no-think &

# Doctor detects both automatically
powerglide doctor
# OK igllama: running on :8090 (local agent available)
# OK igllama: running on :8091 (local agent available)
```
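The `json_mode` option added in this release maps to the OpenAI-compatible `response_format` field. A sketch of the request body it produces, aimed at the local endpoint above, assuming igllama serves the standard `/v1/chat/completions` route; the model name is illustrative, and the network call is left commented out since it needs a running server:

```python
import json
import urllib.request

# Chat-completions payload with constrained JSON output, as json_mode sets it.
payload = {
    "model": "Qwen3.5-0.8B-Q8_0",  # illustrative; use whatever igllama loaded
    "messages": [{"role": "user", "content": "Summarize this repo as JSON."}],
    "response_format": {"type": "json_object"},
}

req = urllib.request.Request(
    "http://localhost:8090/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# resp = urllib.request.urlopen(req)  # uncomment with igllama running on :8090
print(payload["response_format"])
```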
---
## Inspiration

powerglide synthesizes the strongest ideas from the AI coding agent ecosystem:

| [igllama](https://github.com/bkataru/igllama) | Local GGUF inference, OpenAI-compatible API for Qwen3.5 |
---
## Documentation

Full docs at **[bkataru.github.io/powerglide](https://bkataru.github.io/powerglide)** — including [Architecture](https://bkataru.github.io/powerglide/architecture/), [CLI Reference](https://bkataru.github.io/powerglide/cli-reference/), [Configuration](https://bkataru.github.io/powerglide/configuration/), and the [Showcase](https://bkataru.github.io/powerglide/showcase/).

For AI agents working with powerglide programmatically, see [AGENTS.md](AGENTS.md).