# glm

Here are 73 public repositories matching this topic...

Context-Engine: MCP retrieval stack for AI coding assistants. Hybrid code search (dense + lexical + reranker), ReFRAG micro-chunking, local LLM prompt enhancement, and dual SSE/RMCP endpoints. One command deploys Qdrant-powered indexing for Cursor, Windsurf, Roo, Cline, Codex, and any MCP client.

  • Updated Dec 20, 2025
  • Python
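The "hybrid code search (dense + lexical + reranker)" mentioned in the repository description is a common retrieval pattern: run a dense (embedding) ranker and a lexical (keyword) ranker independently, then merge their result lists. A minimal, self-contained sketch of one standard merging strategy, reciprocal rank fusion (RRF), is below; the document IDs and the `k` constant are illustrative assumptions, not details of Context-Engine itself.

```python
def rrf_fuse(rankings, k=60):
    """Merge several ranked lists with reciprocal rank fusion.

    Each document receives 1 / (k + rank) from every list it appears in,
    so items ranked highly by multiple retrievers float to the top.
    `k` dampens the influence of top ranks; 60 is a conventional default.
    """
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    # Sort document IDs by fused score, highest first.
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical result lists from a dense and a lexical retriever.
dense_hits = ["doc_a", "doc_b", "doc_c"]
lexical_hits = ["doc_b", "doc_a", "doc_d"]
fused = rrf_fuse([dense_hits, lexical_hits])
```

In a full stack like the one described, the fused list would then be passed to a cross-encoder reranker for final ordering; RRF is only the cheap first merge step.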
