It seems that the Ollama client is hardcoded and ignores CONTEXTPLUS_EMBED_PROVIDER, CONTEXTPLUS_OPENAI_BASE_URL, CONTEXTPLUS_OPENAI_API_KEY, and CONTEXTPLUS_OPENAI_EMBED_MODEL entirely — making it impossible to use an OpenAI-compatible embedding server as documented in the MCP config.
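For reference, the expected behavior would be something like the sketch below: read `CONTEXTPLUS_EMBED_PROVIDER` and, when it is set to an OpenAI-compatible provider, build the embedding client from the other three variables instead of unconditionally using Ollama. The function name, defaults, and return shape here are hypothetical illustrations, not actual project internals; only the environment variable names come from this report.

```python
import os

def resolve_embed_provider():
    """Hypothetical sketch: choose the embedding backend from env vars.

    Variable names are from this report; everything else (function name,
    fallback defaults, dict shape) is an assumption for illustration.
    """
    provider = os.environ.get("CONTEXTPLUS_EMBED_PROVIDER", "ollama").lower()
    if provider == "openai":
        return {
            "provider": "openai",
            "base_url": os.environ.get(
                "CONTEXTPLUS_OPENAI_BASE_URL", "https://api.openai.com/v1"
            ),
            "api_key": os.environ.get("CONTEXTPLUS_OPENAI_API_KEY", ""),
            "model": os.environ.get(
                "CONTEXTPLUS_OPENAI_EMBED_MODEL", "text-embedding-3-small"
            ),
        }
    # Current (reported) behavior: everything falls through to Ollama.
    return {"provider": "ollama"}
```

With the hardcoded client, the `openai` branch above is effectively unreachable, which matches the symptom described: the four variables are read nowhere, so an OpenAI-compatible server configured in the MCP config is silently ignored.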