💡 Feature Description
A clear and concise description of the feature you'd like to see.
Allow Gemini Code Flow (and Gemini CLI) to use OpenRouter or other OpenAI-API-compatible LLM routing tools (LiteLLM is BYOK; Ollama requires a local GPU), so that users are not fully bound to Google's models.
🤔 Problem or Use Case
Is your feature request related to a problem? Please describe.
Gemini is not universally accessible, whether due to regional or financial constraints. Some models (e.g., DeepSeek and Qwen) are cheaper than Gemini and sometimes outperform it (see https://livebench.ai/#/), so supporting alternative providers would let more people use Gemini Code Flow.
💭 Proposed Solution
Possible adoption of the tools mentioned above: OpenRouter, or an OpenAI-API-compatible router such as LiteLLM.
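As a rough sketch of what this could look like: OpenRouter exposes an OpenAI-compatible `/chat/completions` endpoint, so a provider abstraction would mostly mean swapping the base URL and auth header. The snippet below builds (but does not send) such a request using only the standard library; the `OPENROUTER_API_KEY` environment variable and the model name are illustrative assumptions, not settings Gemini Code Flow currently reads.

```python
import json
import os
import urllib.request

# Hypothetical sketch: route a chat request through OpenRouter's
# OpenAI-compatible endpoint instead of the Gemini API.
BASE_URL = "https://openrouter.ai/api/v1"

def build_request(prompt: str, model: str = "deepseek/deepseek-chat") -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat completion request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            # OPENROUTER_API_KEY is an assumed env var for this sketch.
            "Authorization": f"Bearer {os.environ.get('OPENROUTER_API_KEY', '')}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("Hello")
```

Because the request body follows the OpenAI chat-completions schema, the same code path could target LiteLLM or any other compatible proxy by changing `BASE_URL` alone.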
🔄 Alternatives Considered
✨ Additional Context
Similar question: clduab11/gemini-flow#7
🎯 Impact
Who would benefit from this feature?
Thanks for helping make Gemini Code Flow better! 🚀