An AI-powered desktop application that provides intelligent code explanations using the Mistral language model. Select code from any application, press a global shortcut, and receive detailed explanations in a beautiful popup window.
- Global Shortcuts: Trigger code translation from anywhere with `Cmd+Shift+T` (Mac) or `Ctrl+Shift+T` (Windows/Linux)
- AI-Powered Explanations: Uses Ollama with the Mistral model for intelligent code analysis
- Smart Language Detection: Automatically identifies programming languages with confidence scoring
- Multiple Detail Levels: Choose between Beginner, Intermediate, and Expert explanations
- Always-on-Top Toolbar: Minimal toolbar that stays visible for quick access
- Context-Aware: Integrates relevant project files for better explanations
- Beautiful UI: Modern, dark-themed interface with smooth animations
- Cross-Platform: Works on macOS, Windows, and Linux
- Backend: Electron main process with TypeScript
- Frontend: React components with TypeScript
- AI Integration: Ollama HTTP API with Mistral model
- Communication: Secure IPC bridge between processes
- Build System: Webpack + TypeScript compilation
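The secure IPC bridge can be illustrated with a channel whitelist enforced in the preload script. This is a minimal sketch under assumed channel names (`codelens:translate`, `codelens:status`), not the app's actual API:

```typescript
// A minimal sketch of the IPC-bridge idea: the preload script forwards only
// whitelisted channels to the main process. Channel names are assumptions.
const ALLOWED_CHANNELS = new Set(['codelens:translate', 'codelens:status']);

function isAllowedChannel(channel: string): boolean {
  return ALLOWED_CHANNELS.has(channel);
}

// In preload.ts this guard would wrap ipcRenderer, e.g.:
// contextBridge.exposeInMainWorld('codelens', {
//   invoke: (ch: string, data: unknown) =>
//     isAllowedChannel(ch)
//       ? ipcRenderer.invoke(ch, data)
//       : Promise.reject(new Error(`blocked channel: ${ch}`)),
// });
```

Whitelisting channels in the preload keeps the renderer sandboxed: even if page content is compromised, it can only reach the handlers the bridge explicitly exposes.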
Before running this application, ensure you have:
- Node.js (v16 or higher)
- Ollama installed and running locally
- Mistral model pulled (`ollama pull mistral:latest`)
Install Ollama on macOS/Linux:

```bash
curl -fsSL https://ollama.ai/install.sh | sh
```

Or download it from https://ollama.ai/download, then pull the model:

```bash
ollama pull mistral:latest
```
1. Clone the repository:

   ```bash
   git clone <repository-url>
   cd codelens-translator
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Build the application:

   ```bash
   npm run build
   ```

4. Start the application:

   ```bash
   npm start
   ```
For development with hot reloading:

```bash
npm run dev
```

This command will:
- Build the renderer process with webpack
- Compile the main process with TypeScript
- Start Electron in development mode
- Enable hot reloading
```bash
# Build both main and renderer processes
npm run build

# Build only the main process
npm run build:main

# Build only the renderer process
npm run build:renderer

# Package the application
npm run package
```
## Distribution
### Building for Distribution
To create distributable packages for users:
```bash
# Build for all platforms (requires cross-compilation setup)
npm run dist

# Build for a specific platform
npm run dist:mac    # macOS (DMG + ZIP)
npm run dist:win    # Windows (NSIS + Portable)
npm run dist:linux  # Linux (AppImage + DEB)

# Use the automated build script
./scripts/build-dist.sh   # macOS/Linux
scripts\build-dist.bat    # Windows
```

After building, you'll find the packages in the build/ directory:
- macOS: `.dmg` installer and `.zip` archive
- Windows: `.exe` installer and portable `.exe`
- Linux: `.AppImage` and `.deb` package
Users can download and install the app using the INSTALLATION.md guide.
- macOS: Requires code signing for distribution outside App Store
- Windows: May require code signing for SmartScreen compatibility
- Linux: AppImage works on most distributions
- Select code in any application (VS Code, browser, text editor, etc.)
- Press the global shortcut: `Cmd+Shift+T` (Mac) or `Ctrl+Shift+T` (Windows/Linux)
- Wait for AI processing - the app will automatically copy your selection and analyze it
- View the explanation in a popup window with your chosen detail level
- Status Indicator: Shows connection status and current state
- Context Files: Displays number of relevant project files
- Shortcut Info: Shows available keyboard shortcuts
- Detail Level Toggle: Switch between Beginner, Intermediate, and Expert explanations
- Code Viewer: Displays the selected code with syntax highlighting
- AI Explanation: Shows the generated explanation in markdown format
- Resizable Interface: Drag the bottom-right corner to resize the window
| Action | Mac | Windows/Linux |
|---|---|---|
| Translate Code | `Cmd+Shift+T` | `Ctrl+Shift+T` |
| Toggle Toolbar | `Cmd+Shift+H` | `Ctrl+Shift+H` |
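In Electron, each row above is typically registered once with a `CommandOrControl`-prefixed accelerator, which the framework resolves to Cmd on macOS and Ctrl elsewhere. A hypothetical helper (not the app's actual code) for rendering such an accelerator in the UI:

```typescript
// Hypothetical display helper: render an Electron accelerator string the way
// each platform's users expect to read it.
function displayShortcut(accelerator: string, platform: string): string {
  // Electron's 'CommandOrControl' means Cmd on macOS ('darwin'), Ctrl elsewhere
  const modifier = platform === 'darwin' ? 'Cmd' : 'Ctrl';
  return accelerator.replace('CommandOrControl', modifier);
}

displayShortcut('CommandOrControl+Shift+T', 'darwin'); // → 'Cmd+Shift+T'
displayShortcut('CommandOrControl+Shift+T', 'win32');  // → 'Ctrl+Shift+T'
```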
The application connects to Ollama at http://127.0.0.1:11434 by default. You can modify this in `src/main/services/ollama.service.ts`:

```typescript
private readonly baseUrl = 'http://127.0.0.1:11434';
private readonly model = 'mistral:latest';
private readonly timeout = 60000; // 60 seconds
```

Adjust AI model parameters in the Ollama service:

```typescript
options: {
  temperature: 0.7, // Creativity level (0.0 - 1.0)
  top_p: 0.9,       // Nucleus sampling
  max_tokens: 2000  // Maximum response length
}
```

The application uses CSS for styling. Main styles are in the HTML files:
- `src/renderer/toolbar.html` - Toolbar styling
- `src/renderer/explanation.html` - Explanation window styling
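The generation options shown above travel in the JSON body of a POST to Ollama's `/api/generate` endpoint. A sketch of assembling that request; the function name and prompt wording are illustrative assumptions, not the app's actual code:

```typescript
// Sketch: build the JSON body for POST http://127.0.0.1:11434/api/generate.
// The prompt wording and function name are assumptions for illustration.
type DetailLevel = 'Beginner' | 'Intermediate' | 'Expert';

function buildGenerateRequest(code: string, level: DetailLevel) {
  return {
    model: 'mistral:latest',
    prompt: `Explain the following code for a ${level.toLowerCase()} audience:\n\n${code}`,
    stream: false, // ask Ollama for a single JSON response instead of a token stream
    options: { temperature: 0.7, top_p: 0.9 },
  };
}

// Sent with e.g.:
// const res = await fetch('http://127.0.0.1:11434/api/generate', {
//   method: 'POST',
//   body: JSON.stringify(buildGenerateRequest(selectedCode, 'Beginner')),
// });
// const { response } = await res.json(); // the explanation text
```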
Add new programming languages in `src/main/services/code-analysis.service.ts`:

```typescript
patterns.set('your-language', {
  keywords: ['keyword1', 'keyword2'],
  extensions: ['ext1', 'ext2'],
  patterns: [/regex1/, /regex2/]
});
```
- "Ollama Not Running" Error
  - Ensure Ollama is installed and running
  - Check that it's accessible at `http://127.0.0.1:11434`
  - Run `ollama serve` to start the service
- Model Not Found
  - Pull the Mistral model: `ollama pull mistral:latest`
  - Check available models: `ollama list`
- Global Shortcuts Not Working
  - Ensure the application has focus
  - Check system permissions for global shortcuts
  - Restart the application
- Build Errors
  - Clear the `dist/` directory: `rm -rf dist/`
  - Reinstall dependencies: `rm -rf node_modules && npm install`
  - Check the TypeScript configuration
Enable debug logging by setting the environment variable:

```bash
DEBUG=* npm start
```

```
src/
├── main/                      # Electron main process
│   ├── main.ts                # Main application logic
│   ├── preload.ts             # IPC bridge setup
│   └── services/              # AI and analysis services
│       ├── ollama.service.ts
│       └── code-analysis.service.ts
└── renderer/                  # Frontend components
    ├── toolbar.tsx            # Main toolbar component
    ├── explanation.tsx        # Explanation popup component
    ├── toolbar.html           # Toolbar HTML template
    └── explanation.html       # Explanation HTML template
```
- Fork the repository
- Create a feature branch: `git checkout -b feature-name`
- Make your changes
- Add tests if applicable
- Commit your changes: `git commit -am 'Add feature'`
- Push to the branch: `git push origin feature-name`
- Submit a pull request
This project is licensed under the MIT License - see the LICENSE file for details.
- Ollama for providing the local AI inference platform
- Mistral AI for the powerful language model
- Electron for the cross-platform desktop framework
- React for the component-based UI library
If you encounter any issues or have questions:
- Check the troubleshooting section
- Search existing issues
- Create a new issue with detailed information
- Include your operating system, Node.js version, and error logs
Happy coding!