Ollama GUI

A modern, user-friendly web interface for interacting with locally installed LLMs via Ollama. This project provides an intuitive chat interface for communicating with language models running on your own machine.

Features

  • 🚀 Clean, modern chat interface
  • 💬 Real-time streaming responses
  • 🔄 Dynamic model switching
  • 🎨 Markdown and code syntax highlighting
  • 📱 Responsive design for all devices
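Dynamic model switching works because Ollama exposes a local REST endpoint, `GET /api/tags`, that lists every installed model. As a rough sketch (the actual client code in this repo may differ), a dropdown can be populated by pulling the `name` field out of each entry:

```javascript
// Sketch: populate the model dropdown from Ollama's /api/tags response.
// The endpoint returns JSON shaped like {"models":[{"name":"..."},...]}.
function modelNames(tagsJson) {
  return JSON.parse(tagsJson).models.map((m) => m.name);
}

// Example payload in the shape Ollama returns:
const sample = '{"models":[{"name":"gemma3:4b"},{"name":"llama3.2:3b"}]}';
console.log(modelNames(sample)); // logs the two model names
```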

Prerequisites

  1. Node.js (v18 or higher)
  2. Ollama - for running LLM models locally

Setup

  1. Install Ollama and download your preferred models:

    Install Ollama from https://ollama.com/ or run the following command:

    curl -fsSL https://ollama.com/install.sh | sh

    Then pull your desired models, for example:

    ollama pull gemma3:4b
  2. Clone and set up the project:

    git clone git@github.com:khokonm/ollama-gui.git
    cd ollama-gui
    npm install
  3. Start the development server:

    npm run dev
  4. Open http://localhost:3000 in your browser
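With both servers up, the UI talks to Ollama's local API (port 11434 by default). As a hedged sketch of the request the chat page sends, assuming Ollama's documented `/api/chat` contract (the exact wiring in this project may differ):

```javascript
// Sketch: build the body for a POST to http://localhost:11434/api/chat.
// Field names follow Ollama's chat API; "stream": true asks for
// token-by-token chunks instead of one final response.
function buildChatRequest(model, history, userText) {
  return {
    model, // e.g. "gemma3:4b"
    messages: [...history, { role: "user", content: userText }],
    stream: true,
  };
}

const body = buildChatRequest("gemma3:4b", [], "Hello!");
console.log(JSON.stringify(body));
```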

Usage

  1. Select your preferred model from the dropdown menu in the header
  2. Type your message in the input field
  3. Press Enter or click the Send button to chat with the model
  4. The model's responses stream in real time, with Markdown and code blocks rendered as they arrive
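Step 4 works because Ollama streams its reply as newline-delimited JSON: each line is a chunk carrying a slice of the assistant message, and the final chunk is marked "done": true. A minimal sketch of reassembling those chunks (chunk shape per Ollama's streaming format; the project's own parser may differ):

```javascript
// Sketch: fold Ollama's NDJSON streaming chunks into one reply string.
// Each non-empty line is a JSON object with the partial text at
// chunk.message.content.
function assembleStream(ndjson) {
  return ndjson
    .split("\n")
    .filter((line) => line.trim() !== "")
    .map((line) => JSON.parse(line).message?.content ?? "")
    .join("");
}

// Two example chunks in the shape Ollama emits:
const sample = [
  '{"message":{"role":"assistant","content":"Hel"},"done":false}',
  '{"message":{"role":"assistant","content":"lo!"},"done":true}',
].join("\n");

console.log(assembleStream(sample)); // Hello!
```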

Development

This project is built with Node.js and npm; see `package.json` for the framework and full dependency list.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT
