This repository contains a complete local AI stack setup using Docker Compose and Ansible. The stack includes several AI-related services, such as Ollama, Open-WebUI, SearxNG, Stable Diffusion, Whisper, and more. The configuration integrates Traefik as a reverse proxy and Let's Encrypt for SSL/TLS certificates.
On macOS, use compose-mac.yaml for a limited experience (Ollama must run separately, and Stable Diffusion is not supported at the moment).
Huge thanks to Techno Tim; his video Self-Hosted AI That's Actually Useful and the accompanying tutorial inspired me to finally gather all these services into one Docker Compose stack.
- Docker
- Docker Compose
- Ansible
- NVIDIA GPU and drivers
The Ansible playbook installs Docker and the NVIDIA software and drivers.
1. Install Ansible:

   ```bash
   sudo apt update
   sudo apt install ansible -y
   ```

2. Clone the repository:

   ```bash
   git clone https://github.com/DmitryBoiadji/ai-stack.git
   cd ai-stack
   ```

3. Edit the `group_vars/all.yml` file with your configuration.

4. Configure your `hosts` file.

5. Run the Ansible playbook:

   ```bash
   ansible-playbook -i hosts playbook.yml
   ```
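The inventory file passed via `-i hosts` could look like the following minimal sketch; the group name, address, and user below are placeholders, not values from this repository:

```ini
# Hypothetical Ansible inventory; replace with your own host(s).
[ai_stack]
192.168.1.50 ansible_user=ubuntu
```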
1. Create an `.env` file:

   ```bash
   cp .env.example .env
   ```

2. Modify the `.env` file with your configuration.

3. Start the Docker Compose stack:

   ```bash
   docker-compose up -d
   ```
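The variable names below are hypothetical examples only; the authoritative list of settings lives in `.env.example`:

```env
# Illustrative values only; see .env.example for the real variable names.
APP_DOMAIN=example.com
LETSENCRYPT_EMAIL=admin@example.com
```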
For more information, please take a look at Techno Tim's tutorial.
The stack includes the following services:
- Ollama: A platform for running and deploying language models locally.
- Open-WebUI: A web interface for AI models.
- SearxNG: A privacy-respecting metasearch engine.
- Stable Diffusion: A deep-learning text-to-image model.
- Whisper: An AI-powered transcription and translation service.
- MongoDB: A NoSQL database for storing Whisper data.
- LibreTranslate: An open-source machine translation API.
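Once the stack is up, Ollama can also be called directly over its HTTP API (port 11434 by default). The sketch below only builds a request for the `/api/generate` endpoint using the standard library; the host, model name, and prompt are placeholder assumptions:

```python
import json
import urllib.request

def build_generate_request(host: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for a single JSON response instead of a stream
    }
    return urllib.request.Request(
        url=f"http://{host}:11434/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Placeholder host/model/prompt; sending it requires a running Ollama instance:
req = build_generate_request("localhost", "llama3", "Why is the sky blue?")
# urllib.request.urlopen(req) would perform the actual call.
```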
- Access Open-WebUI at `https://chat.your_app_domain`
- Access SearxNG at `http://searxng:8080/search?q=<query>` (local Docker network only)
- Access Stable Diffusion at `https://sd.your_app_domain`
- Access Whisper at `https://whisper.your_app_domain`
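SearxNG can also return machine-readable results when JSON output is enabled in its settings. A query URL for the in-network endpoint above can be built like this (the query string is just an example):

```python
from urllib.parse import urlencode

def searxng_url(query: str, base: str = "http://searxng:8080/search") -> str:
    """Build a SearxNG search URL; format=json requires JSON output to be enabled."""
    return f"{base}?{urlencode({'q': query, 'format': 'json'})}"

print(searxng_url("local ai stack"))
# → http://searxng:8080/search?q=local+ai+stack&format=json
```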
`update_models.sh` - Run inside the Ollama Docker container to update all pulled models.
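The repository's actual script may differ, but an update loop along these lines is one plausible sketch; it assumes the `ollama` CLI is available inside the container:

```shell
#!/bin/sh
# Hypothetical sketch of an update_models.sh-style loop; the real script may differ.

# Extract model names from `ollama list` output (skip the header row, keep column 1).
model_names() {
  tail -n +2 | awk '{print $1}'
}

# Re-pull the latest version of every installed model.
update_all_models() {
  ollama list | model_names | while read -r model; do
    echo "Updating $model"
    ollama pull "$model"
  done
}
```

Invoking `update_all_models` inside the Ollama container would then refresh each listed model in turn.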
Contributions are welcome! Please fork the repository and create a pull request.
This project is licensed under the MIT License. See the LICENSE file for details.