# NexusAI

NexusAI is a modern, microservices-based customer support platform that leverages artificial intelligence to streamline ticket management, provide automated solutions, and offer real-time analytics.
## Features

- Intelligent Ticket Management: Create, track, and manage support tickets with a modern UI.
- AI-Powered Resolutions:
  - Automated Suggestions: The system uses Ollama (Llama3) to analyze ticket descriptions and automatically suggest potential solutions.
  - Vector Search: Uses `pgvector` for semantic search to find similar past tickets.
- Real-Time Analytics:
  - Live dashboard showing ticket volume, agent performance, and system health.
  - Powered by MongoDB and optimized for read-heavy operations.
- Event-Driven Architecture:
  - Uses RabbitMQ for asynchronous communication (e.g., sending notifications when a ticket is created or resolved).
- Secure Authentication:
  - Integrated with Keycloak for robust identity and access management.
- Resolution Workflow:
  - Agents resolve tickets with a required "Resolution" description, which helps fine-tune future AI suggestions.
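To make the vector-search idea above concrete, here is a minimal, illustrative sketch (not NexusAI's actual code) of how cosine similarity ranks past tickets by semantic closeness. This is conceptually what `pgvector`'s cosine-distance operator does at the database level; the toy 3-dimensional "embeddings" stand in for the hundreds of dimensions a real embedding model produces.

```python
# Illustrative sketch: ranking past tickets by cosine similarity of embeddings.
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy embeddings for past tickets (hypothetical data).
past_tickets = {
    "Password reset fails": [0.90, 0.10, 0.00],
    "Invoice PDF corrupted": [0.10, 0.80, 0.30],
    "Cannot log in": [0.85, 0.15, 0.05],
}
query = [0.88, 0.12, 0.02]  # embedding of the new ticket's description

# Sort past tickets by similarity to the query, most similar first.
ranked = sorted(past_tickets,
                key=lambda t: cosine_similarity(query, past_tickets[t]),
                reverse=True)
print(ranked[0])
```

In the real system the same ranking happens inside PostgreSQL, so the AI Service can fetch the top-k similar tickets with a single query rather than scanning embeddings in application code.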
## Architecture

### Backend Services

- Discovery Server: Netflix Eureka
- API Gateway: Spring Cloud Gateway
- Ticket Service: Main business logic (PostgreSQL)
- AI Service: LLM integration & vector store (`pgvector`)
- Analytics Service: Dashboard stats (MongoDB)
- Notification Service: Email/alert handling
- Orchestration: Docker Compose
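The event-driven flow between these services (the Ticket Service publishes events, the Notification Service consumes them) can be sketched as follows. This is an illustrative, in-memory stand-in for RabbitMQ, not the actual services; the routing keys `ticket.created` and `ticket.resolved` are hypothetical names chosen for the example.

```python
# Illustrative sketch: publish/subscribe decoupling, with an in-memory
# broker standing in for RabbitMQ so the example runs on its own.
from collections import defaultdict

class InMemoryBroker:
    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, routing_key: str, handler) -> None:
        self.handlers[routing_key].append(handler)

    def publish(self, routing_key: str, payload: dict) -> None:
        # Deliver the event to every handler bound to this routing key.
        for handler in self.handlers[routing_key]:
            handler(payload)

broker = InMemoryBroker()
sent_alerts = []

# "Notification Service": listens to ticket events and sends alerts.
broker.subscribe("ticket.created", lambda e: sent_alerts.append(f"New ticket #{e['id']}"))
broker.subscribe("ticket.resolved", lambda e: sent_alerts.append(f"Ticket #{e['id']} resolved"))

# "Ticket Service": publishes events instead of calling other services directly.
broker.publish("ticket.created", {"id": 42, "title": "Cannot log in"})
broker.publish("ticket.resolved", {"id": 42})
print(sent_alerts)
```

The design benefit is the same as with the real broker: the Ticket Service does not need to know which services react to an event, so new consumers (e.g., the Analytics Service) can be added without touching the publisher.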
### Frontend

- Framework: React.js (Vite)
- Styling: Tailwind CSS
- Icons: Lucide React
- Charts: Recharts
- State Management: React Hooks
### Infrastructure

- Databases: PostgreSQL (with `pgvector`), MongoDB
- Message Broker: RabbitMQ
- Auth: Keycloak
- LLM Runtime: Ollama (running Llama3 locally in Docker)
- Tracing: Zipkin
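Since the stack runs Llama3 through Ollama, the AI Service talks to Ollama's local REST API (`POST /api/generate` on port 11434 by default). The sketch below builds the kind of request body that call expects; the prompt wording and helper name are hypothetical, not NexusAI's actual prompt or code.

```python
# Illustrative sketch: constructing a request body for Ollama's
# /api/generate endpoint. Only the payload is built here; no HTTP
# call is made, so the example runs without a live Ollama container.
import json

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_suggestion_request(ticket_description: str) -> str:
    payload = {
        "model": "llama3",
        "prompt": f"Suggest a resolution for this support ticket:\n{ticket_description}",
        "stream": False,  # ask for one complete JSON response instead of chunks
    }
    return json.dumps(payload)

body = build_suggestion_request("User cannot log in after password reset")
print(body)
# Sending it would be e.g.:
#   urllib.request.urlopen(OLLAMA_URL, body.encode())
```

With `"stream": False`, Ollama returns the whole completion in a single JSON object, which is simpler for a request/response service than consuming the default streamed chunks.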
## Service Ports

| Service | Port | Description |
|---|---|---|
| API Gateway | 9090 | Unified entry point for the frontend. |
| Discovery | 8761 | Service registry. |
| Ticket Service | 8081 | Manages ticket CRUD operations. |
| AI Service | 8082 | Handles AI inference and vector embeddings. |
| Notification | 8083 | Listens to events and sends alerts. |
| Analytics | 8084 | Aggregates data for the dashboard. |
| Keycloak | 8080 | IAM provider. |
| Zipkin | 9411 | Distributed tracing. |
## Prerequisites

- Docker Desktop (must be running, with at least 8GB RAM allocated for Ollama)
- Node.js (v18+)
- Java JDK 17+ (optional, if running locally without Docker)
## Getting Started

1. **Clone the Repository**

   ```shell
   git clone https://github.com/Start-Up-Arch/NexusAI.git
   cd NexusAI
   ```

2. **Start Infrastructure & Backend**

   This command spins up all microservices, databases, and the LLM container. Note: the first run may take a while to pull the Llama3 model.

   ```shell
   docker-compose up -d
   ```

3. **Start Frontend**

   Open a new terminal window:

   ```shell
   cd frontend
   npm install
   npm run dev
   ```

4. **Access the Application**

   - Frontend: http://localhost:5173 (or the port shown in the terminal)
   - Eureka Dashboard: http://localhost:8761
   - Zipkin Tracing: http://localhost:9411
## Screenshots

- Real-time overview of system metrics.
- Grid view of open tickets with status filtering. *(Add screenshot here)*
- Intelligent chat interface for automated support.
- Modal to enter resolution details when closing a ticket. *(Add screenshot here)*
## Contributing

- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
License: MIT