NexusAI - AI-Powered Customer Support Platform

NexusAI is a modern, microservices-based customer support platform that leverages Artificial Intelligence to streamline ticket management, provide automated solutions, and offer real-time analytics.

🚀 Key Features

  • Intelligent Ticket Management: Create, track, and manage support tickets with a modern UI.
  • AI-Powered Resolutions:
    • Automated Suggestions: The system uses Ollama (Llama3) to analyze ticket descriptions and suggest potential solutions automatically.
    • Vector Search: Uses pgvector for semantic search to find similar past tickets.
  • Real-Time Analytics:
    • Live dashboard showing ticket volume, agent performance, and system health.
    • Powered by MongoDB and optimized for read-heavy operations.
  • Event-Driven Architecture:
    • Uses RabbitMQ for asynchronous communication (e.g., sending notifications when a ticket is created or resolved).
  • Secure Authentication:
    • Integrated with Keycloak for robust identity access management.
  • Resolution Workflow:
    • Agents can resolve tickets with a required "Resolution" description, which helps fine-tune future AI suggestions.
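The vector-search feature above ranks tickets by embedding similarity. As a standalone illustration (not code from this repository), the sketch below computes cosine similarity, the measure that pgvector's cosine-distance operator `<=>` is built on:

```typescript
// Cosine similarity between two embedding vectors. pgvector's `<=>`
// operator returns cosine *distance*, i.e. 1 - this value.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error("dimension mismatch");
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Vectors pointing the same way score 1; orthogonal vectors score 0.
console.log(cosineSimilarity([1, 0], [2, 0])); // 1
console.log(cosineSimilarity([1, 0], [0, 3])); // 0
```

In practice the AI Service would store ticket-description embeddings in a pgvector column and let PostgreSQL do this ranking with an `ORDER BY embedding <=> query` clause; the function above just shows what that comparison measures.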

🛠️ Technology Stack

Backend (Spring Boot Microservices)

  • Discovery Server: Netflix Eureka
  • API Gateway: Spring Cloud Gateway
  • Ticket Service: Main business logic (PostgreSQL)
  • AI Service: LLM integration & Vector Store (pgvector)
  • Analytics Service: Dashboard stats (MongoDB)
  • Notification Service: Email/Alert handling
  • Orchestration: Docker Compose

Frontend

  • Framework: React.js (Vite)
  • Styling: Tailwind CSS
  • Icons: Lucide React
  • Charts: Recharts
  • State Management: React Hooks

Infrastructure & Tools

  • Databases: PostgreSQL (with pgvector), MongoDB
  • Message Broker: RabbitMQ
  • Auth: Keycloak
  • LLM Runtime: Ollama (Running Llama3 locally in Docker)
  • Tracing: Zipkin

📂 Architecture Overview

| Service        | Port | Description                                 |
|----------------|------|---------------------------------------------|
| API Gateway    | 9090 | Unified entry point for the frontend.       |
| Discovery      | 8761 | Service registry.                           |
| Ticket Service | 8081 | Manages ticket CRUD operations.             |
| AI Service     | 8082 | Handles AI inference and vector embeddings. |
| Notification   | 8083 | Listens to events and sends alerts.         |
| Analytics      | 8084 | Aggregates data for the dashboard.          |
| Keycloak       | 8080 | IAM provider.                               |
| Zipkin         | 9411 | Distributed tracing.                        |
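For orientation, the API Gateway's routing to the services above typically looks something like the Spring Cloud Gateway fragment below. The route ids, path patterns, and `lb://` service ids are assumptions for illustration, matched to the table above rather than taken from this repository:

```yaml
# application.yml (API Gateway) -- illustrative sketch
spring:
  cloud:
    gateway:
      routes:
        - id: ticket-service
          uri: lb://TICKET-SERVICE    # resolved via Eureka service discovery
          predicates:
            - Path=/api/tickets/**
        - id: ai-service
          uri: lb://AI-SERVICE
          predicates:
            - Path=/api/ai/**
server:
  port: 9090                          # unified entry point from the table above
```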

🏁 Getting Started

Prerequisites

  • Docker Desktop (must be running, with at least 8GB RAM allocated for Ollama)
  • Node.js (v18+)
  • Java JDK 17+ (optional, if running locally without Docker)

Installation

  1. Clone the Repository

    git clone https://github.com/Start-Up-Arch/NexusAI.git
    cd NexusAI
  2. Start Infrastructure & Backend

     This command spins up all microservices, databases, and the LLM container. Note: the first run may take a while to pull the Llama3 model.

    docker-compose up -d
  3. Start Frontend

     Open a new terminal window:

    cd frontend
    npm install
    npm run dev
  4. Access the Application

     Once everything is up, open the frontend in your browser (Vite's dev server defaults to http://localhost:5173). API requests are routed through the gateway at http://localhost:9090.

📸 Screenshots

Dashboard

Real-time overview of system metrics. (Dashboard screenshot)

Ticket Management

Grid view of open tickets with status filtering. (Add screenshot here)

AI Assistant

Intelligent chat interface for automated support. (AI Assistant screenshot)

Resolution Flow

Modal to enter resolution details when closing a ticket. (Add screenshot here)

🤝 Contributing

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

📄 License

License: MIT
