Enterprise AI Chat Interface (Chainlit Frontend)


📋 Executive Summary

This repository contains a robust, asynchronous chat interface built with Chainlit. It is designed to serve as a secure frontend orchestrator between end-users and a proprietary AI Middleware.

The solution is engineered to handle Customer Service workflows (e.g., Order Tracking, Returns, General Inquiries) with a focus on high concurrency, state management, and user experience (UX).

Key Features

  • ⚡ Asynchronous Architecture: Uses httpx and asyncio for non-blocking I/O, delivering higher throughput and lower latency than traditional synchronous requests (see the sketch after this list).
  • 🧠 Session State Management: Keeps a persistent conversation history in the user session, allowing the AI to maintain context (memory) across multiple interactions.
  • 🔒 Security by Design: Strict separation of configuration and code via environment variables. No sensitive URLs or API keys are hardcoded.
  • ✨ Optimistic UI & UX: Transitional states ("Thinking...", "Connecting to Specialist...") manage user expectations and reduce perceived latency.
  • ⚙️ Configurable Persona: The bot's name, role, and welcome message are fully customizable via environment variables without touching the codebase.
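A rough sketch of how the first two features typically fit together in a Chainlit message handler is shown below. It assumes the request and response payload fields ({message, history} out, {response, category} back) pictured in the diagrams that follow; the bearer-token header, variable names, and history entry format are illustrative assumptions, not the exact contents of app.py.

import os

import chainlit as cl
import httpx

# Assumed variable names; the real app reads these through its Config class (see below).
BACKEND_API_URL = os.environ["BACKEND_API_URL"]
BACKEND_API_SECRET = os.environ["BACKEND_API_SECRET"]

@cl.on_message
async def on_message(message: cl.Message):
    # Pull the running conversation history out of the per-user session.
    history = cl.user_session.get("history", [])
    history.append({"role": "user", "content": message.content})

    # Non-blocking POST: the event loop stays free to serve other sessions.
    async with httpx.AsyncClient(timeout=45.0) as client:
        response = await client.post(
            BACKEND_API_URL,
            headers={"Authorization": f"Bearer {BACKEND_API_SECRET}"},
            json={"message": message.content, "history": history},
        )
        response.raise_for_status()
        data = response.json()

    # Persist the assistant turn so the next request carries full context.
    history.append({"role": "assistant", "content": data["response"]})
    cl.user_session.set("history", history)

    await cl.Message(content=data["response"]).send()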

📊 System Diagrams

High-Level Architecture

The application acts as a stateless frontend layer that forwards user intent and history to a backend logic tier.

%%{init: {'theme': 'base', 'themeVariables': { 'primaryColor': '#ffffff', 'edgeLabelBackground':'#ffffff', 'tertiaryColor': '#f4f4f4'}}}%%
graph LR
    %% Style definitions (classes)
    classDef person fill:#007bff,stroke:#0056b3,stroke-width:2px,color:#fff,rx:10,ry:10;
    classDef frontend fill:#e3f2fd,stroke:#2196f3,stroke-width:2px,color:#0d47a1,rx:5,ry:5;
    classDef backend fill:#fff3e0,stroke:#ff9800,stroke-width:2px,color:#e65100,rx:5,ry:5;
    classDef db fill:#f3e5f5,stroke:#9c27b0,stroke-width:2px,stroke-dasharray: 5 5;

    %% Nodes
    User("👤 End User"):::person
    
    subgraph ClientSide ["πŸ–₯️ Frontend Layer"]
        direction TB
        CL["Chainlit App"]:::frontend
        SS[("Session Storage")]:::db
    end

    subgraph ServerSide ["☁️ Backend Infrastructure"]
        direction TB
        MW["🧠 AI Middleware"]:::backend
        AI["πŸ€– AI Models"]:::backend
    end

    %% Relationships
    User <==>|WebSocket| CL
    CL <==>|Async HTTP/JSON| MW
    CL -.->|Persistence| SS
    MW <--> AI
    
    %% Invisible link to align subgraphs if needed
    ClientSide ~~~ ServerSide

Message Flow Sequence

sequenceDiagram
    autonumber
    participant U as 👤 User
    participant CL as 🖥️ Chainlit
    participant S as 📦 Session
    participant MW as 🧠 Middleware

    rect rgb(230, 245, 255)
        Note over CL: Chat Initialization
        CL->>CL: Validate ENV Variables
        alt Missing Config
            CL-->>U: ⚠️ System Error
        else Config Valid
            CL->>S: Initialize Empty History
            CL-->>U: 👋 Welcome Message
        end
    end

    rect rgb(255, 243, 224)
        Note over U,MW: Message Processing Loop
        U->>CL: Send Message
        CL->>S: Store User Message
        CL-->>U: 🧠 "Thinking..."
        
        CL->>MW: POST {message, history}
        
        alt HTTP 200 OK
            MW-->>CL: {response, category}
            
            alt Specialized Category
                CL-->>U: 🔄 "Connecting to Specialist..."
                CL->>CL: ⏳ Wait 2s (UX Delay)
            else General Category
                CL->>CL: Remove Status Message
            end
            
            CL-->>U: 💬 AI Response
            CL->>S: Store Assistant Response
            
        else HTTP Error
            CL-->>U: ⚠️ Service Unavailable
        else Timeout (45s)
            CL-->>U: ⚠️ Timeout Error
        else Connection Failed
            CL-->>U: ⚠️ Connection Error
        end
    end
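The "Connecting to Specialist..." hand-off shown above boils down to updating a single status message in place and pausing briefly before the real answer is delivered. The helper below is an illustrative sketch, not the literal code in app.py: the category names and the 2-second pause come from the diagrams, and the exact Chainlit calls may differ slightly between versions.

import asyncio

import chainlit as cl

# Categories that trigger the hand-off animation (mirrors the diagrams, not a definitive list).
SPECIALIZED_CATEGORIES = {"OrderTracking", "Returns", "ProductInquiry"}

async def present_response(status: cl.Message, category: str, answer: str) -> None:
    """status is the '🧠 Thinking...' message already shown to the user."""
    if category in SPECIALIZED_CATEGORIES:
        # Swap the status text, then hold it so the transition feels deliberate.
        status.content = "🔄 Connecting to Specialist..."
        await status.update()
        await asyncio.sleep(2)  # UX delay from the sequence diagram

    # Remove the transient status and deliver the actual answer.
    await status.remove()
    await cl.Message(content=answer).send()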

Request Processing Flowchart

flowchart TD
    A([📨 User Message Received]) --> B[Update Session History<br/>with User Message]
    B --> C[Display Status:<br/>'🧠 Thinking...']
    C --> D[Prepare JSON Payload<br/>& Auth Headers]
    D --> E{Async POST<br/>to Backend API}
    
    E -->|✅ Status 200| F[Parse JSON Response]
    E -->|❌ Status 4xx/5xx| G[/"⚠️ Service Unavailable"/]
    E -->|⏱️ Timeout 45s| H[/"⚠️ Timeout Error"/]
    E -->|🔌 Network Error| I[/"⚠️ Connection Error"/]
    
    F --> J{Check Response<br/>Category}
    
    J -->|Specialized<br/>Orders/Returns/etc| K[Update Status:<br/>'🔄 Connecting to Specialist...']
    J -->|General or<br/>AccountProfileOther| L[Remove Status<br/>Message]
    
    K --> M[⏳ Transition Delay<br/>2 seconds]
    M --> N
    L --> N[Send AI Response<br/>to User]
    
    N --> O[Update Session History<br/>with Assistant Response]
    O --> P([✅ Ready for Next Message])
    
    G --> P
    H --> P
    I --> P

    style A fill:#e3f2fd
    style P fill:#c8e6c9
    style G fill:#ffcdd2
    style H fill:#ffcdd2
    style I fill:#ffcdd2

Application State Diagram

stateDiagram-v2
    [*] --> Initializing: on_chat_start

    Initializing --> ConfigError: Missing ENV Variables
    Initializing --> Ready: ✅ Config Valid

    ConfigError --> [*]: Session Ends
    
    Ready --> Processing: User Message Received
    
    Processing --> Specialist: Category = Specialized
    Processing --> GeneralResponse: Category = General
    Processing --> ErrorState: API Failure
    
    Specialist --> TransitionDelay: Show "Connecting..."
    TransitionDelay --> Responding: After 2s
    
    GeneralResponse --> Responding: Immediate
    
    Responding --> Ready: ✅ Response Sent
    
    ErrorState --> Ready: ⚠️ Error Displayed

    note right of Processing
        Async HTTP Request
        with 45s Timeout
    end note

    note right of Specialist
        Categories like:
        - OrderTracking
        - Returns
        - ProductInquiry
    end note

Component Interaction

graph TB
    subgraph "Environment Configuration"
        ENV[".env File"]
        ENV --> |BACKEND_API_URL| CFG
        ENV --> |BACKEND_API_SECRET| CFG
        ENV --> |BOT_NAME| CFG
        ENV --> |BOT_ROLE| CFG
        CFG["Config Class"]
    end

    subgraph "Chainlit Application"
        CFG --> APP["app.py"]
        
        subgraph "Event Handlers"
            START["@on_chat_start"]
            MSG["@on_message"]
        end
        
        subgraph "State Functions"
            GET["get_history()"]
            UPD["update_history()"]
        end
        
        APP --> START
        APP --> MSG
        MSG --> GET
        MSG --> UPD
    end

    subgraph "Session Layer"
        SESS["cl.user_session"]
        GET <--> SESS
        UPD --> SESS
    end

    subgraph "External Communication"
        HTTP["httpx.AsyncClient"]
        MSG --> HTTP
        HTTP --> |POST| API["Backend API"]
        API --> |JSON| HTTP
    end

    style ENV fill:#fff3e0
    style CFG fill:#e8f5e9
    style APP fill:#e3f2fd
    style SESS fill:#f3e5f5
    style API fill:#fce4ec
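A rough sketch of the Config pattern and the @on_chat_start validation shown in this diagram follows. Attribute and handler names mirror the diagram; the defaults and message wording are assumptions rather than the literal contents of app.py.

import os

import chainlit as cl

class Config:
    """Single place for every value read from the environment (.env in development)."""
    BACKEND_API_URL = os.getenv("BACKEND_API_URL", "")
    BACKEND_API_SECRET = os.getenv("BACKEND_API_SECRET", "")
    BOT_NAME = os.getenv("BOT_NAME", "Assistant")
    BOT_ROLE = os.getenv("BOT_ROLE", "Support Specialist")

    @classmethod
    def is_valid(cls) -> bool:
        # The two backend values are mandatory; the persona fields have safe defaults.
        return bool(cls.BACKEND_API_URL and cls.BACKEND_API_SECRET)

@cl.on_chat_start
async def on_chat_start():
    if not Config.is_valid():
        await cl.Message(content="⚠️ System Error: missing configuration.").send()
        return

    # Fresh, empty history for this session.
    cl.user_session.set("history", [])
    await cl.Message(
        content=f"👋 Hi, I'm {Config.BOT_NAME}, your {Config.BOT_ROLE}. How can I help you today?"
    ).send()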

Error Handling Flow

flowchart LR
    subgraph "Request Phase"
        REQ[HTTP Request] --> |Try| SEND[Send to Backend]
    end
    
    subgraph "Response Handling"
        SEND --> |200| OK[✅ Process Response]
        SEND --> |4xx/5xx| ERR1[❌ HTTP Error]
        SEND --> |TimeoutException| ERR2[⏱️ Timeout]
        SEND --> |Exception| ERR3[🔌 Connection Error]
    end
    
    subgraph "User Feedback"
        OK --> |Success| RESP[💬 Show AI Response]
        ERR1 --> |Log + Display| MSG1["⚠️ Service Unavailable"]
        ERR2 --> |Log + Display| MSG2["⚠️ Timeout Message"]
        ERR3 --> |Log + Display| MSG3["⚠️ Connection Error"]
    end

    style OK fill:#c8e6c9
    style ERR1 fill:#ffcdd2
    style ERR2 fill:#ffcdd2
    style ERR3 fill:#ffcdd2
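Translated into code, the branches above map naturally onto httpx's exception hierarchy. The helper below is a sketch of that pattern, not the literal body of app.py; the user-facing strings follow the flowchart, and the helper name is hypothetical.

import logging

import httpx

logger = logging.getLogger(__name__)

async def call_backend(client: httpx.AsyncClient, url: str, payload: dict) -> str:
    """Return the assistant text, or a user-facing error message on failure."""
    try:
        response = await client.post(url, json=payload, timeout=45.0)
        response.raise_for_status()            # 4xx/5xx raises HTTPStatusError
        return response.json()["response"]     # ✅ process response
    except httpx.TimeoutException:
        logger.warning("Backend call timed out after 45s")
        return "⚠️ Timeout Error: the service took too long to respond."
    except httpx.HTTPStatusError as exc:
        logger.error("Backend returned HTTP %s", exc.response.status_code)
        return "⚠️ Service Unavailable: please try again later."
    except Exception:
        logger.exception("Could not reach the backend")
        return "⚠️ Connection Error: please check your network and retry."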

🚀 Getting Started

Prerequisites

  • Python 3.9 or higher
  • pip (Python Package Manager)
  • A running instance of the AI Middleware (Backend)

1. Installation

Clone the repository and navigate to the project directory:

git clone https://github.com/hitthecodelabs/EnterpriseAIChat-Chainlit.git
cd EnterpriseAIChat-Chainlit

Create a virtual environment:

# macOS/Linux
python3 -m venv venv
source venv/bin/activate

# Windows
python -m venv venv
.\venv\Scripts\activate

Install dependencies:

pip install -r requirements.txt
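The repository's requirements.txt is the source of truth, but for orientation it needs to cover at least Chainlit and an async HTTP client. A hypothetical pin set might look like the following (versions are illustrative; python-dotenv is listed only on the assumption that app.py loads .env itself):

chainlit>=1.0.0
httpx>=0.27.0
python-dotenv>=1.0.0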

2. Configuration

Create a .env file in the root directory. You can use the provided example as a template:

cp .env.example .env

Required Variables:

| Variable           | Description                             | Example                          |
|--------------------|-----------------------------------------|----------------------------------|
| BACKEND_API_URL    | Endpoint of your AI Middleware          | https://api.yourcompany.com/chat |
| BACKEND_API_SECRET | Secure key to authenticate requests     | sk_prod_12345...                 |
| BOT_NAME           | Name displayed to the user              | Maria S.                         |
| BOT_ROLE           | Role description in welcome message     | Support Specialist               |
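For reference, a filled-in .env might look like the following (values are placeholders; never commit a real secret):

BACKEND_API_URL=https://api.yourcompany.com/chat
BACKEND_API_SECRET=sk_prod_12345...
BOT_NAME=Maria S.
BOT_ROLE=Support Specialist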

3. Usage

Run the application locally with hot-reloading enabled:

chainlit run app.py -w

The interface will be available at: http://localhost:8000.

🛠 Project Structure

├── .env.example       # Template for environment variables (Safe to commit)
├── .gitignore         # Ignores .env and venv (Security best practice)
├── app.py             # Main application logic (Async & Stateful)
├── chainlit.md        # Chainlit welcome markdown (Optional)
├── requirements.txt   # Project dependencies
└── README.md          # Project documentation

📦 Deployment (Production)

For production environments (e.g., Railway, AWS ECS, Docker), ensure the start command is configured to run headless:

chainlit run app.py --host 0.0.0.0 --port 8000 --headless

Docker Support

This project is Docker-ready. Ensure you pass the environment variables defined in .env securely through your container orchestration platform's secrets manager.
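For example, assuming you have built an image for this app (the image tag below is hypothetical), a local smoke test can reuse the development .env, while production deployments should inject the same variables from the platform's secrets manager instead:

docker run -p 8000:8000 --env-file .env enterprise-ai-chat:latest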

🛡 Disclaimer

This software is provided "as is", without warranty of any kind. It is intended as a frontend template and requires a functioning backend API to process natural language queries.

📝 Consultant Notes for Implementation

  • Customization: Edit the Config class in app.py or update your .env file to change the bot's behavior.
  • Logging: The application uses Python's standard logging library. In a production environment, ensure these logs are piped to a monitoring tool like Datadog or CloudWatch (a minimal configuration sketch follows below).
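A minimal logging configuration along those lines might look like the snippet below (the format and level are suggestions, not necessarily what app.py ships with):

import logging

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s %(message)s",
)
logger = logging.getLogger("enterprise-ai-chat")
logger.info("Chainlit frontend started")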
