This repository contains a robust, asynchronous chat interface built with Chainlit. It is designed to serve as a secure frontend orchestrator between end-users and a proprietary AI Middleware.
The solution is engineered to handle Customer Service workflows (e.g., Order Tracking, Returns, General Inquiries) with a focus on high concurrency, state management, and user experience (UX).
- ⚡ Asynchronous Architecture: Utilizes `httpx` and `asyncio` for non-blocking I/O, ensuring high throughput and low latency compared to traditional synchronous requests.
- 🧠 Session State Management: Implements persistent conversation history within the user session, allowing the AI to maintain context (memory) across multiple interactions.
- 🔐 Security by Design: Strict separation of configuration and code via environment variables. No sensitive URLs or API keys are hardcoded.
- ✨ Optimistic UI & UX: Features transitional states ("Thinking...", "Connecting to Specialist...") to manage user expectations and reduce perceived latency.
- ⚙️ Configurable Persona: The bot's name, role, and welcome message are fully customizable via environment variables without touching the codebase.
The application acts as a stateless frontend layer that forwards user intent and history to a backend logic tier.
```mermaid
%%{init: {'theme': 'base', 'themeVariables': { 'primaryColor': '#ffffff', 'edgeLabelBackground':'#ffffff', 'tertiaryColor': '#f4f4f4'}}}%%
graph LR
    %% Style definitions (classes)
    classDef person fill:#007bff,stroke:#0056b3,stroke-width:2px,color:#fff,rx:10,ry:10;
    classDef frontend fill:#e3f2fd,stroke:#2196f3,stroke-width:2px,color:#0d47a1,rx:5,ry:5;
    classDef backend fill:#fff3e0,stroke:#ff9800,stroke-width:2px,color:#e65100,rx:5,ry:5;
    classDef db fill:#f3e5f5,stroke:#9c27b0,stroke-width:2px,stroke-dasharray: 5 5;

    %% Nodes
    User("👤 End User"):::person

    subgraph ClientSide ["🖥️ Frontend Layer"]
        direction TB
        CL["Chainlit App"]:::frontend
        SS[("Session Storage")]:::db
    end

    subgraph ServerSide ["⚙️ Backend Infrastructure"]
        direction TB
        MW["🧠 AI Middleware"]:::backend
        AI["🤖 AI Models"]:::backend
    end

    %% Relationships
    User <==>|WebSocket| CL
    CL <==>|Async HTTP/JSON| MW
    CL -.->|Persistence| SS
    MW <--> AI

    %% Invisible link to align the subgraphs if needed
    ClientSide ~~~ ServerSide
```
```mermaid
sequenceDiagram
    autonumber
    participant U as 👤 User
    participant CL as 🖥️ Chainlit
    participant S as 📦 Session
    participant MW as 🧠 Middleware

    rect rgb(230, 245, 255)
        Note over CL: Chat Initialization
        CL->>CL: Validate ENV Variables
        alt Missing Config
            CL-->>U: ⚠️ System Error
        else Config Valid
            CL->>S: Initialize Empty History
            CL-->>U: 👋 Welcome Message
        end
    end

    rect rgb(255, 243, 224)
        Note over U,MW: Message Processing Loop
        U->>CL: Send Message
        CL->>S: Store User Message
        CL-->>U: 🧠 "Thinking..."
        CL->>MW: POST {message, history}

        alt HTTP 200 OK
            MW-->>CL: {response, category}
            alt Specialized Category
                CL-->>U: 📞 "Connecting to Specialist..."
                CL->>CL: ⏳ Wait 2s (UX Delay)
            else General Category
                CL->>CL: Remove Status Message
            end
            CL-->>U: 💬 AI Response
            CL->>S: Store Assistant Response
        else HTTP Error
            CL-->>U: ⚠️ Service Unavailable
        else Timeout (45s)
            CL-->>U: ⚠️ Timeout Error
        else Connection Failed
            CL-->>U: ⚠️ Connection Error
        end
    end
```
```mermaid
flowchart TD
    A([📨 User Message Received]) --> B[Update Session History<br/>with User Message]
    B --> C[Display Status:<br/>'🧠 Thinking...']
    C --> D[Prepare JSON Payload<br/>& Auth Headers]
    D --> E{Async POST<br/>to Backend API}

    E -->|✅ Status 200| F[Parse JSON Response]
    E -->|❌ Status 4xx/5xx| G[/"⚠️ Service Unavailable"/]
    E -->|⏱️ Timeout 45s| H[/"⚠️ Timeout Error"/]
    E -->|🌐 Network Error| I[/"⚠️ Connection Error"/]

    F --> J{Check Response<br/>Category}
    J -->|Specialized<br/>Orders/Returns/etc| K[Update Status:<br/>'📞 Connecting to Specialist...']
    J -->|General or<br/>AccountProfileOther| L[Remove Status<br/>Message]

    K --> M[⏳ Transition Delay<br/>2 seconds]
    M --> N
    L --> N[Send AI Response<br/>to User]
    N --> O[Update Session History<br/>with Assistant Response]
    O --> P([✅ Ready for Next Message])

    G --> P
    H --> P
    I --> P

    style A fill:#e3f2fd
    style P fill:#c8e6c9
    style G fill:#ffcdd2
    style H fill:#ffcdd2
    style I fill:#ffcdd2
```
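The category branch above reduces to a small routing decision. The category names are the examples used in this README's diagrams; the exact set of backend categories is an assumption:

```python
SPECIALIZED_CATEGORIES = {"OrderTracking", "Returns", "ProductInquiry"}


def plan_transition(category: str):
    """Return (status_text, delay_seconds) for the UX branch above.

    Specialized categories show a transition message and hold it for the
    2-second UX delay; any other category clears the status immediately.
    """
    if category in SPECIALIZED_CATEGORIES:
        return "📞 Connecting to Specialist...", 2.0
    return None, 0.0
```

Keeping the routing table in one place means new specialized categories can be added without touching the message-handling loop.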
```mermaid
stateDiagram-v2
    [*] --> Initializing: on_chat_start
    Initializing --> ConfigError: Missing ENV Variables
    Initializing --> Ready: ✅ Config Valid
    ConfigError --> [*]: Session Ends

    Ready --> Processing: User Message Received
    Processing --> Specialist: Category = Specialized
    Processing --> GeneralResponse: Category = General
    Processing --> ErrorState: API Failure

    Specialist --> TransitionDelay: Show "Connecting..."
    TransitionDelay --> Responding: After 2s
    GeneralResponse --> Responding: Immediate

    Responding --> Ready: ✅ Response Sent
    ErrorState --> Ready: ⚠️ Error Displayed

    note right of Processing
        Async HTTP Request
        with 45s Timeout
    end note

    note right of Specialist
        Categories like:
        - OrderTracking
        - Returns
        - ProductInquiry
    end note
```
```mermaid
graph TB
    subgraph "Environment Configuration"
        ENV[".env File"]
        ENV --> |BACKEND_API_URL| CFG
        ENV --> |BACKEND_API_SECRET| CFG
        ENV --> |BOT_NAME| CFG
        ENV --> |BOT_ROLE| CFG
        CFG["Config Class"]
    end

    subgraph "Chainlit Application"
        CFG --> APP["app.py"]
        subgraph "Event Handlers"
            START["@on_chat_start"]
            MSG["@on_message"]
        end
        subgraph "State Functions"
            GET["get_history()"]
            UPD["update_history()"]
        end
        APP --> START
        APP --> MSG
        MSG --> GET
        MSG --> UPD
    end

    subgraph "Session Layer"
        SESS["cl.user_session"]
        GET <--> SESS
        UPD --> SESS
    end

    subgraph "External Communication"
        HTTP["httpx.AsyncClient"]
        MSG --> HTTP
        HTTP --> |POST| API["Backend API"]
        API --> |JSON| HTTP
    end

    style ENV fill:#fff3e0
    style CFG fill:#e8f5e9
    style APP fill:#e3f2fd
    style SESS fill:#f3e5f5
    style API fill:#fce4ec
```
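The `Config` class in the diagram above could be sketched as follows. The field names follow the environment variables documented below; the defaults and the `missing()` helper are assumptions for illustration:

```python
import os


class Config:
    """Configuration loaded once from environment variables."""

    def __init__(self):
        self.backend_api_url = os.getenv("BACKEND_API_URL", "")
        self.backend_api_secret = os.getenv("BACKEND_API_SECRET", "")
        self.bot_name = os.getenv("BOT_NAME", "Assistant")
        self.bot_role = os.getenv("BOT_ROLE", "Support Specialist")

    def missing(self):
        """Names of required variables that are unset (checked at chat start)."""
        required = {
            "BACKEND_API_URL": self.backend_api_url,
            "BACKEND_API_SECRET": self.backend_api_secret,
        }
        return [name for name, value in required.items() if not value]
```

Validating the configuration once at startup (rather than on every request) is what allows the `@on_chat_start` handler to fail fast with a clear system error.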
```mermaid
flowchart LR
    subgraph "Request Phase"
        REQ[HTTP Request] --> |Try| SEND[Send to Backend]
    end

    subgraph "Response Handling"
        SEND --> |200| OK[✅ Process Response]
        SEND --> |4xx/5xx| ERR1[❌ HTTP Error]
        SEND --> |TimeoutException| ERR2[⏱️ Timeout]
        SEND --> |Exception| ERR3[🌐 Connection Error]
    end

    subgraph "User Feedback"
        OK --> |Success| RESP[💬 Show AI Response]
        ERR1 --> |Log + Display| MSG1["⚠️ Service Unavailable"]
        ERR2 --> |Log + Display| MSG2["⚠️ Timeout Message"]
        ERR3 --> |Log + Display| MSG3["⚠️ Connection Error"]
    end

    style OK fill:#c8e6c9
    style ERR1 fill:#ffcdd2
    style ERR2 fill:#ffcdd2
    style ERR3 fill:#ffcdd2
```
- Python 3.9 or higher
- `pip` (Python Package Manager)
- A running instance of the AI Middleware (Backend)
Clone the repository and navigate to the project directory:
```bash
git clone https://github.com/hitthecodelabs/EnterpriseAICchat-Chainlit.git
cd EnterpriseAICchat-Chainlit
```

Create a virtual environment:
```bash
# MacOS/Linux
python3 -m venv venv
source venv/bin/activate

# Windows
python -m venv venv
.\venv\Scripts\activate
```

Install dependencies:
```bash
pip install -r requirements.txt
```

Create a `.env` file in the root directory. You can use the provided example as a template:
```bash
cp .env.example .env
```

Required Variables:
| Variable | Description | Example |
|---|---|---|
| `BACKEND_API_URL` | Endpoint of your AI Middleware | `https://api.yourcompany.com/chat` |
| `BACKEND_API_SECRET` | Secure key to authenticate requests | `sk_prod_12345...` |
| `BOT_NAME` | Name displayed to the user | `Maria S.` |
| `BOT_ROLE` | Role description in welcome message | `Support Specialist` |
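A resulting `.env` might look like this (placeholder values taken from the table above; never commit real secrets):

```env
BACKEND_API_URL=https://api.yourcompany.com/chat
BACKEND_API_SECRET=sk_prod_12345...
BOT_NAME=Maria S.
BOT_ROLE=Support Specialist
```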
Run the application locally with hot-reloading enabled:
```bash
chainlit run app.py -w
```

The interface will be available at: http://localhost:8000.
```
├── .env.example       # Template for environment variables (Safe to commit)
├── .gitignore         # Ignores .env and venv (Security best practice)
├── app.py             # Main application logic (Async & Stateful)
├── chainlit.md        # Chainlit welcome markdown (Optional)
├── requirements.txt   # Project dependencies
└── README.md          # Project documentation
```
For production environments (e.g., Railway, AWS ECS, Docker), ensure the start command is configured to run headless:
```bash
chainlit run app.py --host 0.0.0.0 --port 8000 --headless
```

This project is Docker-ready. Ensure you pass the environment variables defined in `.env` securely through your container orchestration platform's secrets manager.
This software is provided "as is", without warranty of any kind. It is intended as a frontend template and requires a functioning backend API to process natural language queries.
- Customization: Edit the `Config` class in `app.py` or update your `.env` file to change the bot's behavior.
- Logging: The application uses Python's standard `logging` library. In a production environment, ensure these logs are piped to a monitoring tool such as Datadog or CloudWatch.