This repository contains a client implementation for interacting with MCP (Model Context Protocol) server scripts. It enables communication between Ollama LLM models and tool-based servers through a standardized interface.
The MCP Client provides a bridge between Ollama language models and server-side tools. It:
- Connects to MCP-compatible server scripts (Python or JavaScript)
- Discovers available tools exposed by the server
- Processes user queries through Ollama LLM
- Detects when the LLM wants to use tools and executes them
- Returns results back to the LLM to generate a final response
- Tool Discovery: Automatically detects and lists tools from connected servers
- Interactive Chat Loop: Provides a command-line interface for interacting with the system
- Timezone Awareness: Includes current time and timezone information with each query
- Error Handling: Robust error handling for tool calls and response processing
- Python 3.7+
- MCP library
- Ollama (with llama3.2:3b-instruct-q8_0 model or similar)
- zoneinfo (part of the standard library on Python 3.9+; available as the backports.zoneinfo package on older versions)
Install the required dependencies:
pip install mcp ollama
Ensure Ollama is installed and the specified model is downloaded:
ollama pull llama3.2:3b-instruct-q8_0
Run the client by specifying the path to an MCP-compatible server script:
python client.py path/to/server_script.py

The client supports both Python (.py) and JavaScript (.js) server scripts.
The main class handling the MCP client functionality.
__init__ initializes the client with default configuration, including the Ollama model name (self.ollama_model).
Connects to an MCP server specified by the script path.
Parameters:
server_script_path (str): Path to the server script (.py or .js)
Raises:
ValueError: If the server script is not a .py or .js file
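The extension check behind that ValueError can be sketched as follows. This is an illustrative helper, not the client's actual code; the function name and the returned launch commands are assumptions for the example.

```python
from pathlib import Path

def validate_server_script(server_script_path: str) -> str:
    """Return the interpreter used to launch a server script,
    or raise ValueError for unsupported extensions.

    Hypothetical helper; the actual client may inline this logic
    when spawning the server process.
    """
    suffix = Path(server_script_path).suffix
    if suffix == ".py":
        return "python"  # launch with the Python interpreter
    if suffix == ".js":
        return "node"    # launch with Node.js
    raise ValueError("Server script must be a .py or .js file")
```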
process_query processes a user query using Ollama and the available tools.
Parameters:
query (str): The user's input query
Returns:
- String containing the LLM's response, potentially including tool execution results
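The query/tool round-trip can be sketched as below. This is a simplified illustration, not the client's implementation: the `chat` callable stands in for the Ollama chat API, and `tools` is an assumed mapping of tool names to callables, so the flow can be shown without a live model.

```python
import json

def process_query(query, chat, tools):
    """Sketch of the query round-trip: ask the LLM, execute a tool
    call if one is emitted, then ask again with the tool result.

    chat:  callable taking a prompt string and returning the LLM reply
    tools: dict mapping tool names to Python callables
    """
    reply = chat(query)
    if "---TOOL_START---" not in reply:
        return reply  # no tool needed; return the LLM's answer directly
    # Extract the tool name and JSON input from the marked block
    block = reply.split("---TOOL_START---")[1].split("---TOOL_END---")[0]
    name = block.split("TOOL:")[1].splitlines()[0].strip()
    args = json.loads(block.split("INPUT:")[1].strip())
    result = tools[name](**args)
    # Feed the tool result back to the LLM for a final response
    return chat(f"{query}\nTool {name} returned: {result}")
```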
Runs an interactive chat loop for continuous user interaction.
Cleans up resources and connections.
get_current_time() retrieves the current date, time, and timezone information.
Returns:
- Formatted string with local time in both 12-hour and ISO formats, along with timezone abbreviation
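A minimal sketch of such a helper, using the standard-library zoneinfo module. The timezone-name parameter and the exact output layout are assumptions for illustration; the actual method may format differently.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library on Python 3.9+

def get_current_time(tz_name: str = "UTC") -> str:
    """Return local time in 12-hour and ISO formats plus the
    timezone abbreviation (illustrative sketch)."""
    now = datetime.now(ZoneInfo(tz_name))
    return (f"{now.strftime('%I:%M %p')} "  # 12-hour clock
            f"({now.isoformat()}) "         # ISO 8601 form
            f"{now.tzname()}")              # timezone abbreviation
```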
The LLM will call tools using a specific format:
---TOOL_START---
TOOL: tool_name
INPUT: {"key": "value"}
---TOOL_END---
The client parses this format, validates the tool exists, and then calls the appropriate tool with the provided input parameters.
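A parser for this block format can be sketched with a regular expression. This is an illustrative version; the client's actual parser may differ in details such as whitespace tolerance.

```python
import json
import re

TOOL_CALL_RE = re.compile(
    r"---TOOL_START---\s*"
    r"TOOL:\s*(?P<name>\S+)\s*"
    r"INPUT:\s*(?P<input>\{.*?\})\s*"
    r"---TOOL_END---",
    re.DOTALL,
)

def extract_tool_call(text):
    """Return (tool_name, input_dict) from an LLM reply,
    or None if no tool-call block is present."""
    m = TOOL_CALL_RE.search(text)
    if m is None:
        return None
    return m.group("name"), json.loads(m.group("input"))
```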
The client handles various error scenarios:
- Invalid JSON in tool input
- Non-existent tools
- Server connection issues
- General exceptions during tool execution
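The error scenarios above can be handled in one guarded execution path. In this sketch (names and return style are assumptions), failures are turned into error strings rather than exceptions, so the message can be fed back to the LLM.

```python
import json

def safe_tool_call(name, raw_input, tools):
    """Execute a tool defensively, covering the error cases listed
    above: bad JSON, unknown tool, and execution failure."""
    try:
        args = json.loads(raw_input)
    except json.JSONDecodeError as e:
        return f"Error: invalid JSON input ({e})"
    if name not in tools:
        return f"Error: unknown tool '{name}'"
    try:
        return tools[name](**args)
    except Exception as e:  # tool raised during execution
        return f"Error: tool '{name}' failed ({e})"
```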
- Client connects to an MCP server
- User enters a query
- Query is sent to Ollama with available tools information
- Ollama decides if a tool is needed
- If yes, the client extracts the tool call and executes it
- Tool results are sent back to Ollama for final response
- Complete response is displayed to the user
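The interactive portion of this flow (steps 2-7) can be sketched as a simple loop. The I/O callables are injected so the loop is testable; the 'quit' exit command is an assumed convention, not necessarily the client's.

```python
def chat_loop(process_query, read_input=input, write=print):
    """Repeatedly read a query, process it, and display the result.

    process_query: callable taking a query string, returning a reply
    read_input / write: injectable I/O, defaulting to the console
    """
    while True:
        query = read_input("Query: ").strip()
        if query.lower() == "quit":
            break  # assumed exit command for this sketch
        write(process_query(query))
```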
- Change the Ollama model by modifying self.ollama_model in the __init__ method
- Adjust the system prompt in process_query to change how the LLM interacts with tools
- Modify get_current_time() if different timezone handling is required
- Only supports synchronous tool calls (one at a time)
- Limited to CLI interaction
- Requires server script to follow MCP protocol