duvan-gallego/mcp-api-colombia


Api Colombia MCP Server

The Model Context Protocol (MCP) is a standardized protocol for managing context between large language models (LLMs) and external systems. This repository provides an MCP Server for the api-colombia API, allowing you to use the API through natural language. This MCP server supports the STDIO and Streamable HTTP transport types.

Api Colombia

In its creators' words, API Colombia is a public RESTful API that enables users to access a wide range of public information about the country of Colombia.

Getting started

After cloning the project, install the dependencies:

pnpm install

Once the dependencies are installed, generate the api-colombia client:

pnpm prepare

Test it using the MCP Inspector with the STDIO transport type:

npx @modelcontextprotocol/inspector node dist/index.js --stdio

Note: If you make changes to your code, remember to rebuild:

pnpm build
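MCP clients that launch servers over STDIO use the same command shown above for the Inspector. As an illustration, a configuration entry for such a client might look roughly like this; the server name is taken from this repository and the exact config file location depends on your client, so check its documentation:

```json
{
  "mcpServers": {
    "mcp-api-colombia": {
      "command": "node",
      "args": ["dist/index.js", "--stdio"]
    }
  }
}
```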

Test it using the MCP Inspector with the Streamable HTTP transport type:

  1. Start the HTTP server
pnpm start
  2. Start the MCP inspector
npx @modelcontextprotocol/inspector
  3. Connect the MCP inspector to the local HTTP server

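Under the hood, the Streamable HTTP transport exchanges JSON-RPC messages over HTTP with the server's /mcp endpoint (on port 3000, per the LM Studio configuration later in this README). The first message a client such as the Inspector sends is an initialize request, roughly of this shape (the exact protocolVersion and clientInfo values depend on the client and are shown here only as an example):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2025-03-26",
    "capabilities": {},
    "clientInfo": { "name": "mcp-inspector", "version": "0.0.0" }
  }
}
```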

Test it using LM Studio:

  1. Download LM Studio
  2. Download and load the model you want to use for testing. A small but good one at the moment is the Qwen3 4B Thinking 2507 model
  3. Build and start the MCP server
 pnpm build && pnpm start
  4. Add the MCP configuration in LM Studio by putting the following text in the mcp.json file or clicking the Add to LM Studio button. You can find more info about this here
{
  "mcpServers": {
    "mcp-api-colombia": {
      "url": "http://localhost:3000/mcp"
    }
  }
}


  5. Check that you can see all the available tools, then enable them and start interacting with them

Note: Since Qwen3 4B is a small model with a context length of only 32,768 tokens, you can't load all the tools at once; enable only the ones you will work with. Larger models that support a bigger context length don't have this problem.
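The tool list that LM Studio (or the Inspector) displays is fetched from the server with a standard MCP tools/list request, which looks like this:

```json
{ "jsonrpc": "2.0", "id": 2, "method": "tools/list" }
```

The response enumerates each tool's name, description, and input schema, which is also what counts against the model's context window when tools are enabled.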

License

MIT License
