Generate an MCP Server with liblab CLI

You can generate an MCP (Model Context Protocol) server with liblab in two ways: using the liblab CLI or using the liblab hosting service.

The CLI option creates a local MCP server you can run in your own environment. The hosting service, on the other hand, instantly deploys a cloud-hosted MCP server and gives you a public URL that AI clients like Claude or tools like Cursor can use to access your API.

If you need a shareable, ready-to-use MCP server with no infrastructure setup, the hosted option is the best choice.

Both methods allow you to generate a fully functional MCP server from any OpenAPI specification, enabling natural language interaction with your API through AI tools.

This guide focuses on generating an MCP server using the liblab CLI.

The MCP server generated by liblab:

  • Automatically defines the MCP tools based on your API spec.
  • Makes them discoverable and usable by AI assistants like Claude or Cursor.

Before You Start

To generate an MCP server with liblab, you need:

  • A liblab account.
  • Your API’s OpenAPI spec, either in YAML or JSON format.

If you don’t have an OpenAPI spec yet, you can follow our framework guides to generate one. Alternatively, you can experiment with a publicly available OpenAPI specification.
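
If you only need something to experiment with, a minimal OpenAPI document is enough. The example below describes a hypothetical single-endpoint API (the title, server URL, and path are made up) and is written in JSON so it can be passed directly to liblab init:

{
  "openapi": "3.0.3",
  "info": { "title": "Example Todo API", "version": "1.0.0" },
  "servers": [{ "url": "https://api.example.com" }],
  "paths": {
    "/todos": {
      "get": {
        "operationId": "listTodos",
        "summary": "List all todo items",
        "responses": {
          "200": { "description": "A list of todo items" }
        }
      }
    }
  }
}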

Configure MCP for a New Project

First, create a separate folder for your SDK project and initialize the directory:

mkdir mcp-server
cd mcp-server
liblab init -s path/or/url/to/your/openapi.json

Follow the onscreen instructions. The liblab CLI will ask if you want to generate the Model Context Protocol (MCP) server. Press y for yes.

The CLI then creates a liblab.config.json file with all of your configuration settings in the current directory. It also asks whether you want to build the SDK now. Answer yes again, and liblab will generate both the SDK and the MCP server.

An example of the entire process:

liblab init -s path/or/url/to/your/openapi.json
Let's build your SDK with a few simple steps!

? Select languages for SDK generation <Selected-Programming-Languages>
? Please enter your project name Python
? Select the project licence <Selected-Licence>
? Generate Model Context Protocol (MCP) server to enable SDK integration with AI models? Yes
Successfully created liblab.config.json file.
? Do you want to build the SDK now? Yes
No hooks found, SDKs will be generated without hooks.

Your SDKs are being generated. Visit the liblab portal to view more details on your build(s).

✓ Python succeeded
✓ MCP succeeded

✨ Successfully generated SDKs for MCP, PYTHON! ♡ You can find them inside: ./output
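
At this point, the liblab.config.json in your project folder records the choices you made above. As a rough sketch (exact fields, nesting, and defaults vary by CLI version, and the values below are placeholders), it will look something like this with MCP generation enabled:

{
  "sdkName": "my-sdk",
  "specFilePath": "path/or/url/to/your/openapi.json",
  "languages": ["python"],
  "mcp": {
    "generate": true
  }
}

The mcp.generate flag shown here is the same setting that the next section toggles from the command line.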

Configure MCP for an Existing Project

If you already have an existing liblab project, simply set mcp.generate to true and then run liblab build -y to build your MCP server and SDKs as usual. MCP generation then becomes part of your standard build process.

liblab config mcp.generate true
liblab build -y
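
The first command should simply update liblab.config.json. If you prefer to edit the file by hand, adding an entry along the following lines has the same effect (the exact shape and placement of the mcp block is an assumption based on the mcp.generate key path):

"mcp": {
  "generate": true
}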

MCP Directory Structure

After the build, your output folder will look like this:

output/
├── <your-sdk>
└── mcp
    ├── package.json   # NPM project configuration
    ├── README.md      # MCP setup instructions
    ├── src            # Generated MCP server
    │   └── index.ts   # MCP server entry point
    └── sdk            # SDK used by the MCP server

Inside mcp/README.md, you'll find setup instructions and a list of available tools.

Running the MCP Server

Navigate to the MCP folder and run the setup script to install dependencies and build:

cd output/mcp
npm run setup

Connect the MCP Server to AI Tools

That's all there is to it. Your MCP server is built and ready to use. Now try it out by connecting it to your favorite AI tools, such as Claude or Cursor.
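
For example, Claude Desktop (claude_desktop_config.json) and Cursor (.cursor/mcp.json) both register MCP servers through an mcpServers entry. The snippet below is a sketch that assumes the setup script produces a Node.js entry point under output/mcp; check mcp/README.md for the exact command, arguments, and path your build uses:

{
  "mcpServers": {
    "my-api": {
      "command": "node",
      "args": ["/absolute/path/to/output/mcp/dist/index.js"]
    }
  }
}

After updating the config, restart the client so it picks up the new server and lists its tools.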

You can also learn more about how MCP works.