mcpt: The curated registry for MCP servers
Discover quality, up-to-date MCP servers with a curated registry.
We’re making it easier to build with MCP.
MCP servers connect AI apps with your product, and earlier this month we launched a way to generate them directly from your documentation.
Today, we’re introducing mcpt, a managed registry where people can explore high-quality MCP servers in one place.
Navigate the fragmented MCP ecosystem
The Model Context Protocol (MCP) has exploded in popularity in the few months since Anthropic first introduced it.
But at this critical moment, when developers are most eager to learn and build, it's too hard for them to find quality servers. Some MCP registries exist, but they're either too small to be useful or so crowded with user-generated servers that it's hard to find what you need.
mcpt is a managed registry that brings together high-quality third-party servers and official MCP servers generated directly from our customers' documentation.
What sets our curated directory apart is that Mintlify-generated MCP servers are always in sync with the latest updates—whenever documentation changes, the corresponding server updates automatically. This means anyone browsing mcpt always gets the most up-to-date and accurate view of your product’s capabilities.
What are MCP servers and why do they matter?
For those new to MCP, here's a quick refresher: the Model Context Protocol is a standard that provides a universal way for AI applications to connect with external data sources and tools.
MCP follows a client-server architecture:
- MCP Clients like Claude Desktop, Windsurf, or Cursor are the interfaces users interact with
- MCP Servers act as intermediaries between these clients and external data or tools, like an API for LLMs
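To make the client–server relationship concrete, connecting a client to a server is typically a one-entry config: the client launches the server process and speaks MCP to it over stdio. The sketch below uses the Claude Desktop config shape (a top-level `mcpServers` object); the server name and package are hypothetical placeholders, not a real published server:

```json
{
  "mcpServers": {
    "example-docs": {
      "command": "npx",
      "args": ["-y", "@example/docs-mcp-server"]
    }
  }
}
```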
As AI apps increasingly become the intermediary between your product and users, MCP servers play a crucial role in ensuring your product capabilities are accurately represented.
Optimizing for an AI-first documentation future
At Mintlify, we're rethinking how documentation can serve both human readers and LLMs.
Earlier this month, we launched our MCP Server Generator to automatically create servers from your existing documentation or OpenAPI spec, without any custom code needed.
These docs-based servers enable AI applications to:
- Provide up-to-date answers about your documentation. This is similar to offering AI Chat, but directly in a tool like Claude
- Interact with your API in real-time, such as generating authenticated requests or querying data
Beyond MCP, we’ve released many improvements for AI documentation consumption:
- Auto-hosting /llms.txt — Serves as a sitemap for AI, helping LLMs efficiently index your content
- Auto-hosting /llms-full.txt — Provides your entire documentation as a single markdown file for easier loading into an LLM's context window
- Markdown versions of all pages — Add .md to any page URL or use ⌘ + C to copy content in markdown format for easier context loading
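Since every page exposes a markdown variant at its URL plus `.md`, a tool that wants raw markdown for context loading can construct that URL mechanically. A minimal Python sketch; the example domain and the root-page fallback are assumptions for illustration:

```python
from urllib.parse import urlsplit, urlunsplit

def markdown_url(page_url: str) -> str:
    """Return the markdown variant of a docs page URL by appending '.md',
    per the convention described above. Query string and fragment are dropped."""
    scheme, netloc, path, _query, _frag = urlsplit(page_url)
    # Strip a trailing slash so '.md' lands on the page name itself;
    # '/index' for the bare root is a hypothetical fallback, not a documented rule.
    path = path.rstrip("/") or "/index"
    return urlunsplit((scheme, netloc, path + ".md", "", ""))

print(markdown_url("https://example.com/docs/quickstart"))
# -> https://example.com/docs/quickstart.md
```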
What's next
We're investing in MCP to drive AI adoption across the developer community and beyond, as we see it as a fundamental shift in how products integrate with AI tools.
But ahead of widespread usage, MCP faces two key challenges: distribution and usability. We’re tackling the distribution challenge with mcpt, and we have more coming soon for the latter.
There’s never been a better time to build, and we’re excited for what’s next.