MCP server for docs

Your docs, inside every AI tool

Turn your documentation into an MCP server. Developers query your knowledge base from Claude, Cursor, Windsurf, or any MCP-compatible tool, and get answers grounded in your actual content. One hosted endpoint, live in 15 minutes.

14-day free trial · Self-serve · Live in 15 min
Claude · biel-docs MCP
How do I configure the rate_limit on a route in Acme's API? Need to support burst windows.
Called biel-docs.search_docs · 2 sources

PATCH the route with rate_limit. Each route has its own bucket scoped to the path; authenticated and anonymous traffic are metered separately.

curl -X PATCH https://api.acme.dev/v1/routes/checkout \
  -H "Content-Type: application/json" \
  -d '{"rate_limit":{"requests":100,"per":"minute","burst":50}}'

The burst field caps transient surges. When clients exceed the limit, the API returns 429 with a Retry-After header.

api/rate-limits.md § Per-route configuration
api/config.md § rate_limit reference

Used by teams that take their documentation seriously

GrepTime
ScyllaDB
SecuroSys
Katalon
Talon.one
Tezos
CrazyGames
Without MCP vs with MCP

The same question, two days apart.

Today, developers ask their AI assistant a question, get a generic answer, then go hunt your docs to confirm. Tomorrow, the same assistant queries your docs first and grounds the answer. The difference is one MCP endpoint.

Without MCP · generic LLM
Guess
how do I configure rate_limit on an Acme route?
No tools available · answering from training data
For most APIs you'll set a rate_limit field somewhere on the route or endpoint config. Common patterns are {"requests": N, "per": "minute"} or middleware-based limiters like express-rate-limit.
// Generic example — verify against Acme's actual API
app.use('/checkout', rateLimit({
  windowMs: 60 * 1000,
  max: 100,
}));
You'll want to check Acme's documentation for the specific configuration shape.
Sources
No documentation accessible · suggested reading: Acme API reference
Generic guess · no grounding · Developer goes to docs →
With MCP · Claude + biel-docs
Answer
how do I configure rate_limit on an Acme route?
Called biel-docs.search_docs · grounded in 2 sources
Set rate_limit on the route. Each endpoint has its own bucket scoped to the route path; authenticated and anonymous traffic are metered separately.
# PATCH a single route
curl -X PATCH https://api.acme.dev/v1/routes/checkout \
  -H "Authorization: Bearer sk_..." \
  -H "Content-Type: application/json" \
  -d '{"rate_limit":{"requests":100,"per":"minute"}}'
For burst handling, add burst (transient surge cap). Limits are enforced at the edge before reaching your handler.
Sources
api/rate-limits.md § Per-route configuration
api/config.md § rate_limit reference
1 synthesized answer · 2 sources · User has the answer ✓
Compatible tools

One server. Every AI tool.

Set up your MCP server once. It works with every tool that supports the Model Context Protocol: the ones shipping today and the ones that ship next quarter.

Claude Desktop

Chat with your docs in Claude. Configure once in claude_desktop_config.json (see the config sketch after this list).

Claude Code

CLI-powered doc lookups while you're heads-down in a terminal.

Cursor

Docs context inside the editor. Tab-complete with awareness of your stack.

Windsurf

AI coding with your docs in scope. Cascade can read pages, not guess.

VS Code + Copilot

Copilot grounded in your docs through the MCP extension.

Cline

Autonomous agents with read access to your docs as a tool.
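
Most of these tools read the same kind of JSON config. A minimal claude_desktop_config.json sketch, assuming the mcp-remote bridge (a common way to connect stdio-based clients to a hosted endpoint; newer client builds may accept remote URLs directly, so check your tool's docs):

{
  "mcpServers": {
    "biel-docs": {
      "command": "npx",
      "args": ["mcp-remote", "https://mcp.biel.ai/v2/<project>/mcp"]
    }
  }
}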

How it works

Three steps, fifteen minutes.

No infrastructure. No Docker containers. No self-hosting. Connect your docs, get a hosted MCP endpoint, paste it into your AI client.
  1. Connect your docs

     Paste your documentation URL. Biel.ai crawls, chunks, and indexes your content automatically. Twenty-plus platforms supported out of the box.

     Create an account →

  2. Get your MCP endpoint

     Biel.ai generates a hosted MCP server URL for your project. No Docker images, no infrastructure to manage, no scaling to worry about.

     https://mcp.biel.ai/v2/<project>/mcp

  3. Add it to any AI tool

     Paste the URL into Claude, Cursor, or any MCP client (see the config sketch below). Your team gets answers grounded in your actual docs, same index as the chatbot widget.
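
For example, a minimal .cursor/mcp.json sketch (Cursor's MCP config file; other clients use a similar shape, so check your tool's docs for the exact location and keys):

{
  "mcpServers": {
    "biel-docs": {
      "url": "https://mcp.biel.ai/v2/<project>/mcp"
    }
  }
}
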
Supported platforms

Every documentation source. One MCP server.

Your docs are probably scattered. A Docusaurus site, an internal Confluence space, a few GitHub repos, maybe a Notion handbook. Biel.ai indexes all of them and exposes one endpoint to your AI tools.

Things teams ask during evaluation

What is the Model Context Protocol (MCP)?

MCP is an open protocol introduced by Anthropic that lets AI assistants connect to external tools and data sources in a standardized way. Instead of every model and every data source defining a custom integration, MCP gives them a shared language. Your docs become one of those data sources.
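
Under the hood, MCP messages are plain JSON-RPC 2.0. A sketch of the tools/call request a client sends when it queries a docs server (the search_docs tool name is taken from the demo above; the exact tool surface depends on the server):

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_docs",
    "arguments": { "query": "rate_limit configuration" }
  }
}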

Do I need to self-host the MCP server?

No. Every Biel.ai project ships with a hosted MCP endpoint at mcp.biel.ai. You don't run Docker containers, manage scaling, or operate infrastructure. Paste the URL into your AI client and you're done.

Which AI tools support MCP today?

Claude Desktop, Claude Code, Cursor, Windsurf, VS Code with the right extension, Cline, and Zed all support MCP natively. The list grows steadily; anything that speaks the protocol works without changes on your end.

How does the MCP server stay in sync with my docs?

Biel.ai recrawls your docs on a schedule (configurable per source) and on-demand via webhook. Updates flow into the same index the chatbot uses, so MCP queries always see the same content readers see.

Can multiple team members use the same MCP server?

Yes. The endpoint is shared per project. Each team member adds the same URL to their MCP client config. For internal docs, scope a token to your workspace; for public docs, no auth is required.
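
A hypothetical sketch of a client config with a scoped token, assuming your client (or the mcp-remote bridge, which accepts a --header flag) supports custom headers; the actual token format and auth mechanism come from your Biel.ai dashboard:

{
  "mcpServers": {
    "biel-docs-internal": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mcp.biel.ai/v2/<project>/mcp",
        "--header",
        "Authorization: Bearer <workspace-token>"
      ]
    }
  }
}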

Can I expose internal docs alongside public docs?

Yes. Mix Confluence, Notion, GitHub repos, and your public docs in the same project. The MCP server can serve all of them under one endpoint, or you can split internal and external into separate projects with separate tokens.

Does MCP replace the chatbot widget?

No, they complement each other. The widget serves readers on your docs site. MCP serves developers inside their AI tools. Same index, two surfaces. Most teams ship both, answering questions wherever the user happens to be.

Is the MCP server secure?

Yes. Read-only access, scoped per project token. EU-hosted by default. Content is never used to train models. For internal docs, tokens can be rotated and revoked from the dashboard.

Turn your docs into an MCP server.

14-day free trial · No credit card required · From $50/mo