Connect
Optimize
Secure
Announcing StackOne Defender: leading open-source prompt injection guard for your agent • Read More →
Production-ready Mintlify MCP server with extensible actions — plus built-in authentication, security, and optimized execution.
Coverage
Create, read, update, and delete across Mintlify — and extend your agent's capabilities with custom actions.
Authentication
Per-user OAuth in one call. Your Mintlify MCP server gets session-scoped tokens with zero credentials stored on your infra.
Agent Auth →Security
Every Mintlify tool response scanned for prompt injection in milliseconds — 88.7% accuracy, all running on CPU.
Prompt Injection Defense →Performance
Free up to 96% of your agent's context window to enhance reasoning and reduce cost, on every Mintlify call.
Tools Discovery →A Mintlify MCP server lets AI agents read and write Mintlify data through the Model Context Protocol — Anthropic's open standard for connecting LLMs to external tools. StackOne's Mintlify MCP server ships with pre-built actions, fully extensible via the Connector Builder — plus managed authentication, prompt injection defense, and optimized agent context. Connect it from MCP clients like Claude Desktop, Cursor, and VS Code, or from agent frameworks like OpenAI Agents SDK, LangChain, and Vercel AI SDK.
Every action from Mintlify's API, ready for your agent. Create, read, update, and delete — scoped to exactly what you need.
Trigger a documentation deployment to publish updates from your configured branch
Get the status of a deployment update by its status ID
Initiates an AI-powered background job to automatically generate documentation updates and create pull requests with the changes
Retrieves the current status, progress, and details of a specific agent job by its unique identifier for monitoring documentation generation
Sends additional instructions or refinements to an active agent job to iteratively improve documentation output without creating a new job
Search through documentation using semantic and keyword search
Retrieve the full text content of a documentation page by its path
Generate a streaming response from the AI assistant trained on your documentation
Retrieve user feedback from your documentation including page ratings and code snippet feedback
Retrieve user feedback counts aggregated by documentation page
Retrieve AI assistant conversation history including queries, responses, and cited sources
Retrieve documentation search terms with hit counts, click-through rates, and top clicked pages
Retrieve per-page and site-wide content view counts split by human and AI traffic
Retrieve per-page and site-wide approximate unique visitor counts split by human and AI traffic
One endpoint. Any framework. Your agent is talking to Mintlify in under 10 lines of code.
MCP Clients
Agent Frameworks
{
"mcpServers": {
"stackone": {
"command": "npx",
"args": [
"-y",
"mcp-remote@latest",
"https://api.stackone.com/mcp?x-account-id=<account_id>",
"--header",
"Authorization: Basic <YOUR_BASE64_TOKEN>"
]
}
}
}Anthropic's code_execution processes data already in context. Custom MCP code mode keeps raw tool responses in a sandbox. 14K tokens vs 500.
11 min
Benchmarking BM25, TF-IDF, and hybrid search for MCP tool discovery across 916 tools. The 80/20 TF-IDF/BM25 hybrid hits 21% Top-1 accuracy in under 1ms.
10 min
MCP tools that read emails, CRM records, and tickets are indirect prompt injection vectors. Here's how we built a two-tier defense that scans tool results in ~11ms.
12 min
origin_owner_id.All the tools you need to build and scale AI agent integrations, with best-in-class connectivity, execution, and security.