Managing Your Store with AI
Use Claude Code and the Vantemo MCP server to manage your e-commerce store with natural language.
Vantemo is designed from the ground up to be AI-friendly. Every API endpoint includes
machine-readable descriptions (x-llm-description), the full API spec is available
as structured data, and the upcoming MCP server enables direct AI-to-API communication.
This guide shows how to manage your Vantemo store using AI assistants like Claude.
Why AI + Vantemo?
Traditional e-commerce admin dashboards require clicking through many screens to accomplish simple tasks. With AI-powered management, you can:
- "Show me yesterday's orders over €100" — instant filtered view
- "Create a 20% discount code for returning customers" — one sentence, done
- "Update the price of all hoodies by 10%" — bulk operations in natural language
- "What's my conversion rate this week vs last week?" — analytics on demand
The Three-Tier AI Stack
Vantemo provides AI integration at three levels:
Tier 1: llms.txt (Passive Context)
Every Vantemo docs deployment includes /docs/llms.txt — a machine-readable index
of all documentation pages. AI assistants can fetch this to understand the platform.
A comprehensive /docs/llms-full.txt includes the complete API reference with all
endpoints, parameters, and response schemas.
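As a rough sketch of how an assistant (or a script) might consume such an index, the snippet below parses the common llms.txt convention of markdown bullet links with optional descriptions. The sample text is illustrative, not the contents of the live file.

```python
import re

def parse_llms_txt(text: str) -> list[dict]:
    """Parse an llms.txt-style index: markdown bullet links, optional ': description'."""
    entries = []
    for line in text.splitlines():
        m = re.match(
            r"-\s*\[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)(?::\s*(?P<desc>.*))?",
            line.strip(),
        )
        if m:
            entries.append(m.groupdict())
    return entries

# Sample index in the llms.txt convention (illustrative content only)
sample = """\
# Vantemo
- [API Reference](https://vantemo.com/docs/api): Full endpoint documentation
- [Authentication](https://vantemo.com/docs/auth): API key best practices
"""

for entry in parse_llms_txt(sample):
    print(entry["title"], "->", entry["url"])
```

In practice you would fetch https://vantemo.com/docs/llms.txt and feed the response body to the parser (or simply to the AI assistant directly).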
Tier 2: API Reference (Structured Knowledge)
The OpenAPI 3.1.0 specification includes x-llm-description fields on every endpoint,
providing AI-optimized descriptions that go beyond standard documentation. AI coding
assistants can use the spec to generate correct API calls.
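To illustrate how a tool might surface those fields, here is a minimal sketch that walks an OpenAPI 3.1 document and collects the x-llm-description extension from each operation. The sample spec below is a toy example, not the real Vantemo specification.

```python
# Toy OpenAPI 3.1 fragment (illustrative; endpoint and wording are assumptions)
sample_spec = {
    "openapi": "3.1.0",
    "paths": {
        "/products": {
            "get": {
                "summary": "List products",
                "x-llm-description": "Returns a paginated list of products; "
                                     "supports filtering by status or stock.",
            }
        }
    },
}

def llm_descriptions(spec: dict) -> dict[str, str]:
    """Map 'METHOD /path' -> x-llm-description for every operation that has one."""
    out = {}
    for path, ops in spec.get("paths", {}).items():
        for method, op in ops.items():
            if isinstance(op, dict) and op.get("x-llm-description"):
                out[f"{method.upper()} {path}"] = op["x-llm-description"]
    return out

print(llm_descriptions(sample_spec))
```

Feeding a digest like this to an AI assistant is often more token-efficient than pasting the full spec.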
Tier 3: MCP Server (Direct Interaction)
Coming Soon — The MCP server is in development (Phase 4).
The Model Context Protocol server will allow AI assistants to directly interact with your store's API. Instead of generating code that you then run, the AI can execute operations itself — with your approval.
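Once the server ships, configuration will likely follow the standard MCP client pattern. The snippet below shows the general shape of an MCP server entry in a Claude Desktop config file; the command, package name, and environment variable are hypothetical placeholders, not a published Vantemo interface.

```json
{
  "mcpServers": {
    "vantemo": {
      "command": "npx",
      "args": ["-y", "@vantemo/mcp-server"],
      "env": {
        "VANTEMO_API_KEY": "vt_sk_live_YOUR_KEY_HERE"
      }
    }
  }
}
```

Check the MCP Setup guide for the actual configuration once the server is released.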
Getting Started Today
Even without the MCP server, you can use AI assistants effectively with Vantemo:
1. Feed the AI Your API Spec
Point your AI assistant at the llms.txt endpoint:
Fetch https://vantemo.com/docs/llms.txt for an overview of the Vantemo API,
then fetch https://vantemo.com/docs/llms-full.txt for the complete reference.
2. Provide Your API Key
Share your secret key with the AI assistant in a secure context (e.g., environment variable or local config):
export VANTEMO_API_KEY=vt_sk_live_YOUR_KEY_HERE
3. Ask Natural Language Questions
The AI can now generate and execute API calls on your behalf:
- "List all products that are out of stock"
- "Create a new blog post about our summer collection"
- "Show me the top 10 customers by total spend"
- "Set up a webhook for order.created events"
Security Considerations
- Use restricted keys for AI integrations with only the permissions needed
- Set key expiration — rotate AI-facing keys regularly
- Monitor usage — check the API key dashboard for unusual patterns
- Review before executing — always review AI-generated mutations before confirming them, especially for destructive operations
Next Steps
- MCP Setup — Configure the MCP server (coming soon)
- API Reference — Explore the full API
- Authentication — API key best practices