MCP Server

The Kova MCP server lets AI assistants like Claude, Cursor, and other MCP-compatible tools manage your deployments, browse providers, and interact with AI endpoints directly.

What is MCP?

The Model Context Protocol (MCP) is an open standard that lets AI assistants call tools exposed by external servers. Kova's MCP server exposes 17 tools covering deployments, providers, AI inference, and account management -- all over a simple JSON-RPC stdio transport.

Setup

1. Get an API Key

Create an API key from the dashboard at Settings → API Keys, or use the API directly:

curl -X POST https://app.kovanetwork.com/api/v1/api-keys \
  -H "Authorization: Bearer YOUR_JWT_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"name": "mcp-server"}'
2. Install the Package

Install the Kova MCP server globally:

npm install -g @kova/mcp

Or use it directly with npx (no install needed):

npx @kova/mcp
3. Configure Your AI Client

Add the server to your MCP client config. For Claude Code, add to ~/.claude.json:

{
  "mcpServers": {
    "kova": {
      "command": "npx",
      "args": ["@kova/mcp"],
      "env": {
        "KOVA_API_TOKEN": "kova_xxxx.xxxxxxxx"
      }
    }
  }
}

For Cursor, add to .cursor/mcp.json in your project:

{
  "mcpServers": {
    "kova": {
      "command": "npx",
      "args": ["@kova/mcp"],
      "env": {
        "KOVA_API_TOKEN": "kova_xxxx.xxxxxxxx"
      }
    }
  }
}
4. Verify Connection

Once configured, your AI assistant should list the Kova tools. Try asking it to "list my deployments" or "show network providers".

Environment Variables

Variable         Description                      Default
KOVA_API_TOKEN   Your API key for authentication  (required)
KOVA_API_URL     Base URL for the Kova API        https://app.kovanetwork.com

Token Required

The server will start without a token, but every authenticated endpoint will fail. Always set KOVA_API_TOKEN before connecting.

Available Tools

Deployments

Tool                  Description
list_deployments      List all your deployments with optional state filter
get_deployment        Get detailed info about a deployment (SDL, leases, resources)
create_deployment     Create a new deployment from an SDL manifest
close_deployment      Terminate a deployment and refund remaining escrow
get_deployment_logs   Fetch recent logs from a deployment service
estimate_cost         Estimate hourly/monthly cost for an SDL manifest
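Under the hood, each tool is invoked with a standard MCP tools/call request. A sketch of what a list_deployments call might look like -- the "state" argument name and "active" value are assumptions based on the filter described above; the exact input schema is returned by tools/list:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "list_deployments",
    "arguments": { "state": "active" }
  }
}
```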

AI Inference

Tool                 Description
list_ai_models       Browse available AI models (LLMs, image gen, audio, embeddings)
deploy_ai_model      Deploy a model as an inference endpoint
list_ai_endpoints    List your active inference endpoints
delete_ai_endpoint   Delete an inference endpoint

AI Chat

Tool                  Description
create_chat_session   Start a chat session with a model
send_chat_message     Send a message and get a response
list_chat_sessions    List your chat sessions

Network & Account

Tool                  Description
list_providers        Browse compute providers with optional GPU filter
list_templates        Browse deployment templates by category
get_network_stats     Get network stats (nodes, deployments, capacity)
get_account_balance   Get your balance and escrow totals

Example Usage

Once connected, you can interact naturally with your AI assistant:

  • "Deploy a Node.js app with 1 CPU and 512MB RAM"
  • "Show me providers with RTX 4090 GPUs"
  • "What are the logs for my deployment abc123?"
  • "How much would it cost to run a 2 CPU, 4GB RAM deployment?"
  • "Deploy llama-3-8b as an inference endpoint"
  • "Close deployment xyz789"
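Behind the scenes, the assistant translates each request into a tool call. For example, "Show me providers with RTX 4090 GPUs" could map to a tools/call against list_providers -- the "gpu" argument name below is an assumption for illustration; check the schema from tools/list for the real shape:

```json
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "tools/call",
  "params": {
    "name": "list_providers",
    "arguments": { "gpu": "rtx4090" }
  }
}
```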

Running Standalone

You can also run the server directly for testing:

KOVA_API_TOKEN=kova_xxxx.xxxxxxxx npx @kova/mcp

Or if installed globally:

KOVA_API_TOKEN=kova_xxxx.xxxxxxxx kova-mcp

The server reads JSON-RPC messages from stdin and writes responses to stdout. Diagnostic messages go to stderr.
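Because the transport is newline-delimited JSON-RPC over stdio, you can script the handshake yourself. A minimal Python sketch that builds the lifecycle messages (it only constructs and prints the frames; the clientInfo values are arbitrary placeholders -- pipe the output into npx @kova/mcp to try it against the real server):

```python
import json

def frame(msg: dict) -> str:
    # Each JSON-RPC message is a single line of compact JSON ending in a newline.
    return json.dumps(msg, separators=(",", ":"))

messages = [
    # 1. Client initiates the MCP handshake.
    {"jsonrpc": "2.0", "id": 1, "method": "initialize",
     "params": {"protocolVersion": "2024-11-05",
                "capabilities": {},
                "clientInfo": {"name": "example-client", "version": "0.1.0"}}},
    # 2. Client acknowledges the server's initialize response (a notification, so no id).
    {"jsonrpc": "2.0", "method": "notifications/initialized"},
    # 3. Ask the server which tools it exposes.
    {"jsonrpc": "2.0", "id": 2, "method": "tools/list"},
]

for msg in messages:
    print(frame(msg))
```

With KOVA_API_TOKEN set, piping these lines into the server should produce an initialize result followed by the tool definitions on stdout.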

Protocol Details

  • Transport: stdio (stdin/stdout)
  • Protocol: JSON-RPC 2.0
  • MCP Version: 2024-11-05
  • Server ID: kova-network v1.0.0

The server supports the standard MCP lifecycle: initialize → notifications/initialized → tools/list → tools/call, plus ping for health checks.
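For a quick liveness check, a ping exchange is a plain JSON-RPC request with an empty result, per the MCP specification:

```json
{"jsonrpc": "2.0", "id": 7, "method": "ping"}
{"jsonrpc": "2.0", "id": 7, "result": {}}
```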