On-Demand Compute Engine
SERVER URL
http://localhost:3000/api/mcp-server/compute-engine
CONNECTION JSON
{
  "mcpServers": {
    "compute-engine": {
      "command": "npx",
      "args": [
        "mcp-remote@latest",
        "http://localhost:3000/api/mcp-server/compute-engine"
      ]
    }
  }
}
MCP HOST CONNECTIONS
Connect this MCP server directly to your AI assistant tools by saving the JSON config above into your tool's MCP settings.
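For example, with Claude Desktop the config typically goes into claude_desktop_config.json (on macOS commonly at ~/Library/Application Support/Claude/claude_desktop_config.json; the exact path depends on platform and version). The entry mirrors the CONNECTION JSON above:
{
  "mcpServers": {
    "compute-engine": {
      "command": "npx",
      "args": [
        "mcp-remote@latest",
        "http://localhost:3000/api/mcp-server/compute-engine"
      ]
    }
  }
}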
DESCRIPTION
Serverless compute platform with x402 micropayments. Run Python, Node.js, or Rust code on demand. Train ML models, process large datasets, or run intensive computations. Pay only for execution time.
AVAILABLE TOOLS
x402_compute_credits
Check compute credits and payment status
execute_code
Execute code (Python/Node/Rust) with x402 payment
train_model
Train ML model on GPUs (charges via x402)
batch_process
Process large dataset (charges via x402)
CODE EXAMPLES
// Compute Engine with x402
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';

const client = new Client({ name: 'compute-engine-client', version: '1.0.0' });
await client.connect(new StreamableHTTPClientTransport(new URL('https://compute.x402.app/mcp')));

// Execute code (auto-pays for compute time)
const result = await client.callTool({
  name: 'execute_code',
  arguments: {
    language: 'python',
    code: 'import pandas as pd\ndf = pd.DataFrame({"x": [1, 2, 3]})\nprint(df.describe())',
    wallet: 'YourWallet...'
  }
});
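The other tools listed above follow the same callTool pattern. A minimal sketch, continuing from the client set up above; the argument names for x402_compute_credits and train_model (wallet, dataset_url, model_type, gpu_hours) are illustrative assumptions, not documented parameters:

// Check compute credits and payment status before running a paid job
const credits = await client.callTool({
  name: 'x402_compute_credits',
  arguments: { wallet: 'YourWallet...' }   // wallet parameter assumed, as in execute_code above
});
console.log(credits);

// Train an ML model on GPUs (charged via x402)
const training = await client.callTool({
  name: 'train_model',
  arguments: {
    wallet: 'YourWallet...',
    dataset_url: 'https://example.com/dataset.csv',   // hypothetical parameter
    model_type: 'xgboost',                            // hypothetical parameter
    gpu_hours: 1                                      // hypothetical parameter
  }
});
console.log(training);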
Official MCP SDK Documentation:
https://github.com/modelcontextprotocol/modelcontextprotocol
Installation:
npm install @modelcontextprotocol/sdk
Authentication:
- For MCP Hosts (Cursor, Windsurf, Claude Desktop), saving the JSON config in your tool will automatically trigger the browser OAuth flow
- For programmatic access, obtain an OAuth token by running "npx mcp-remote <url>" in the terminal and following the instructions (see the sketch below)
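A sketch of the programmatic path, assuming the official TypeScript SDK; mcp-remote acts as a local stdio bridge to the remote server and handles the OAuth flow on first run:

import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StdioClientTransport } from '@modelcontextprotocol/sdk/client/stdio.js';

// Spawn mcp-remote as a local bridge to the remote compute-engine server.
// On first run it opens the browser OAuth flow described above.
const transport = new StdioClientTransport({
  command: 'npx',
  args: ['mcp-remote@latest', 'http://localhost:3000/api/mcp-server/compute-engine']
});

const client = new Client({ name: 'compute-engine-client', version: '1.0.0' });
await client.connect(transport);

// List the tools exposed by the server (execute_code, train_model, ...)
console.log(await client.listTools());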