Proxy API

Start the HELM proxy, configure clients, and use all of the HTTP endpoints.

[Architecture diagram: an OpenAI SDK or fetch client switches base_url and sends the same request format either to the standalone proxy (localhost:9090: POST /v1/chat/completions, GET /helm/receipts, GET /helm/proofgraph) or to the kernel API (localhost:8080: POST /v1/chat/completions, GET /mcp/v1/capabilities, GET /api/v1/proofgraph/sessions). Governance headers and receipts (receipt ID, output hash, Lamport clock, decision ID) are attached while the request is forwarded to an OpenAI-compatible upstream provider, whose response is returned to the client.]

The standalone proxy focuses on drop-in traffic governance; the kernel surface adds the MCP and ProofGraph APIs.


HELM exposes two closely related HTTP surfaces:

  • helm proxy starts a standalone OpenAI-compatible proxy on port 9090 by default.
  • The full kernel/API surface runs on port 8080 and adds MCP plus ProofGraph endpoints.

Quick setup

helm proxy --upstream https://api.openai.com/v1
# Proxy running at http://localhost:9090

Point any OpenAI SDK to http://localhost:9090/v1 — that's it.


Client configuration

Python (OpenAI SDK)

import openai

# The SDK reads OPENAI_API_KEY from the environment as usual;
# only the base_url changes.
client = openai.OpenAI(base_url="http://localhost:9090/v1")

TypeScript (fetch)

const response = await fetch("http://localhost:9090/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  },
  body: JSON.stringify({
    model: "gpt-4",
    messages: [{ role: "user", content: "List files" }],
  }),
});

MCP clients

For Claude, Cursor, VS Code, and other MCP-compatible tools:

# Claude Desktop — one-click bundle
helm mcp pack --client claude-desktop --out helm.mcpb

# Cursor / VS Code / Windsurf — print config
helm mcp print-config --client cursor

See the full MCP Integration guide.


Endpoints

POST /v1/chat/completions

OpenAI-compatible proxy endpoint. Tool calls are intercepted and policy-checked. Available on both :9090 (standalone proxy) and :8080 (kernel API).

Standalone proxy headers:

Header                  Description
X-Helm-Receipt-ID       Receipt identifier for audit trail
X-Helm-Output-Hash      SHA-256 hash of the response
X-Helm-Lamport-Clock    Causal ordering clock
X-Helm-Decision-ID      Unique decision identifier

Kernel deployments additionally expose X-Helm-Verdict and X-Helm-Policy-Version.
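As a minimal sketch of what a client can do with these headers, the snippet below recomputes the SHA-256 output hash and compares Lamport clock values for causal ordering. It runs against a simulated response rather than a live proxy, and the exact bytes the proxy hashes (assumed here to be the raw response body, hex-encoded) are an assumption, not documented behavior.

```python
import hashlib

def verify_output_hash(body: bytes, headers: dict) -> bool:
    """Recompute SHA-256 over the response body and compare it to the
    X-Helm-Output-Hash header. The hashed byte range is an assumption."""
    expected = headers.get("X-Helm-Output-Hash", "").lower()
    return hashlib.sha256(body).hexdigest() == expected

def happened_before(clock_a: int, clock_b: int) -> bool:
    """Lamport clocks give a partial causal order: a strictly smaller
    value means the event cannot have happened after the other."""
    return clock_a < clock_b

# Simulated response pair, since no live proxy is assumed here.
body = b'{"choices": []}'
headers = {
    "X-Helm-Output-Hash": hashlib.sha256(body).hexdigest(),
    "X-Helm-Lamport-Clock": "17",
}
print(verify_output_hash(body, headers))  # True for this simulated pair
print(happened_before(3, int(headers["X-Helm-Lamport-Clock"])))  # True
```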

GET /helm/receipts

Returns the local JSONL receipt log when you run helm proxy:

curl http://localhost:9090/helm/receipts
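Since the log is JSONL (one JSON object per line), it can be parsed line by line. The sketch below uses an inline sample instead of a live proxy, and the field names (receipt_id, lamport_clock, output_hash) are illustrative assumptions, not the documented schema.

```python
import json

# Sample JSONL payload standing in for the /helm/receipts response.
raw = "\n".join([
    '{"receipt_id": "r-001", "lamport_clock": 2, "output_hash": "cd34"}',
    '{"receipt_id": "r-002", "lamport_clock": 1, "output_hash": "ab12"}',
])

# Parse each non-empty line, then sort by Lamport clock for causal order.
receipts = [json.loads(line) for line in raw.splitlines() if line.strip()]
receipts.sort(key=lambda r: r["lamport_clock"])
print([r["receipt_id"] for r in receipts])  # ['r-002', 'r-001']
```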

GET /helm/proofgraph

Returns the in-memory ProofGraph summary for the standalone proxy:

curl http://localhost:9090/helm/proofgraph | jq .

GET /api/v1/proofgraph/sessions

Lists ProofGraph sessions on the full kernel/API surface:

curl 'http://localhost:8080/api/v1/proofgraph/sessions?limit=10' | jq .

GET /api/v1/proofgraph/sessions/{session_id}/receipts

Returns receipts for one ProofGraph session on the full kernel/API surface:

curl 'http://localhost:8080/api/v1/proofgraph/sessions/<session_id>/receipts' | jq .
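If each receipt in a session links back to its predecessor by hash, the chain can be checked offline. The sketch below assumes a prev_hash field and canonical-JSON SHA-256 hashing; both are illustrative assumptions about the receipt format, not the documented scheme.

```python
import hashlib
import json

def receipt_hash(receipt: dict) -> str:
    # Canonical-JSON hash; the real hashing scheme is an assumption here.
    payload = json.dumps(receipt, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def verify_chain(receipts: list) -> bool:
    """Check that each receipt's assumed prev_hash field matches the
    hash of the preceding receipt, as a hash-linked chain requires."""
    for prev, cur in zip(receipts, receipts[1:]):
        if cur.get("prev_hash") != receipt_hash(prev):
            return False
    return True

r1 = {"receipt_id": "r-001", "prev_hash": None}
r2 = {"receipt_id": "r-002", "prev_hash": receipt_hash(r1)}
print(verify_chain([r1, r2]))  # True
```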

GET /api/v1/proofgraph/receipts/{receipt_hash}

Returns one receipt by hash on the full kernel/API surface.

GET /mcp/v1/capabilities

Lists governed MCP capabilities:

curl http://localhost:8080/mcp/v1/capabilities | jq '.tools[].name'

POST /mcp/v1/execute

Executes a governed MCP tool call:

curl -X POST http://localhost:8080/mcp/v1/execute \
  -H 'Content-Type: application/json' \
  -d '{"method":"file_read","params":{"path":"/tmp/test.txt"}}' | jq .

GET /healthz

Returns 200 if the proxy or kernel API is running.


Error codes

Status  Meaning
200     Success: tool call allowed and proxied
403     Policy denied the tool call (body includes reason_code)
502     Upstream provider error
503     Policy engine error (fail-closed: HELM denies rather than crashing)

Next steps

Goal                          Guide
Verify receipt chains         Receipts & Verification
Add HELM to your MCP client   MCP Integration
Get running in 5 minutes      Quickstart