TARX Docs

First local request

Make your first OpenAI-compatible request against the TARX runtime on your Computer.

Available now · Updated 2026-05-13

The first developer win is one local request. Keep the OpenAI-compatible shape, change the base URL, and confirm TARX answers through the Computer route.

Local endpoint

Start with the runtime on loopback. Do not route early setup through hosted Supercomputer.

curl http://127.0.0.1:11440/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"tarx-local","messages":[{"role":"user","content":"Hello from my Computer"}]}'
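The same request can be issued from Python. A minimal sketch using only the standard library; the port (11440) and the `tarx-local` model name come from the curl example above, and everything else is ordinary OpenAI-compatible request shape:

```python
import json
import urllib.request

BASE_URL = "http://127.0.0.1:11440/v1"  # local TARX runtime on loopback


def build_chat_request(model: str, content: str) -> dict:
    """Build an OpenAI-compatible chat completion payload."""
    return {"model": model, "messages": [{"role": "user", "content": content}]}


def post_chat(payload: dict) -> dict:
    """POST the payload to the local chat completions endpoint and parse the JSON reply."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    payload = build_chat_request("tarx-local", "Hello from my Computer")
    try:
        print(post_chat(payload))
    except OSError as err:  # runtime not listening yet
        print(f"runtime unreachable: {err}")
```

Because only the base URL is TARX-specific, the same payload works unchanged if the workload later moves off loopback.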

Expected result

A successful response proves the local contract is reachable. If local retrieval or memory is unavailable, report that as runtime status rather than presenting it as a clean empty search result.
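One way to surface that distinction in client code. This sketch assumes the standard OpenAI-compatible response shape (`choices[0].message.content`); anything that deviates from it is reported as runtime status instead of being silently treated as an empty result:

```python
def interpret_response(resp: dict) -> str:
    """Return the assistant's answer, or an explicit runtime-status line.

    Assumes the OpenAI-compatible response shape; any deviation is
    reported as status, never as a clean empty search.
    """
    choices = resp.get("choices") or []
    if choices and choices[0].get("message", {}).get("content"):
        return choices[0]["message"]["content"]
    if "error" in resp:
        return f"runtime status: {resp['error']}"
    return "runtime status: no content returned (check local retrieval/memory availability)"
```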

Next

Add a project key for external integrations, then decide whether the workload needs Supercomputer.
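When a project key enters the picture, it can be attached per request. This is a hedged sketch only: the `TARX_PROJECT_KEY` variable name and the `Bearer` header scheme are illustrative assumptions, not documented TARX behavior; check the project-key docs for the actual names:

```python
import os


def auth_headers() -> dict:
    """Build request headers, attaching a project key when one is configured.

    TARX_PROJECT_KEY and the Bearer scheme are hypothetical placeholders.
    Local loopback requests work without any key, matching the curl example.
    """
    headers = {"Content-Type": "application/json"}
    key = os.environ.get("TARX_PROJECT_KEY")  # hypothetical variable name
    if key:
        headers["Authorization"] = f"Bearer {key}"
    return headers
```

With no key set, the function returns only the `Content-Type` header, so the local-first workflow stays unchanged.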