
Documentation Index

Fetch the complete documentation index at: https://docs.ruapi.ai/llms.txt

Use this file to discover all available pages before exploring further.

OpenClaw is a personal AI assistant you run on your own devices, accessible via Telegram, WhatsApp, Discord, Signal, iMessage, and more. It supports many model providers — RuAPI plugs in via either the Anthropic-compatible or OpenAI-compatible protocol.

Prerequisites

1. RuAPI account + API key. Console → Tokens → Create token. See API keys & security.
2. Node.js ≥ 18. Check with node --version; install from nodejs.org if missing.
3. OS. macOS and Linux run natively; on Windows, use WSL2.

Install OpenClaw

npm install -g openclaw@latest
Confirm:
openclaw --version

Configure RuAPI

OpenClaw uses the Anthropic SDK under the hood, so the simplest setup is ANTHROPIC_BASE_URL + ANTHROPIC_API_KEY env vars plus a model selection in ~/.openclaw/openclaw.json.
Method 1: openclaw.json env block

1. Run onboarding to create the config directory.
openclaw onboard
Walk through the wizard. When it asks for a model provider, pick anything (e.g. Anthropic API key) with a placeholder key; you’ll overwrite it next. This step exists to create ~/.openclaw/ and the initial config.
2. Edit openclaw.json.
Open ~/.openclaw/openclaw.json (create it if missing) and set:
{
  env: {
    ANTHROPIC_BASE_URL: "https://www.ruapi.ai",
    ANTHROPIC_API_KEY: "sk-your-RuAPI-token",
  },
  agents: {
    defaults: {
      model: { primary: "anthropic/claude-sonnet-4-6" },
    },
  },
}
Don’t append /v1 to ANTHROPIC_BASE_URL.
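Why this matters: the Anthropic SDK appends versioned paths such as /v1/messages to the base URL itself, so a base URL that already ends in /v1 produces a doubled path. A quick illustration in plain Python (simple string handling, not the SDK’s actual internals):

```python
def request_url(base_url: str, path: str = "/v1/messages") -> str:
    """Join an API path onto a base URL the way HTTP clients typically do."""
    return base_url.rstrip("/") + path

# Correct base URL: the SDK adds /v1/messages itself.
print(request_url("https://www.ruapi.ai"))     # https://www.ruapi.ai/v1/messages
# Base URL with a trailing /v1: the path gets doubled.
print(request_url("https://www.ruapi.ai/v1"))  # https://www.ruapi.ai/v1/v1/messages
```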
3. Restart the gateway.
If you installed it as a daemon:
openclaw onboard --install-daemon
Or start it manually:
openclaw gateway:start

Method 2: shell env vars

If you’d rather not edit the config file, set the vars in your shell:
Add these to ~/.zshrc or ~/.bashrc:
export ANTHROPIC_BASE_URL="https://www.ruapi.ai"
export ANTHROPIC_API_KEY="sk-your-key"
Then source the file or open a new terminal.
You still need agents.defaults.model in openclaw.json, otherwise OpenClaw will prompt you to pick a model on every run.
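With the env vars exported in your shell, the config file can be as small as the model default, in the same JSON5 style as above:

```json5
{
  agents: {
    defaults: {
      model: { primary: "anthropic/claude-sonnet-4-6" },
    },
  },
}
```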

First call

Send one message:
openclaw agent --message "Say hi in one English sentence" --local
You should get a reply, and RuAPI Console → Logs should show a call to claude-sonnet-4-6 (or whichever model you set). Once you’ve wired up Telegram / Discord / etc., messages on those channels will route through RuAPI too.
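To sanity-check the endpoint outside OpenClaw, you can hit it with any Anthropic-protocol client. A hedged curl sketch, assuming RuAPI implements the standard /v1/messages route and headers (substitute your real token; the exact model ID RuAPI expects may differ):

```
curl -s https://www.ruapi.ai/v1/messages \
  -H "x-api-key: sk-your-RuAPI-token" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{"model": "claude-sonnet-4-6", "max_tokens": 64,
       "messages": [{"role": "user", "content": "Say hi in one English sentence"}]}'
```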

Choosing a different model

Edit ~/.openclaw/openclaw.json:
{
  agents: {
    defaults: {
      model: { primary: "anthropic/claude-opus-4-7" },
    },
  },
}
OpenClaw uses <provider>/<model-id> naming. Common options:
Model ID                     | Use
anthropic/claude-sonnet-4-6  | Workhorse default
anthropic/claude-opus-4-7    | Heavier tasks
anthropic/claude-haiku-4-5   | Fast and cheap
Full list at the model gallery.
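The naming convention splits on the first slash: everything before it selects the provider, the rest is the model ID passed through. A small sketch of that convention (my illustration, not OpenClaw’s actual resolver):

```python
def split_model_ref(ref: str) -> tuple[str, str]:
    """Split a '<provider>/<model-id>' reference on the first slash."""
    provider, sep, model_id = ref.partition("/")
    if not sep or not provider or not model_id:
        raise ValueError(f"expected <provider>/<model-id>, got {ref!r}")
    return provider, model_id

print(split_model_ref("anthropic/claude-sonnet-4-6"))  # ('anthropic', 'claude-sonnet-4-6')
```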

Troubleshooting

  • ANTHROPIC_API_KEY must be your RuAPI token (sk-...), not Anthropic’s official key.
  • Confirm the key is enabled and the model is in any allowlist via RuAPI Console → Tokens.
  • The model ID must include the anthropic/ prefix, e.g. anthropic/claude-sonnet-4-6.
  • Don’t add /v1 to ANTHROPIC_BASE_URL.
  • Check connectivity: curl -I https://www.ruapi.ai.
  • If onboarding gets stuck, cancel it (Ctrl+C) and hand-edit ~/.openclaw/openclaw.json; the next start will use that file.
OpenClaw’s models.providers config supports fallback chains. Get RuAPI working first, then layer fallbacks per the OpenClaw configuration docs.
To use RuAPI’s OpenAI-compatible endpoint instead (note that here the baseUrl does include /v1, unlike the Anthropic setup), define a custom provider in openclaw.json:
{
  models: {
    providers: {
      ruapi: {
        api: "openai",
        baseUrl: "https://www.ruapi.ai/v1",
        apiKey: "sk-your-key",
        models: ["gpt-5", "gpt-5-mini"],
      },
    },
  },
  agents: {
    defaults: { model: { primary: "ruapi/gpt-5" } },
  },
}
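Since this route speaks the OpenAI protocol, you can verify it with a standard chat-completions request; a hedged curl sketch (substitute your real token; assumes RuAPI exposes /v1/chat/completions):

```
curl -s https://www.ruapi.ai/v1/chat/completions \
  -H "Authorization: Bearer sk-your-key" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-5", "messages": [{"role": "user", "content": "ping"}]}'
```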

Advanced: channels

OpenClaw’s main value is “use Claude via your messaging app.” With RuAPI configured, wire up Telegram / Discord / Signal per the OpenClaw channels docs — model calls route through RuAPI automatically.