OpenClaw is a personal AI assistant you run on your own devices, accessible via Telegram, WhatsApp, Discord, Signal, iMessage, and more. It supports many model providers; RuAPI plugs in via either the Anthropic-compatible or the OpenAI-compatible protocol.
Documentation Index
Fetch the complete documentation index at: https://docs.ruapi.ai/llms.txt
Use this file to discover all available pages before exploring further.
Prerequisites
- RuAPI account + API key: Console → Tokens → Create token. See API keys & security.
- Node.js ≥ 18
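To confirm the Node.js prerequisite, check the installed version:

```shell
# Prints the installed version, e.g. v22.11.0.
# The major version must be 18 or higher.
node --version
```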
Install OpenClaw
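The install command itself did not survive in this page. Assuming OpenClaw is distributed as an npm package named `openclaw` (an assumption; check the OpenClaw docs for the actual package name and install method), a global install would look like:

```shell
# Requires Node.js >= 18 (see Prerequisites).
# Package name "openclaw" is assumed, not confirmed by this page.
npm install -g openclaw
```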
Configure RuAPI
OpenClaw uses the Anthropic SDK under the hood, so the simplest setup is the ANTHROPIC_BASE_URL + ANTHROPIC_API_KEY env vars plus a model selection in ~/.openclaw/openclaw.json.
Method 1: env + config file (recommended)
Run onboarding to create the config directory ~/.openclaw/ and the initial config.
Method 2: shell env vars
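For Method 1, onboarding is the `openclaw onboard` command referenced in the Troubleshooting section below:

```shell
# Interactive: creates ~/.openclaw/ and writes the initial openclaw.json.
openclaw onboard
```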
If you’d rather not edit the config file, set the vars in your shell:
- macOS / Linux
- WSL2
Add these to ~/.zshrc or ~/.bashrc, then source the file or open a new terminal. You still need agents.defaults.model in openclaw.json, otherwise OpenClaw will prompt you to pick a model on every run.
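The export lines themselves were lost from this page; a minimal sketch, assuming you replace the placeholders with the base URL and token from your RuAPI console:

```shell
# Base URL WITHOUT a trailing /v1 (see Troubleshooting below);
# the Anthropic client appends the API path itself.
export ANTHROPIC_BASE_URL="https://<your-ruapi-endpoint>"
# Your RuAPI token from Console → Tokens.
export ANTHROPIC_API_KEY="<your-ruapi-token>"
```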
First call
Send one message; it should go out as a claude-sonnet-4-6 (or whatever you set) call.
Once you’ve wired up Telegram / Discord / etc., messages on those channels will route through RuAPI too.
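To verify the endpoint independently of OpenClaw, you can hit the Anthropic-compatible protocol directly with curl. This is a sketch: the base-URL placeholder and the unprefixed model ID are assumptions based on the standard Anthropic wire format, not taken from this page.

```shell
# /v1/messages is the standard Anthropic Messages path; note that the
# env var itself must NOT include /v1 (the client appends it).
curl -s "$ANTHROPIC_BASE_URL/v1/messages" \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{"model": "claude-sonnet-4-6", "max_tokens": 64,
       "messages": [{"role": "user", "content": "Hello"}]}'
```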
Choosing a different model
Edit ~/.openclaw/openclaw.json. Model IDs use <provider>/<model-id> naming. Common options:
| Model ID | Use |
|---|---|
| `anthropic/claude-sonnet-4-6` | Workhorse default |
| `anthropic/claude-opus-4-7` | Heavier tasks |
| `anthropic/claude-haiku-4-5` | Fast and cheap |
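Concretely, picking the workhorse default might look like this in ~/.openclaw/openclaw.json. A sketch only: agents.defaults.model is the key named earlier on this page, but leave any other keys in your file as they are:

```json
{
  "agents": {
    "defaults": {
      "model": "anthropic/claude-sonnet-4-6"
    }
  }
}
```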
Troubleshooting
401 Unauthorized
- Check that ANTHROPIC_API_KEY is set to a valid RuAPI token (Console → Tokens → Create token).
404 / model not found
- The model ID must include the anthropic/ prefix, e.g. anthropic/claude-sonnet-4-6.
- Don’t add /v1 to ANTHROPIC_BASE_URL.
`openclaw onboard` hangs or spins
- Check connectivity: curl -I https://www.ruapi.ai.
- Cancel onboarding (Ctrl+C) and hand-edit ~/.openclaw/openclaw.json; the next start will use that file.
Configure multiple providers for failover
OpenClaw’s models.providers config supports fallback chains. Get RuAPI working first, then layer fallbacks per the OpenClaw configuration docs.
Use the OpenAI-compatible protocol instead
Define a custom provider in openclaw.json:
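A sketch of what such a provider entry could look like. Only the models.providers key comes from this page; every field inside it (api, baseUrl, apiKey, models) is an illustrative assumption, so check the OpenClaw configuration docs for the actual schema:

```json
{
  "models": {
    "providers": {
      "ruapi-openai": {
        "api": "openai-completions",
        "baseUrl": "https://<your-ruapi-endpoint>/v1",
        "apiKey": "<your-ruapi-token>",
        "models": ["<model-id>"]
      }
    }
  }
}
```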