Documentation Index

Fetch the complete documentation index at: https://docs.ruapi.ai/llms.txt

Use this file to discover all available pages before exploring further.

RuAPI supports two API protocols simultaneously: classic OpenAI-compatible and native Anthropic Claude-compatible. The same API key works with both.

OpenAI protocol

POST /v1/chat/completions — industry standard, supported by virtually all SDKs and frameworks.

Claude protocol

POST /v1/messages — native Anthropic format with thinking blocks and MCP support.

Base URL

https://www.ruapi.ai/v1
All endpoints live under /v1. Auth via the Authorization: Bearer sk-... header.

OpenAI protocol

Use this protocol if your code already uses the official OpenAI SDK or a compatible framework (LangChain, LlamaIndex, Vercel AI SDK, etc.).

Endpoint

POST https://www.ruapi.ai/v1/chat/completions

Compatibility

The request/response format matches OpenAI’s Chat Completions API. Supported:
  • messages (with roles system / user / assistant / tool)
  • model — name of any model in the catalog (Claude/Gemini/Grok included — we convert on the fly)
  • stream: true for streaming responses (Server-Sent Events)
  • tools / tool_choice for function calling
  • temperature, top_p, max_tokens and other parameters

Example: Python (OpenAI SDK)

from openai import OpenAI

client = OpenAI(
    api_key="sk-YOUR_KEY",
    base_url="https://www.ruapi.ai/v1",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "You are a friendly assistant."},
        {"role": "user", "content": "Hello! What can you do?"},
    ],
    temperature=0.7,
)

print(response.choices[0].message.content)
print(f"Tokens used: {response.usage.total_tokens}")

Example: Node.js (OpenAI SDK)

import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "sk-YOUR_KEY",
  baseURL: "https://www.ruapi.ai/v1",
});

const response = await client.chat.completions.create({
  model: "claude-3-5-sonnet-20241022", // call Claude through OpenAI protocol
  messages: [
    { role: "user", content: "Explain quantum entanglement simply." },
  ],
});

console.log(response.choices[0].message.content);

Example: curl + streaming

curl https://www.ruapi.ai/v1/chat/completions \
  -H "Authorization: Bearer sk-YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Write a haiku about code."}],
    "stream": true
  }'
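With stream: true, the response arrives as Server-Sent Events: each event line starts with data: and carries a JSON chunk whose choices[0].delta.content holds the next text fragment, terminated by a data: [DONE] sentinel (this is OpenAI's standard streaming format, which the endpoint mirrors). A minimal, stdlib-only sketch of pulling the text out of such a stream — here fed simulated event lines rather than a live connection:

```python
import json

def extract_text(sse_lines):
    """Join the content deltas from raw SSE lines of a streamed chat completion."""
    parts = []
    for line in sse_lines:
        line = line.strip()
        if not line.startswith("data: "):
            continue  # skip blank keep-alive lines between events
        payload = line[len("data: "):]
        if payload == "[DONE]":  # end-of-stream sentinel
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"].get("content")
        if delta:  # the first chunk usually carries only the role, no text
            parts.append(delta)
    return "".join(parts)

# Simulated events, shaped like the output of the curl example above:
events = [
    'data: {"choices": [{"delta": {"role": "assistant"}}]}',
    'data: {"choices": [{"delta": {"content": "Hello"}}]}',
    'data: {"choices": [{"delta": {"content": ", world"}}]}',
    'data: [DONE]',
]
print(extract_text(events))  # Hello, world
```

When using the OpenAI SDK instead of raw curl, pass stream=True and iterate the returned object; the SDK does this SSE parsing for you and yields the same chunk structure.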

Claude protocol

Use this protocol if your code is built on the Anthropic SDK or you need Claude-specific features (thinking blocks, native tool calls).

Endpoint

POST https://www.ruapi.ai/v1/messages

Compatibility

The request/response format matches Anthropic’s Messages API. Supported:
  • messages array (native Claude format)
  • system as a separate field
  • model — a Claude model name, or any other model in the catalog (we convert on the fly)
  • max_tokens (required for Claude)
  • stream: true
  • tools / tool_choice
  • thinking for reasoning models

Example: Python (Anthropic SDK)

from anthropic import Anthropic

client = Anthropic(
    api_key="sk-YOUR_KEY",
    base_url="https://www.ruapi.ai",
)

response = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    system="You are a technical expert.",
    messages=[
        {"role": "user", "content": "What is the CAP theorem?"},
    ],
)

print(response.content[0].text)
print(f"Used: input={response.usage.input_tokens}, output={response.usage.output_tokens}")

Example: Node.js (Anthropic SDK)

import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic({
  apiKey: "sk-YOUR_KEY",
  baseURL: "https://www.ruapi.ai",
});

const response = await client.messages.create({
  model: "claude-3-5-sonnet-20241022",
  max_tokens: 1024,
  messages: [
    { role: "user", content: "Explain Rust's borrow checker." },
  ],
});

console.log(response.content[0].text);

Example: curl

curl https://www.ruapi.ai/v1/messages \
  -H "x-api-key: sk-YOUR_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Write a haiku about code."}]
  }'
The Anthropic SDK sends the x-api-key header instead of Authorization: Bearer. RuAPI accepts both on the /v1/messages endpoint.
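In Anthropic's extended-thinking API, you enable reasoning by passing thinking={"type": "enabled", "budget_tokens": N} in the request (with N below max_tokens), and the response content then contains thinking blocks ahead of the final text blocks. Assuming RuAPI passes this format through unchanged, here is a sketch of separating the two kinds of blocks, using plain dicts shaped like a /v1/messages response:

```python
def split_blocks(content):
    """Separate thinking blocks from the final answer text in a Messages response."""
    thinking = [b["thinking"] for b in content if b["type"] == "thinking"]
    text = "".join(b["text"] for b in content if b["type"] == "text")
    return thinking, text

# Content shaped like a /v1/messages response with thinking enabled:
content = [
    {"type": "thinking", "thinking": "The user wants a short, factual answer..."},
    {"type": "text", "text": "42"},
]
thoughts, text = split_blocks(content)
print(text)  # 42
```

With the Anthropic SDK, the same blocks arrive as typed objects (block.type, block.thinking, block.text) rather than dicts, but the structure is identical.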

Which protocol should I pick?

Scenario | Use
Already on OpenAI SDK / LangChain / LlamaIndex | OpenAI protocol
Writing native Anthropic code, need thinking blocks | Claude protocol
Want to call Claude from existing OpenAI code | OpenAI protocol (we convert)
Want to call GPT-4o from the Anthropic SDK | Claude protocol (we convert)
In every case — same API key, same per-token price, shared balance.

All endpoints

Method | Path | Purpose
POST | /v1/chat/completions | OpenAI Chat Completions
POST | /v1/messages | Claude Messages
POST | /v1/embeddings | Embeddings (OpenAI-compatible)
POST | /v1/images/generations | Image generation (OpenAI-compatible)
POST | /v1/audio/transcriptions | Whisper transcription
POST | /v1/audio/speech | TTS — text to speech
GET | /v1/models | List of available models
The full list with current models is in your dashboard.
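To check which models your key can see, query GET /v1/models with the same Bearer header. Assuming the response follows the OpenAI list shape (an object with a data array of {"id": ...} entries — not confirmed by this page), a small sketch of extracting the model names:

```python
def model_ids(listing):
    """Extract model names from an OpenAI-style GET /v1/models response."""
    return sorted(entry["id"] for entry in listing["data"])

# Response shaped like the OpenAI-compatible list format:
listing = {
    "object": "list",
    "data": [
        {"id": "gpt-4o-mini", "object": "model"},
        {"id": "claude-3-5-sonnet-20241022", "object": "model"},
    ],
}
print(model_ids(listing))  # ['claude-3-5-sonnet-20241022', 'gpt-4o-mini']
```

The equivalent raw request is curl https://www.ruapi.ai/v1/models -H "Authorization: Bearer sk-YOUR_KEY".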