

Codex CLI is OpenAI’s official command-line coding assistant, analogous to Claude Code. It speaks the OpenAI-compatible protocol, so OPENAI_BASE_URL + OPENAI_API_KEY are all you need to route it through RuAPI.

Prerequisites

1. RuAPI account + API key
   Console → Tokens → Create token. See API keys & security.
2. Node.js ≥ 18
   node --version
   Install from nodejs.org if needed.

Install Codex CLI

npm install -g @openai/codex
Confirm:
codex --version

Configure RuAPI

Codex supports two ways to configure the endpoint: env vars or ~/.codex/config.toml. The TOML file is the longer-lived path.

Method 1: config file

Edit ~/.codex/config.toml (or %USERPROFILE%\.codex\config.toml on Windows). Create it if missing:
model_provider = "ruapi"
model = "gpt-5"

[model_providers.ruapi]
name = "RuAPI"
base_url = "https://www.ruapi.ai/v1"
wire_api = "chat"
env_key = "RUAPI_API_KEY"
Then export the key once (or persist it to ~/.zshrc / ~/.bashrc):
export RUAPI_API_KEY="sk-your-key"
wire_api = "chat" means classic /v1/chat/completions — the most reliable choice. You can try "responses" for the newer Responses API, but not every model on RuAPI supports it yet.

Method 2: env vars only

If you’d rather skip the config file:
export OPENAI_BASE_URL="https://www.ruapi.ai/v1"
export OPENAI_API_KEY="sk-your-key"
codex
In the OpenAI-compatible protocol, base_url / OPENAI_BASE_URL must end in /v1. That's the opposite of Anthropic's convention, where the base URL omits the version path.
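Because a missing /v1 suffix is the single most common misconfiguration, a quick shell check (the warning text here is illustrative, not Codex output) can catch it before launching:

```shell
# Sanity-check the endpoint variable before starting Codex:
# the OpenAI-compatible protocol expects the base URL to end in /v1.
case "$OPENAI_BASE_URL" in
  */v1) echo "OPENAI_BASE_URL looks correct: $OPENAI_BASE_URL" ;;
  *)    echo "warning: OPENAI_BASE_URL should end in /v1 (got: $OPENAI_BASE_URL)" ;;
esac
```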

First call

cd ~/some-project
codex
In the interactive prompt:
> In one sentence, introduce yourself.
A normal reply plus a matching entry in RuAPI Console → Logs = success.
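If no reply comes back, you can take Codex out of the loop and hit the endpoint directly with curl. This is a sketch: gpt-5-mini is just an example model, so substitute any model on your token's allowlist.

```shell
# Smoke-test the RuAPI endpoint directly, bypassing Codex.
# Assumes RUAPI_API_KEY is exported; prints a hint if it is not.
if [ -z "$RUAPI_API_KEY" ]; then
  echo "RUAPI_API_KEY is not set"
else
  curl -s https://www.ruapi.ai/v1/chat/completions \
    -H "Authorization: Bearer $RUAPI_API_KEY" \
    -H "Content-Type: application/json" \
    -d '{"model":"gpt-5-mini","messages":[{"role":"user","content":"ping"}]}'
fi
```

A JSON response with a choices array means the key and base URL are fine, and any remaining problem is on the Codex side.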

Switching models

Inside a session:
/model gpt-5
/model gpt-5-mini
/model o4-mini
Or change the default by editing model = "..." at the top of config.toml. See the model gallery for the full list.

Troubleshooting

  • Wrong key: OPENAI_API_KEY (or RUAPI_API_KEY) must be your RuAPI token (sk-...), not OpenAI's sk-proj-....
  • Token restrictions: confirm the token is enabled and the model is in its allowlist via Console → Tokens.
  • Model name typo: model IDs on RuAPI match OpenAI's official names: gpt-5, gpt-5-mini, o4-mini, gpt-4.1, etc.
  • Wrong endpoint: OPENAI_BASE_URL must end in /v1.
  • Protocol errors: set wire_api = "chat" in config.toml. The "responses" protocol is only supported by some newer models.
  • Tool-call failures: RuAPI passes tool / function calls through unchanged. If Codex still errors, try the same prompt against OpenAI's official endpoint first to rule out a prompt issue.
  • Multiple providers: define two providers in config.toml and use /provider <name> to switch inside a session, or install CC Switch.
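A two-provider config.toml can be sketched as follows; the second provider block (name, base_url, env_key) is an illustrative example pointing at OpenAI's official endpoint, so adjust it to whatever you actually want to switch between:

```toml
model_provider = "ruapi"   # default provider at startup
model = "gpt-5"

[model_providers.ruapi]
name = "RuAPI"
base_url = "https://www.ruapi.ai/v1"
wire_api = "chat"
env_key = "RUAPI_API_KEY"

# Example second provider: OpenAI's official endpoint.
[model_providers.openai]
name = "OpenAI"
base_url = "https://api.openai.com/v1"
wire_api = "chat"
env_key = "OPENAI_API_KEY"
```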

Advanced: running alongside Claude Code

A common setup is Codex (for OpenAI-strong workloads) plus Claude Code (for long-context / code reasoning). They don't interfere: configure each one separately, Claude Code per its own guide and Codex per this page. The same RuAPI key works for both.