OpenClaw is an open-source AI agent framework for building autonomous AI assistants. Connect it to IoTeX AI Gateway to access all supported models through a unified interface.

Switching to IoTeX AI Gateway?

If you already have OpenClaw running with a different model provider, just ask the agent:

Add gateway.iotex.ai as an OpenAI-compatible LLM provider and set gemini-2.5-flash-lite as my default model. Docs: https://docs.iotex.ai/. API key is sk-xxxxxxxxxx

The agent will pick up the new provider configuration and take it from there.

New to OpenClaw?

1. Install OpenClaw

npm install -g openclaw
2. Configure IoTeX AI Gateway

openclaw config edit
Add the following to your models section:
{
  "models": {
    "providers": [
      {
        "id": "iotex",
        "name": "IoTeX AI Gateway",
        "baseURL": "https://gateway.iotex.ai/v1",
        "apiKeyEnv": "IOTEX_API_KEY",
        "type": "openai-compatible"
      }
    ],
    "aliases": {
      "gemini-flash": "iotex/gemini-2.5-flash",
      "deepseek": "iotex/deepseek-chat",
      "qwen": "iotex/qwen-2.5-14b-instruct"
    }
  }
}
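The `aliases` map lets you type `deepseek` on the command line instead of the full `iotex/deepseek-chat` ID. Conceptually, resolution is a simple lookup with pass-through for unknown names (a sketch of the behavior, not OpenClaw's actual resolver):

```javascript
// Mirror of the aliases block above; names not in the map fall through
// unchanged as full provider/model IDs (illustrative, not OpenClaw source).
const aliases = {
  'gemini-flash': 'iotex/gemini-2.5-flash',
  'deepseek': 'iotex/deepseek-chat',
  'qwen': 'iotex/qwen-2.5-14b-instruct'
};

function resolveModel(name) {
  return aliases[name] ?? name;
}

console.log(resolveModel('deepseek'));               // iotex/deepseek-chat
console.log(resolveModel('iotex/gemini-2.5-flash')); // iotex/gemini-2.5-flash
```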
3. Set Your API Key

Get your API key from the Gateway Console:
export IOTEX_API_KEY="sk-xxxxxxxxxx"
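If the variable is unset when OpenClaw starts, gateway requests will fail with auth errors, so a quick guard in your launch script can catch that early (a bash sketch, not an OpenClaw feature):

```shell
# Warn early if the gateway key is missing (bash sketch).
if [ -n "$IOTEX_API_KEY" ]; then
  echo "IOTEX_API_KEY is set"
else
  echo "IOTEX_API_KEY is missing; export it before running openclaw" >&2
fi
```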
4. Start Chatting

# Use a specific IoTeX model
openclaw chat --model iotex/gemini-2.5-flash "Explain React hooks"

# Use an alias
openclaw chat --model deepseek "Write a quicksort in Python"

# Interactive session with model switching
openclaw chat -i

Advanced Configuration

Route Tasks to Different Models

// In your OpenClaw agent configuration
{
  "thinking": {
    "default": "iotex/deepseek-reasoner",
    "fast": "iotex/gemini-2.5-flash"
  },
  "vision": "iotex/gemini-2.0-flash-exp",
  "coding": "iotex/qwen-coder-turbo"
}
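How a task type maps to a model can be sketched as a small lookup over that routing table (`resolveTaskModel` is a hypothetical helper for illustration, not part of the OpenClaw API):

```javascript
// Routing table copied from the agent configuration above.
const routing = {
  thinking: { default: 'iotex/deepseek-reasoner', fast: 'iotex/gemini-2.5-flash' },
  vision: 'iotex/gemini-2.0-flash-exp',
  coding: 'iotex/qwen-coder-turbo'
};

// Resolve a task (and optional variant) to a model ID.
function resolveTaskModel(task, variant = 'default') {
  const entry = routing[task];
  if (typeof entry === 'string') return entry; // single-model task
  return entry[variant] ?? entry.default;      // variant with fallback to default
}

console.log(resolveTaskModel('thinking', 'fast')); // iotex/gemini-2.5-flash
console.log(resolveTaskModel('vision'));           // iotex/gemini-2.0-flash-exp
```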

Programmatic Usage (Node.js)

import { OpenClaw } from 'openclaw';

const agent = new OpenClaw({
  model: 'iotex/gemini-2.5-flash',
  apiKey: process.env.IOTEX_API_KEY,
  baseURL: 'https://gateway.iotex.ai/v1'
});

const response = await agent.chat('Explain neural networks');
console.log(response);

// Streaming
await agent.chat('Write a story about AI', {
  stream: true,
  onChunk: (chunk) => process.stdout.write(chunk)
});
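Because the provider is registered as `openai-compatible`, you can also hit the gateway over plain HTTP. The helper below only builds the request; the endpoint path and body fields follow the standard OpenAI chat-completions convention, which the gateway is assumed to mirror:

```javascript
// Build an OpenAI-compatible chat request against the gateway.
// (buildChatRequest is an illustrative helper, not an OpenClaw API.)
function buildChatRequest(model, prompt) {
  return {
    url: 'https://gateway.iotex.ai/v1/chat/completions',
    options: {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${process.env.IOTEX_API_KEY}`,
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({
        model,
        messages: [{ role: 'user', content: prompt }]
      })
    }
  };
}

// Usage:
// const { url, options } = buildChatRequest('gemini-2.5-flash', 'Hello');
// const res = await fetch(url, options);
```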

Model Selection Guide

| Task | Recommended Model | Alias |
| --- | --- | --- |
| Quick Q&A | gemini-2.5-flash | gemini-flash |
| Complex reasoning | deepseek-reasoner | deepseek |
| Chinese conversations | qwen-2.5-14b-instruct | qwen |
| Code generation | qwen-coder-turbo | - |
| Vision tasks | gemini-2.0-flash-exp | - |

Troubleshooting

Make sure your API key is set correctly:
echo $IOTEX_API_KEY

# If empty, set it
export IOTEX_API_KEY="sk-xxxxxxxxxx"
Verify the model name matches the supported models list:
openclaw models list
openclaw chat --model iotex/gemini-2.5-flash "Hello"
If you hit rate limits, OpenClaw automatically retries with exponential backoff. For high-volume usage, route simple tasks to faster models or upgrade your IoTeX plan.
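The retry behavior amounts to exponential backoff: wait roughly baseMs · 2^attempt between tries, with a little jitter. A minimal sketch of that pattern (`withRetries` is illustrative, not OpenClaw's internal code):

```javascript
// Retry an async call with exponential backoff plus jitter.
async function withRetries(fn, { attempts = 3, baseMs = 500 } = {}) {
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      if (i === attempts - 1) throw err; // out of attempts: give up
      const delayMs = baseMs * 2 ** i + Math.random() * 100;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}

// Usage: wrap any gateway call, e.g.
// const reply = await withRetries(() => agent.chat('Hello'));
```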

Resources