2026-02-02 10:43:27 UTC

captjack on Nostr:

openclaw model providers #aiagent with a webui -> can connect to LOCAL web AI APIs or REMOTE pay-per-use models

SAMPLE syntax - openclaw.json

{
  agents: {
    defaults: { model: { primary: "moonshot/kimi-k2.5" } },
  },
  models: {
    mode: "merge",
    providers: {
      moonshot: {
        baseUrl: "https://api.moonshot.ai/v1",
        apiKey: "${MOONSHOT_API_KEY}",
        api: "openai-completions",
        models: [{ id: "kimi-k2.5", name: "Kimi K2.5" }],
      },
    },
  },
}
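
The ${MOONSHOT_API_KEY} reference is resolved from the environment, so the key has to be exported before starting openclaw. A minimal sketch, using the variable name from the sample above (your key value and shell setup will differ):

# export the key referenced by ${MOONSHOT_API_KEY} in openclaw.json
export MOONSHOT_API_KEY="sk-..."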


LOCAL
Ollama
Ollama is a local LLM runtime that provides an OpenAI-compatible API:

Provider: ollama
Auth: None required (local server)
Example model: ollama/llama3.3
Installation: https://ollama.ai

# Install Ollama, then pull a model:
ollama pull llama3.3
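
Since openclaw talks to Ollama through its OpenAI-compatible API, a quick sanity check is to list the models the local server is serving. This assumes Ollama's default port 11434:

# verify the local Ollama server is running and the model is available
curl http://localhost:11434/v1/models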

{
  agents: {
    defaults: { model: { primary: "ollama/llama3.3" } },
  },
}
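
If the ollama provider needs to be declared explicitly (for example to point at a non-default host), an entry mirroring the moonshot sample above should work. This is a sketch, not confirmed syntax: the baseUrl and api values are assumptions based on Ollama's default port and OpenAI-compatible endpoint, and no apiKey is set because no auth is required locally:

{
  models: {
    mode: "merge",
    providers: {
      ollama: {
        baseUrl: "http://localhost:11434/v1",
        api: "openai-completions",
        models: [{ id: "llama3.3", name: "Llama 3.3" }],
      },
    },
  },
}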
OPENROUTER