
Coala

How to use the coala-client CLI for chat with LLMs, MCP servers, and skills.

Rating
4.3 (420 reviews)
Downloads
13,645 downloads
Version
1.0.0


Coala Client

Part of the coala ecosystem: a CLI for chatting with OpenAI-compatible LLMs (OpenAI, Gemini, Ollama) and MCP (Model Context Protocol) servers. Supports importing CWL toolsets as MCP servers and importing skills.

Config paths

  • MCP config and toolsets: ~/.config/coala/mcps/
  • mcp_servers.json — server definitions
  • <toolset>/ — per-toolset dirs with run_mcp.py and CWL files
  • Skills: ~/.config/coala/skills/ (one subfolder per imported source)
  • Env: ~/.config/coala/env (optional; key=value for providers and MCP env)
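
A sketch of the optional env file; all values below are placeholders, and the exact set of variables coala reads (beyond those named in this doc) is an assumption:

```shell
# ~/.config/coala/env — plain key=value lines (values are placeholders)
OPENAI_API_KEY=sk-...
GEMINI_API_KEY=...
OLLAMA_BASE_URL=http://localhost:11434
PROVIDER=ollama
```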

Quick start

  • Init (first time)
coala init — creates ~/.config/coala/mcps/mcp_servers.json and env.
  • Set API key
e.g. export OPENAI_API_KEY=... or export GEMINI_API_KEY=...; Ollama needs no key.
  • Chat
coala or coala chat — interactive chat with MCP tools. coala ask "question" — single prompt with MCP.
  • Options
-p, --provider (openai|gemini|ollama|custom), -m, --model, --no-mcp.
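
The options above imply a resolution order: the -p/--provider flag wins, then the PROVIDER env var. A minimal sketch of that precedence; `resolve_provider` is a hypothetical helper, not part of coala, and the "openai" fallback default is an assumption:

```python
import argparse

def resolve_provider(argv, env):
    """Hypothetical sketch of option precedence: -p/--provider, then PROVIDER env."""
    parser = argparse.ArgumentParser()
    parser.add_argument("-p", "--provider",
                        choices=["openai", "gemini", "ollama", "custom"])
    parser.add_argument("-m", "--model")
    parser.add_argument("--no-mcp", action="store_true")
    args, _ = parser.parse_known_args(argv)
    # Assumed fallback chain; the "openai" default is a guess, not from this doc.
    return args.provider or env.get("PROVIDER") or "openai"

print(resolve_provider(["-p", "ollama"], {}))        # ollama
print(resolve_provider([], {"PROVIDER": "gemini"}))  # gemini
```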

MCP: CWL toolsets

No API key needed for MCP import, list, or call — only for chat/ask with an LLM.

  • Import (creates a toolset under ~/.config/coala/mcps/<toolset>/ and registers the server):
coala mcp-import <sources...> (alias: coala mcp ...). Sources: local .cwl files, a .zip, or http(s) URLs to a .cwl or .zip. Requires the coala package wherever the MCP server runs (for run_mcp.py).
  • List
coala mcp-list — list server names. coala mcp-list <server> — print each tool’s schema (name, description, inputSchema).
  • Call
coala mcp-call <server>.<tool> --args '<json>'. Example: coala mcp-call gene-variant.ncbi_datasets_gene --args '{"data": [{"gene": "TP53", "taxon": "human"}]}'
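
Quoting the JSON payload for --args by hand is error-prone. A sketch of building the call line programmatically; `mcp_call_command` is a hypothetical helper, not part of coala:

```python
import json
import shlex

def mcp_call_command(server, tool, payload):
    """Hypothetical helper: build a shell-safe `coala mcp-call` line."""
    args_json = json.dumps(payload)  # dict -> JSON string for --args
    return f"coala mcp-call {server}.{tool} --args {shlex.quote(args_json)}"

cmd = mcp_call_command("gene-variant", "ncbi_datasets_gene",
                       {"data": [{"gene": "TP53", "taxon": "human"}]})
print(cmd)
```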

Skills

  • Import (into ~/.config/coala/skills/, one subfolder per source):
coala skill <sources...>. Sources: a GitHub tree URL (e.g. https://github.com/owner/repo/tree/main/skills), a zip URL, or a local zip/dir.
  • In chat
/skill — list installed skills. /skill <name> — load the skill from ~/.config/coala/skills/<name>/ (e.g. its SKILL.md) into context.
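
Loading a skill amounts to reading its SKILL.md into the chat context. A minimal sketch, assuming the ~/.config/coala/skills/<name>/SKILL.md layout described above; `load_skill` is a hypothetical helper, demonstrated against a throwaway directory rather than the real config:

```python
import pathlib
import tempfile

def load_skill(skills_dir, name):
    """Read skills/<name>/SKILL.md so it can be appended to the chat context."""
    path = pathlib.Path(skills_dir) / name / "SKILL.md"
    return path.read_text(encoding="utf-8")

# Demo against a throwaway directory, not ~/.config/coala/skills/.
root = pathlib.Path(tempfile.mkdtemp())
(root / "demo").mkdir()
(root / "demo" / "SKILL.md").write_text("# Demo skill\nInstructions here.\n")
text = load_skill(root, "demo")
print(text)
```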

Chat commands

  • /help, /exit, /quit, /clear
  • /tools — list MCP tools
  • /servers — list connected MCP servers
  • /skill — list skills; /skill <name> — load a skill
  • /model — show model info
  • /switch <provider> — switch provider

MCP on/off

  • All off: coala --no-mcp (or coala ask "..." --no-mcp).
  • One server off: remove its entry from ~/.config/coala/mcps/mcp_servers.json.
  • On: default when --no-mcp is not used; add or restore servers in mcp_servers.json.
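
Removing a server entry can be scripted. A sketch, assuming a top-level "mcpServers" object in mcp_servers.json (a common MCP convention; this doc does not specify the file's schema). `disable_server` is a hypothetical helper, run here against a throwaway copy, never the real config:

```python
import json
import pathlib
import tempfile

def disable_server(config_path, name):
    """Drop one server entry from mcp_servers.json (assumed "mcpServers" key)."""
    path = pathlib.Path(config_path)
    cfg = json.loads(path.read_text())
    cfg.get("mcpServers", {}).pop(name, None)  # no-op if the name is absent
    path.write_text(json.dumps(cfg, indent=2))

# Demo on a throwaway copy, never the real config.
tmp = pathlib.Path(tempfile.mkdtemp()) / "mcp_servers.json"
tmp.write_text(json.dumps({"mcpServers": {"gene-variant": {"command": "python"}}}))
disable_server(tmp, "gene-variant")
remaining = json.loads(tmp.read_text())["mcpServers"]
print(remaining)
```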

Providers and env

Set provider via -p or env PROVIDER. Set keys and URLs per provider (e.g. OPENAI_API_KEY, GEMINI_API_KEY, OLLAMA_BASE_URL). Optional: put vars in ~/.config/coala/env. coala config — print current config paths and provider/model info.

Installation


openclaw install coala
    

Tags

#devops-and-cloud #cli

Quick Info

Category Development
Model Claude 3.5
Complexity One-Click
Author hubentu
Last Updated 3/10/2026
