Ragie RAG
Execute Retrieval-Augmented Generation (RAG) using Ragie.ai.
Version: 1.0.0
Ragie.ai RAG Skill (OpenClaw Optimized)
This skill enables grounded question answering using Ragie.ai as a RAG backend.
Ragie handles:
- Document chunking
- Embedding
- Vector indexing
- Retrieval
- Optional reranking

The agent handles:
- Deciding when to ingest
- Triggering retrieval
- Constructing grounded prompts
- Producing final answers
Core Principles
- Never answer without retrieval.
- Never hallucinate information not present in retrieved chunks.
- Always cite the `document_name` when referencing specific facts.
- If retrieval returns zero relevant chunks, explicitly say: "I don't have that information in the current knowledge base."
- Do not expose API keys or raw API payloads in final answers.
Deterministic Workflow
Case A — User Provides a File or URL
IF the user provides a file, a document path, or a PDF/URL to ingest:

1. Execute ingestion:

```bash
python skills/scripts/ingest.py --file <path> --name "<document_name>"
```

or, for a URL:

```bash
python skills/scripts/ingest.py --url "<url>" --name "<document_name>"
```

2. Capture the returned `document_id`.
3. Poll document status until it reports `ready`:

```bash
python skills/scripts/manage.py status --id <document_id>
```

4. Proceed to Retrieval (Case C).
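The ingest-then-poll step above can be sketched as a small Python helper. This is a minimal sketch: the status names other than `ready` are illustrative assumptions, and in real use `get_status` would wrap the `manage.py status` command and parse its output.

```python
import time

def wait_until_ready(get_status, poll_interval=2.0, timeout=120.0):
    """Poll a status callable until the document reports 'ready'.

    `get_status` is a zero-argument callable returning the document's
    status string (e.g. a wrapper around `manage.py status --id <id>`).
    The 'pending'/'processing'/'failed' names are assumptions for the
    sketch, not confirmed Ragie status values.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if status == "ready":
            return status
        if status == "failed":
            raise RuntimeError("ingestion failed")
        time.sleep(poll_interval)
    raise TimeoutError("document did not become ready within timeout")

# Stub that becomes ready on the third poll:
statuses = iter(["pending", "processing", "ready"])
print(wait_until_ready(lambda: next(statuses), poll_interval=0.0))  # ready
```

Only after this helper returns should the skill proceed to retrieval (Case C).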
Case B — User Requests Document Management
List documents:

```bash
python skills/scripts/manage.py list
```

Check document status:

```bash
python skills/scripts/manage.py status --id <document_id>
```

Delete a document:

```bash
python skills/scripts/manage.py delete --id <document_id>
```
Return structured results to the user.
Case C — Retrieval (Grounded Question Answering)
Execute:

```bash
python skills/scripts/retrieve.py \
  --query "<user_question>" \
  --top-k 6 \
  --rerank
```

Optional flags:
- `--partition <name>`
- `--filter '{"key":"value"}'`
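The flags above map onto the body of a `POST /retrievals` request. The sketch below assembles that payload; field names other than `query` are assumptions inferred from the CLI flags, so check the Ragie API reference for the exact schema.

```python
import json

def build_retrieval_payload(query, top_k=6, rerank=True,
                            partition=None, metadata_filter=None):
    """Assemble a JSON body for POST /retrievals.

    `top_k`, `rerank`, `partition`, and `filter` key names are
    assumptions based on the CLI flags, not a confirmed schema.
    """
    payload = {"query": query, "top_k": top_k, "rerank": rerank}
    if partition is not None:
        payload["partition"] = partition
    if metadata_filter is not None:
        payload["filter"] = metadata_filter
    return payload

payload = build_retrieval_payload(
    "What is the refund policy?", metadata_filter={"key": "value"})
print(json.dumps(payload))
```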
Retrieval Output Format
Expected output:

```json
[
  {
    "text": "...",
    "score": 0.87,
    "document_name": "Policy Handbook",
    "document_id": "doc_abc123"
  }
]
```
Grounded Prompt Construction
After retrieval:
1. Extract all chunk `text`.
2. Concatenate with separators.
3. Construct this prompt:
```text
SYSTEM:
You are a helpful assistant.
Answer using ONLY the context provided below.
If the context does not contain the answer, say:
"I don't have that information in the current knowledge base."

CONTEXT:
[chunk 1 text]
---
[chunk 2 text]
---
...

USER QUESTION:
{original user question}
```
4. Generate final answer.
5. Cite `document_name` when referencing information.
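The prompt-assembly steps above can be sketched as a short function. This is a sketch of the template as written in this skill, with one added convention: returning `None` when there are no chunks, so the caller answers with the fallback line instead of calling the model.

```python
FALLBACK = "I don't have that information in the current knowledge base."

SYSTEM_PREAMBLE = (
    "SYSTEM:\n"
    "You are a helpful assistant.\n"
    "Answer using ONLY the context provided below.\n"
    "If the context does not contain the answer, say:\n"
    f'"{FALLBACK}"'
)

def build_grounded_prompt(chunks, question):
    """Join chunk texts with --- separators under the system preamble.

    Returns None when there are no chunks; the caller should then reply
    with FALLBACK directly rather than generating an answer.
    """
    if not chunks:
        return None
    context = "\n---\n".join(c["text"] for c in chunks)
    return f"{SYSTEM_PREAMBLE}\n\nCONTEXT:\n{context}\n\nUSER QUESTION:\n{question}"
```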
Output Contract
The final response MUST:
- Be grounded only in retrieved chunks
- Cite `document_name` for factual claims
- Avoid hallucinations
- Avoid mentioning internal execution steps
- Avoid exposing API keys or raw responses
- Clearly state when information is missing

If no chunks are returned:

```text
I don't have that information in the current knowledge base.
```
API Reference
Base URL:

```text
https://api.ragie.ai
```
| Operation | Method | Endpoint |
|---|---|---|
| Ingest file | POST | /documents |
| Ingest URL | POST | /documents/url |
| Retrieve chunks | POST | /retrievals |
| List documents | GET | /documents |
| Get document | GET | /documents/{id} |
| Delete document | DELETE | /documents/{id} |
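The table above can be captured as a small operation-to-endpoint map, which keeps HTTP calls consistent across the skill's scripts. A sketch; the operation names on the left are labels invented for this example:

```python
BASE_URL = "https://api.ragie.ai"

# (method, path) per operation, mirroring the API reference table.
ENDPOINTS = {
    "ingest_file": ("POST", "/documents"),
    "ingest_url": ("POST", "/documents/url"),
    "retrieve": ("POST", "/retrievals"),
    "list_documents": ("GET", "/documents"),
    "get_document": ("GET", "/documents/{id}"),
    "delete_document": ("DELETE", "/documents/{id}"),
}

def endpoint(operation, **path_params):
    """Resolve an operation name to (method, full URL)."""
    method, path = ENDPOINTS[operation]
    return method, BASE_URL + path.format(**path_params)
```

For example, `endpoint("get_document", id="doc_abc123")` yields the method and URL for a single-document lookup.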
Error Handling
| HTTP Code | Meaning | Action |
|---|---|---|
| 404 | Document not found | Verify document_id |
| 422 | Invalid payload | Validate request schema |
| 429 | Rate limited | Retry with backoff |
| 5xx | Server error | Retry or check Ragie status |
If ingestion fails:
- Report the failure clearly.
- Do not proceed to retrieval.

If retrieval fails:
- Retry once.
- If still failing, inform the user.
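The "retry with backoff" action for 429 and 5xx responses can be sketched as a wrapper around any request function. This is a minimal sketch under the assumptions that the callable returns a bare status code and that four attempts with exponential delays are acceptable; 404 and 422 are returned immediately for the caller to handle per the table above.

```python
import time

RETRYABLE = {429, 500, 502, 503, 504}

def call_with_backoff(request, max_attempts=4, base_delay=1.0, sleep=time.sleep):
    """Run `request` (a callable returning an HTTP status code),
    retrying 429/5xx with exponential backoff.

    Non-retryable codes (e.g. 404, 422) are returned on the first try.
    `sleep` is injectable so tests can record delays instead of waiting.
    """
    status = None
    for attempt in range(max_attempts):
        status = request()
        if status not in RETRYABLE:
            return status
        if attempt < max_attempts - 1:
            sleep(base_delay * (2 ** attempt))
    return status
```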
Decision Rules Summary
- If user uploads content → ingest → wait until ready → retrieve.
- If user asks question → retrieve immediately.
- If zero chunks → state knowledge gap.
- Always use reranking unless explicitly disabled.
- Never answer without retrieval.
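The decision rules above reduce to a small dispatch function. A sketch; the action strings are labels invented for this example, not part of the skill's scripts:

```python
def decide(user_gave_content, retrieved_chunks=None):
    """Map the decision rules to the next action.

    `retrieved_chunks` is None before retrieval has run and a list
    (possibly empty) afterwards.
    """
    if user_gave_content:
        return "ingest_then_retrieve"   # ingest, wait until ready, retrieve
    if retrieved_chunks is None:
        return "retrieve"               # question with no prior retrieval
    if not retrieved_chunks:
        return "state_knowledge_gap"    # zero chunks returned
    return "answer_grounded"            # answer only from the chunks
```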
Advanced Usage
- Use metadata `filter` to narrow retrieval scope.
- Use partitions to separate tenant data.
- Use `recency_bias` only when time relevance matters.
- Adjust `top_k` depending on query complexity.
Security
- API keys must be loaded from environment variables.
- `.env` must not be committed.
- Do not log sensitive headers.
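The environment-variable rule can be enforced with a small loader that fails fast when the key is absent. A sketch; the variable name `RAGIE_API_KEY` is an assumption, so match it to whatever the skill's scripts actually read.

```python
import os

def load_api_key(var="RAGIE_API_KEY"):
    """Read the API key from the environment; never hard-code or log it.

    `RAGIE_API_KEY` is an assumed variable name for this sketch.
    """
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(
            f"{var} is not set; export it or define it in an uncommitted .env"
        )
    return key
```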
Summary
This skill provides:
- Deterministic ingestion
- Deterministic retrieval
- Strict grounded answering
- Complete Ragie lifecycle management
- Safe and hallucination-resistant RAG execution
Installation
```bash
openclaw install ragie-rag
```
Quick Info
- Category: Web Scrapers
- Model: Claude 3.5
- Complexity: One-Click
- Author: hatim-be
- Last Updated: 3/10/2026