Picoclaw Tutorial: A Hyper-Minimalist Agent Framework for Limited Disk Space
By Braincuber Team
Published on April 23, 2026
Picoclaw is a lightweight personal AI assistant written in Go that runs on edge devices such as Raspberry Pi, embedded RISC-V boards, and MIPS-based routers. The full AI agent fits in under 10MB and launches in less than one second on a 600MHz processor. This beginner's guide walks you through what Picoclaw is, how to set it up, and its key capabilities.
What You'll Learn:
- What Picoclaw is and its key features
- How to install Picoclaw via Docker or build from source
- Configure LLM providers (OpenAI, Anthropic, DeepSeek, Ollama)
- Set up Telegram integration for mobile access
- Picoclaw capabilities: heartbeat, persistent memory, offline mode
- Troubleshooting common setup problems
What Is Picoclaw?
Picoclaw is a lightweight personal AI assistant inspired by Nanobot. It is written in Go by the Sipeed team and targets edge devices such as Raspberry Pi boards, embedded RISC-V boards, and MIPS-based routers, which means you can run Picoclaw on a hardware budget of about $10.
Picoclaw Key Features and Capabilities
Single Binary File
Ships as a single binary. No Python virtual environments or pip install required. Extremely fast setup.
Cross-Platform
Runs on Linux, macOS, Windows, and various architectures including ARM and RISC-V.
Provider-Agnostic
Supports OpenAI, Anthropic, DeepSeek, and Ollama for local models. Switch between providers easily.
Messaging Platforms
Supports WhatsApp, Telegram, Discord, QQ, and DingTalk for easy interaction without switching apps.
Built-in Tools
Read and write files, run shell commands, search the web with Brave or DuckDuckGo, and schedule recurring jobs with cron.
Under 10MB
Fits in extremely limited disk space and runs on a $10 hardware budget, including the Raspberry Pi Zero.
| Pros | Cons |
|---|---|
| Single binary, easy setup on any device | Still pre-1.0, may have bugs or breaking changes |
| Works with existing messaging platforms | Crypto scams impersonating Picoclaw have appeared in the wild |
| Supports multiple LLM providers | Network security issues in early development |
| Runs on $10 hardware (Raspberry Pi) | Not recommended for production before v1.0 |
How to Set Up Picoclaw
You can set up Picoclaw in a few steps. The recommended approach is Docker, which is much faster than building from source.
Installation (Docker Method)
```bash
# 1. Clone the repo
git clone https://github.com/sipeed/picoclaw.git
cd picoclaw

# 2. First run: auto-generates docker/data/config.json, then exits
docker compose -f docker/docker-compose.yml --profile gateway up
# The container prints "First-run setup complete." and stops.

# 3. Set your API keys
nano docker/data/config.json

# 4. Start the gateway in detached mode
docker compose -f docker/docker-compose.yml --profile gateway up -d
```
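Before restarting in step 4, it is worth sanity-checking that your edited config is still valid JSON, since a malformed file is a common cause of silent startup failures. One way to do this with no extra tooling is Python's built-in `json.tool` module, which most Linux distributions ship:

```shell
# Validate the config file's JSON syntax. Prints "config OK" on success,
# or reports a parse error (with line and column) on failure.
python3 -m json.tool docker/data/config.json > /dev/null \
  && echo "config OK" \
  || echo "config has a JSON syntax error"
```

If this reports an error, the message from `json.tool` points at the exact line and column of the problem, which is usually a missing comma or an unquoted key.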
Alternative: Build from Source
```bash
git clone https://github.com/sipeed/picoclaw.git
cd picoclaw
make deps

# Build without installing
make build

# Build for Raspberry Pi Zero 2 W
make build-pi-zero

# Build and install
make install
```
Configure Your LLM Provider
After installation, configure the LLM provider of your choice by editing the docker/data/config.json file:
Edit Config File
Open docker/data/config.json and set the default model name, max tokens, temperature, and tool iterations.
Add Model List
Add entries to model_list with model name, provider path, and API key for each provider you want to use.
Configure Web Search Tools
Enable Brave, Tavily, DuckDuckGo, Perplexity, or SearXNG for web search capabilities. Set API keys and max results per query.
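Putting the steps above together, a minimal config might look like the sketch below. The field names (`defaults`, `model_list`, `web_search`, and so on) are illustrative guesses based on this tutorial's descriptions, not a confirmed schema; treat the config.json that Picoclaw auto-generates on first run as the authoritative reference.

```json
{
  "defaults": {
    "model_name": "claude-sonnet-4",
    "max_tokens": 4096,
    "temperature": 0.7,
    "tool_iterations": 10
  },
  "model_list": [
    {
      "model_name": "claude-sonnet-4",
      "provider": "anthropic",
      "api_key": "YOUR_ANTHROPIC_API_KEY"
    }
  ],
  "web_search": {
    "provider": "duckduckgo",
    "max_results": 5
  }
}
```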
Running Your First Agent with Telegram
Create Telegram Bot
Open Telegram, search for @BotFather, send /newbot, and follow the prompts to create a new bot.
Get Your User ID
Search for @userinfobot in Telegram, click start, and copy your user ID to ensure the bot only accepts commands from you.
Update Config File
Update docker/data/config.json to include the bot token and your Telegram user ID.
Start Interacting
Head over to your Telegram bot and start interacting with Picoclaw. Try web search, drafting blogs, or setting reminders.
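Wiring the bot token and user ID into the config might look like the fragment below. The key names here (`channels`, `telegram`, `allowed_user_ids`) are assumptions for illustration; match them against the sections in your generated config.json.

```json
{
  "channels": {
    "telegram": {
      "enabled": true,
      "bot_token": "123456789:YOUR-TOKEN-FROM-BOTFATHER",
      "allowed_user_ids": [987654321]
    }
  }
}
```

Restricting `allowed_user_ids` to your own ID is what keeps strangers from issuing commands to a bot that can run shell commands on your device.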
Picoclaw Common Problems and Troubleshooting
No Response on Telegram
Check that the Telegram bot token is correct, your allowed user ID is set, and the model and API keys are configured. Inspect the logs with `docker compose -f docker/docker-compose.yml logs -f picoclaw-gateway`.
Model Not Set
The error "model not found in model_list" means the default model is not set in the config. Open the JSON config and set model_name in the defaults section.
Incorrect API Key
The API key for your chosen model provider is empty or wrong. Open the JSON config file and verify the key is correctly entered.
Picoclaw's Standout Capabilities
Heartbeat System
A heartbeat markdown file in the working directory is read every 30 minutes, and Picoclaw can spin up subagents to handle the resulting work in parallel.
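As an illustration, a heartbeat file could hold standing instructions like the ones below. The filename and exact format are assumptions, not taken from Picoclaw's documentation; the idea is simply that whatever the file contains gets re-read and acted on every cycle.

```markdown
<!-- heartbeat file: re-read every 30 minutes -->
- Check the sensor log for temperature readings above 80°C and alert me on Telegram.
- Each morning around 08:00, search the web for RISC-V news and send me a short summary.
```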
Persistent Memory
Memory file stores important information across sessions. Saves preferences and enabled skills for persistent context.
Air-Gapped / Offline Mode
Can run without internet access using local models via Ollama. Ideal for sensitive data in regulated industries.
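For example, pointing Picoclaw at a local Ollama instance might look like the fragment below. Ollama serves its API on port 11434 by default and needs no API key; the JSON field names (`provider`, `api_base`) are assumptions to be checked against your generated config.

```json
{
  "model_list": [
    {
      "model_name": "llama3.2",
      "provider": "ollama",
      "api_base": "http://localhost:11434",
      "api_key": ""
    }
  ]
}
```

With this setup, no prompt or response ever leaves the device, which is the property that matters for air-gapped deployments.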
Multi-Agent & Fallbacks
Define each agent to use its own LLM provider. Supports load balancing with round-robin model rotation.
Hardware Integration
Built-in tools for I2C and SPI communication. Interact with sensors, displays, motor controllers, and physical devices.
Secure by Default
Restricted to its workspace directory. Commands like rm are blocked. Guardrails prevent destructive actions.
Picoclaw in Action: Real Use Cases
| Use Case | Description |
|---|---|
| Home Automation | Running home automation and monitoring with a first-generation Raspberry Pi |
| Automated Web Search | Send daily summaries on Telegram from automated web searches |
| Alerting & Monitoring | Sensor data alerts from embedded devices in remote locations |
Production Readiness
As of March 2026, Picoclaw is still pre-version 1.0 and not recommended for production. It has unresolved network security issues. For a more mature alternative, consider OpenClaw.
Future Outlook
The ability to run large language models on edge devices has gained significant traction recently. Picoclaw will be most useful in resource-constrained environments and in applications where data sensitivity is paramount. Where connectivity is unreliable, a tool like Picoclaw may be the only practical option.
Picoclaw could be a real competitor to Nanobot and OpenClaw when it is ready for production. The roadmap includes browser automation, more skills discovery, and MCP support.
Frequently Asked Questions
Who made Picoclaw?
Picoclaw was built by the Sipeed team. Sipeed is known for making affordable hardware, especially RISC-V and ARM-based boards.
Is Picoclaw free to use?
Yes. Picoclaw itself is free and open-source. If you pair it with a local model via Ollama, there are no API costs either.
Is Picoclaw production-ready?
Not yet. As of March 2026, Picoclaw is still pre-version 1.0. It is not recommended for production deployment.
What hardware can Picoclaw run on?
Raspberry Pi, embedded RISC-V boards, MIPS-based routers, and even phones. It runs on a $10 hardware budget.
Which LLM providers does Picoclaw support?
OpenAI, Anthropic, DeepSeek, and Ollama for local models. You can switch between providers or use multiple with load balancing.
Need Help with AI Implementation?
Our experts can help you implement AI agents, edge device deployments, and custom automation solutions for your business.
