- **Benchmarked against:** Anthropic — MCP connector (local)
- **Deployment:** stdio MCP server (Python)
- **Location:** Per-ship — runs locally on SS1, SS2
- **Codebase:** `ub/` directory
The Local UBI (Universal Brain Infrastructure) MCP server is the ship-local powerhouse. It runs as a Python stdio process alongside the AI agent, providing access to the local filesystem, dispatch engine, intelligence operations, and a local mirror of UB data. While Cloud UB handles shared fleet data, Local UBI handles everything that requires local system access.
## Why local?

| Capability | Why it needs local access |
|---|---|
| File operations | Read/write/edit files on the ship's filesystem |
| Code search | Grep through local codebases |
| Shell commands | Execute sandboxed commands on the ship |
| Dispatch engine | Spawn background processes for WO execution |
| MTAAA pipeline | Run the 5-node classification pipeline (requires local Python + models) |
| Intel operations | Call external APIs (Groq, Gemini, DeepSeek) for research |
| Local UB mirror | Fast local search when Cloud UB is unnecessary |
## Architecture

### Components

| Component | Technology | Purpose |
|---|---|---|
| MCP Server | Python stdio | Tool registration and request handling |
| Local SQLite | SQLite + FTS5 | Local UB data (mirror/staging) |
| Local Qdrant | Qdrant vector DB | Local semantic search |
| MTAAA Pipeline | LangGraph (Python) | Content classification pipeline |
| Dispatch Worker | Python subprocess | Background WO execution |
| LLM Clients | Groq/Gemini/DeepSeek/Mistral/Zhipu SDKs | Low-cost model access |
## Knowledge management

| Tool | Description |
|---|---|
| `search_brain` | Hybrid search (local Qdrant + FTS5) |
| `search_by_category` | Browse entries by category |
| `ingest_fragment` | Full MTAAA 5-node pipeline ingestion |
| `get_entry` | Retrieve an entry by ID |
| `get_recent` | List recent entries |
| `get_stats` | UB statistics |
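As an illustration of what a hybrid search like `search_brain` involves, here is a minimal sketch that fuses SQLite FTS5 keyword hits with pre-ranked vector hits via reciprocal rank fusion. The schema, the `vector_hits` input, and the fusion constant are assumptions, not the server's actual implementation:

```python
import sqlite3

def hybrid_search(conn, query, vector_hits, k=60, limit=5):
    """Fuse FTS5 keyword hits with pre-ranked vector hits using
    reciprocal rank fusion. `vector_hits` is an ordered list of
    entry rowids, e.g. from a local Qdrant query (assumed input)."""
    fts_hits = [row[0] for row in conn.execute(
        "SELECT rowid FROM entries WHERE entries MATCH ? ORDER BY rank",
        (query,))]
    scores = {}
    for hits in (fts_hits, vector_hits):
        for rank, eid in enumerate(hits):
            scores[eid] = scores.get(eid, 0.0) + 1.0 / (k + rank + 1)
    return sorted(scores, key=scores.get, reverse=True)[:limit]

# Demo against an in-memory FTS5 table (assumed schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE entries USING fts5(body)")
conn.executemany("INSERT INTO entries(body) VALUES (?)",
                 [("dispatch engine costs",),
                  ("qdrant vector search",),
                  ("fts5 keyword search",)])
results = hybrid_search(conn, "search", vector_hits=[2, 1])
print(results)  # entry 2 ranks first: it matched both retrievers
```

Reciprocal rank fusion needs no score normalization, which is why it is a common choice for combining BM25-style and vector rankings.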
## Work order management

| Tool | Description |
|---|---|
| `create_work_order` | Create a new WO (writes to Cloud UB) |
| `list_work_orders_tool` | Query WOs with filters |
| `get_work_order_detail` | Full WO details + transition history |
| `accept_work_order` | Accept a pending WO |
| `update_work_order_status` | Status transition |
| `complete_work_order` | Submit for review with a summary |
| `verify_work_order` | Captain approve/reject |
| `get_work_order_form` | Get a blank WO form template |
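The accept / update / complete / verify tools imply a work-order lifecycle. Here is a sketch of the transition table they suggest; the real state names and transitions live in Cloud UB and may differ:

```python
# Hypothetical WO lifecycle inferred from the tool names above;
# the authoritative transition table lives in Cloud UB.
TRANSITIONS = {
    "pending":     {"accepted"},               # accept_work_order
    "accepted":    {"in_progress"},            # update_work_order_status
    "in_progress": {"review"},                 # complete_work_order
    "review":      {"completed", "rejected"},  # verify_work_order (Captain)
    "rejected":    {"in_progress"},            # rework after rejection
}

def can_transition(current: str, target: str) -> bool:
    """Return True if the status change is allowed by the table."""
    return target in TRANSITIONS.get(current, set())

print(can_transition("pending", "accepted"))
print(can_transition("pending", "completed"))
```

Validating transitions server-side keeps the transition history (surfaced by `get_work_order_detail`) consistent no matter which agent calls the tools.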
## Dispatch engine

| Tool | Description |
|---|---|
| `dispatch_work_order` | Trigger WO execution via the selected engine |
| `check_dispatch` | Check the dispatch center for pending WOs and results |
Dispatch engines available:
| Engine | Cost | Best for |
|---|---|---|
| `ingest` | Free | Batch file ingestion into UB |
| `groq` | Free | Research, analysis, simple tasks |
| `groq-search` | Free | Web search with Groq Compound |
| `gemini` | ~$0.014 | General tasks, quality-sensitive work |
| `gemini-search` | ~$0.014 | Authoritative search with citations |
| `deepseek` | Cents | Analysis, reasoning tasks |
| `mistral` | Cents | European model, alternative provider |
| `zhipu` | Cents | Chinese NLP, agent tool-calling |
| `claude` | ~$1–2/run | Code editing, file operations |
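A hypothetical helper condensing the table into a selection policy: pick the cheapest engine that fits the task. The engine names come from the table; the task tags, numeric costs, and the fallback choice are assumptions for illustration:

```python
# Illustrative subset of the engine table; task tags and the
# fallback policy are assumptions, not the server's actual logic.
ENGINES = {
    "ingest":      {"cost": 0.0,   "tasks": {"batch_ingest"}},
    "groq":        {"cost": 0.0,   "tasks": {"research", "analysis"}},
    "groq-search": {"cost": 0.0,   "tasks": {"web_search"}},
    "gemini":      {"cost": 0.014, "tasks": {"general", "quality"}},
    "deepseek":    {"cost": 0.02,  "tasks": {"analysis", "reasoning"}},
    "claude":      {"cost": 1.50,  "tasks": {"code_edit", "file_ops"}},
}

def pick_engine(task: str) -> str:
    """Choose the cheapest engine that lists the task as a fit."""
    fits = [(spec["cost"], name) for name, spec in ENGINES.items()
            if task in spec["tasks"]]
    if not fits:
        return "gemini"  # assumed general-purpose fallback
    return min(fits)[1]

print(pick_engine("analysis"))   # cheapest fit wins over deepseek
print(pick_engine("code_edit"))  # only claude fits
```

A cost-first policy like this is why the free Groq engines tend to absorb routine research work while Claude is reserved for code edits.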
## Intelligence operations

| Tool | Description |
|---|---|
| `intel_search` | Search any topic and auto-ingest findings into UB |
| `search_web` | Web search via the Groq/Gemini engines |
| `run_patrol` | Trigger an Intelligence Officer patrol |
| `list_patrol_domains` | List configured patrol domains |
| `call_model` | Call a low-cost LLM for subtasks |
| `list_models` | List available LLM providers and models |
## Agent communication

| Tool | Description |
|---|---|
| `send_agent_message` | Send a message to an agent mailbox |
| `check_agent_mailbox` | Check the mailbox |
| `mark_message` | Toggle read/unread |
| `archive_message` | Soft-delete a message |
| `agent_heartbeat` | Register the agent as online |
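A toy in-memory model of the mailbox semantics the tools above describe (read/unread toggle, soft delete via archive). The real mailbox is backed by Cloud UB; the field and method names here are illustrative:

```python
import itertools
from dataclasses import dataclass

# Field names are assumptions; the real mailbox lives in Cloud UB.
@dataclass
class Message:
    sender: str
    body: str
    read: bool = False
    archived: bool = False

class Mailbox:
    def __init__(self):
        self.messages = {}
        self._ids = itertools.count(1)

    def send(self, sender, body):        # send_agent_message
        mid = next(self._ids)
        self.messages[mid] = Message(sender, body)
        return mid

    def check(self):                     # check_agent_mailbox: unread only
        return [m for m in self.messages.values()
                if not m.read and not m.archived]

    def mark(self, mid, read=True):      # mark_message: toggle read state
        self.messages[mid].read = read

    def archive(self, mid):              # archive_message: soft delete
        self.messages[mid].archived = True

box = Mailbox()
mid = box.send("SS2", "WO ready for review")
print(len(box.check()))  # one unread message
box.mark(mid)
print(len(box.check()))  # none after marking it read
```

Soft deletion (the `archived` flag) rather than removal preserves the audit trail, which matters when several agents share one mailbox store.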
## File operations

| Tool | Description |
|---|---|
| `ubi_read_file` | Read a file with line numbers |
| `ubi_write_file` | Write a file (automatic `.bak` backup) |
| `ubi_edit_file` | Exact string replacement |
| `ubi_list_directory` | List directory contents |
| `ubi_run_command` | Sandboxed shell execution |
| `ubi_search_code` | Grep/rg code search |
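The write/edit tools are described as doing exact string replacement with an automatic `.bak` backup. A sketch of that behavior under those assumptions, with sandboxing and error handling omitted and the function name invented for illustration:

```python
import shutil
import tempfile
from pathlib import Path

def edit_file_sketch(path, old, new):
    """Exact-string replacement with a .bak backup, mirroring the
    described behavior of ubi_edit_file. Illustrative only."""
    p = Path(path)
    text = p.read_text()
    if text.count(old) != 1:
        # Requiring a unique match keeps the edit unambiguous.
        raise ValueError("old string must match exactly once")
    shutil.copy2(p, p.with_name(p.name + ".bak"))  # automatic backup
    p.write_text(text.replace(old, new))

# Demo on a throwaway file.
tmp = Path(tempfile.mkdtemp()) / "demo.txt"
tmp.write_text("DEBUG = True\n")
edit_file_sketch(tmp, "DEBUG = True", "DEBUG = False")
result = tmp.read_text()
backup_exists = tmp.with_name("demo.txt.bak").exists()
print(result.strip(), backup_exists)
```

Rejecting non-unique matches is a common safeguard for exact-replacement tools: it forces the caller to supply enough context to pin down one edit site.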
## Fleet operations

| Tool | Description |
|---|---|
| `factory_floor_status` | Real-time overview of who's doing what |
| `update_captain_location` | Update the Captain's location |
## MTAAA integration

The Local UBI is the only server that runs the full MTAAA pipeline. When `ingest_fragment` is called, the content passes through five nodes:

1. **File Detector** — identifies the input type (text/code/config/media/screenshot)
2. **Content Extractor** — extracts content and metadata
3. **Feature Learner** — identifies key entities, topics, and themes
4. **Schema Matcher** — maps to the 3D classification (Topic × Type × Lifecycle) using the Controlled Vocabulary
5. **Archivist** — writes to UB with proper tags and metadata
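The five nodes can be sketched as a linear chain over a shared state dict. The production pipeline is a LangGraph graph with LLM-backed nodes, so every function body below is a stand-in:

```python
# Linear stand-in for the 5-node MTAAA flow; real nodes are
# LLM-backed and wired as a LangGraph graph, not plain functions.
def file_detector(state):
    state["kind"] = "code" if state["raw"].lstrip().startswith("def ") else "text"
    return state

def content_extractor(state):
    state["content"] = state["raw"].strip()
    return state

def feature_learner(state):
    state["topics"] = sorted(set(state["content"].lower().split()))[:3]
    return state

def schema_matcher(state):
    # Map to the 3D classification: Topic x Type x Lifecycle.
    state["classification"] = (state["topics"][0], state["kind"], "active")
    return state

def archivist(state):
    state["archived"] = True  # would write to UB with tags/metadata
    return state

PIPELINE = [file_detector, content_extractor, feature_learner,
            schema_matcher, archivist]

def ingest_fragment(raw):
    state = {"raw": raw}
    for node in PIPELINE:
        state = node(state)
    return state

out = ingest_fragment("def hello(): pass")
print(out["kind"], out["classification"], out["archived"])
```

Passing one mutable state through every node is the same pattern LangGraph formalizes, which is why the pipeline decomposes so cleanly into independent stages.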
By contrast, Cloud UB's `ingest_fragment` performs a simpler direct ingestion without the full pipeline.
## Environment variables

| Variable | Purpose | Example |
|---|---|---|
| `SP_SHIP_ID` | Ship identifier | `SS1` |
| `SP_AGENT_ID` | Agent identity | `Mac CLI 小克` |
| `GROQ_API_KEY` | Groq API access | `gsk_...` |
| `GEMINI_API_KEY` | Gemini API access | `AIza...` |
| `DEEPSEEK_API_KEY` | DeepSeek API access | `...` |
| `MISTRAL_API_KEY` | Mistral API access | `...` |
| `ZHIPU_API_KEY` | Zhipu GLM access | `...` |
| `CLOUD_UB_URL` | Cloud UB endpoint | `https://cloud-ub.superportia.workers.dev` |
| `CLOUD_UB_TOKEN` | Cloud UB auth token | `...` |
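A small startup check over these variables might look like the following; which keys are strictly required versus optional per-provider is an assumption here, not documented behavior:

```python
import os

# Required/optional split is an assumption based on the table above.
REQUIRED = ["SP_SHIP_ID", "SP_AGENT_ID", "CLOUD_UB_URL", "CLOUD_UB_TOKEN"]
PROVIDER_KEYS = ["GROQ_API_KEY", "GEMINI_API_KEY", "DEEPSEEK_API_KEY",
                 "MISTRAL_API_KEY", "ZHIPU_API_KEY"]

def check_env(env=None):
    """Return (missing required vars, available provider keys)."""
    env = os.environ if env is None else env
    missing = [k for k in REQUIRED if not env.get(k)]
    providers = [k for k in PROVIDER_KEYS if env.get(k)]
    return missing, providers

missing, providers = check_env({"SP_SHIP_ID": "SS1", "GROQ_API_KEY": "gsk_x"})
print(missing, providers)
```

Failing fast on missing Cloud UB credentials while merely warning about absent provider keys lets the server start with a reduced engine set instead of not at all.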
## Local vs Cloud UB — when to use which

| Scenario | Use |
|---|---|
| Cross-ship data (WOs, messages, shared knowledge) | Cloud UB |
| File operations on this ship | Local UBI |
| MTAAA pipeline ingestion | Local UBI |
| Dispatch engine execution | Local UBI |
| Intel search / web search | Local UBI (has API keys) |
| Quick knowledge search | Either (local is faster for cached data) |
| Agent heartbeat / floor status | Either (both proxy to Cloud) |
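The routing table can be condensed into a helper; the category names below are invented for illustration, and in practice each agent decides per call:

```python
# Hypothetical router derived from the table above; category names
# are illustrative, not identifiers used by the servers.
LOCAL_ONLY = {"file_ops", "mtaaa_ingest", "dispatch", "intel_search"}
CLOUD_ONLY = {"cross_ship_data"}

def route(operation: str, prefer_fast: bool = True) -> str:
    """Pick a backend for an operation per the Local-vs-Cloud table."""
    if operation in LOCAL_ONLY:
        return "local_ubi"
    if operation in CLOUD_ONLY:
        return "cloud_ub"
    # Either works (knowledge search, heartbeat, floor status);
    # local is faster for cached data, so prefer it by default.
    return "local_ubi" if prefer_fast else "cloud_ub"

print(route("file_ops"))
print(route("cross_ship_data"))
print(route("knowledge_search"))
```

The "either" rows work because both servers proxy heartbeat and floor-status calls through to Cloud UB, so only latency differs.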
## Related pages