AgentConn

openhuman (tinyhumansai)

Coding · Free

About openhuman (tinyhumansai)

openhuman/tinyhumansai is a single Rust binary you drop on your machine and run as a personal AI runtime: model orchestration, memory, and a chat shell, all local, all under one process. The framing the project leads with is 'personal AI super-intelligence': not an enterprise agent, not a cloud control plane, but a private, simple shell that one person owns end-to-end. It launched on May 12, 2026 and hit GitHub trending the same day with +1,042 stars in the first 24 hours.

Architecturally, it sits in the same shell-layer slot that Anthropic is now contesting with Cowork and the Claude Code Agent View research preview. But instead of routing through hosted Anthropic infrastructure, openhuman wires together local model backends (llama.cpp / mlx / candle), an on-disk memory store, and a TUI chat surface in one statically-linked Rust executable. The bet is that the same buyer who could pick the Anthropic-shell stack will instead pick a private, simple binary they fully control: the same trade-off that drove the early local-LLM wave, now applied to the multi-agent shell layer. It pairs naturally with cc-switch (provider switching), agentmemory (persistent memory substrate), and AionUi (multi-CLI chat shell) as the open-stack alternative to a hosted Anthropic-dispatched workflow.

Key Features

  • Single statically-linked Rust binary — drop on disk and run, no install pipeline
  • Local-first architecture — models, memory, and shell run on-device, no cloud calls required
  • 'Personal AI super-intelligence' framing — built for one user, not for multi-tenant deployments
  • Pluggable model backends — llama.cpp, mlx, candle for Apple Silicon, x86, and ARM hosts
  • On-disk memory store — persists context across sessions without a hosted vector DB
  • TUI chat shell — keyboard-first interaction, no Electron, no browser dependency
  • Open-stack composable — pairs with cc-switch, agentmemory, AionUi as the open alternative to Cowork
  • Privacy-by-default — no telemetry, no remote logging, no SaaS account
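
The pluggable-backend feature above can be sketched as a Rust trait with one implementation per inference engine. This is a minimal illustration, not openhuman's actual API: the `ModelBackend` trait, the `LlamaCpp`/`Mlx` types, and `pick_backend` are all hypothetical names, and `generate` just echoes rather than running real inference.

```rust
// Hypothetical sketch of a pluggable model-backend layer.
// None of these names come from openhuman's source; inference is stubbed out.

trait ModelBackend {
    fn name(&self) -> &'static str;
    fn generate(&self, prompt: &str) -> String;
}

struct LlamaCpp;
impl ModelBackend for LlamaCpp {
    fn name(&self) -> &'static str { "llama.cpp" }
    fn generate(&self, prompt: &str) -> String {
        // Placeholder: a real backend would call into llama.cpp bindings here.
        format!("[llama.cpp] echo: {prompt}")
    }
}

struct Mlx;
impl ModelBackend for Mlx {
    fn name(&self) -> &'static str { "mlx" }
    fn generate(&self, prompt: &str) -> String {
        // Placeholder: a real backend would dispatch to mlx on Apple Silicon.
        format!("[mlx] echo: {prompt}")
    }
}

// Select a backend at runtime, e.g. from a config file or host detection,
// and hand the rest of the shell a trait object it can use uniformly.
fn pick_backend(target: &str) -> Box<dyn ModelBackend> {
    match target {
        "mlx" => Box::new(Mlx),
        _ => Box::new(LlamaCpp),
    }
}

fn main() {
    let backend = pick_backend("mlx");
    println!("backend: {}", backend.name());
    println!("{}", backend.generate("hello"));
}
```

The trait-object shape is what makes "Apple Silicon, x86, and ARM hosts" one code path: the chat shell and memory layers only ever see `dyn ModelBackend`.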

Overview

openhuman/tinyhumansai is a local-AI shell shipped as a single Rust binary; it hit GitHub trending on its launch day (May 12, 2026) with +1,042 stars in the first 24 hours. The pitch is unusually direct: one executable that is your personal AI runtime, framed as “personal AI super-intelligence.” No login, no cloud, no SaaS. You download the binary, run it, and you have a chat shell, persistent memory, and a model-orchestration layer, all in one process under your full control.

Why It Matters Right Now

The week openhuman launched is the same week Anthropic moved decisively into the agent-shell layer with Cowork (the flight-booking demo on Opus 4.7 that, per bcherny, “didn’t fall over”) and the Claude Code Agent View research preview. The agent stack is now being contested at two layers simultaneously: the engine layer (Claude Code, Codex, DeepSeek-TUI) and the shell layer (Cowork on the hosted side; cc-switch, AionUi, openhuman, and the assemble-your-own stack on the open side).

openhuman is the most opinionated open-stack answer to the hosted-shell question. Where AionUi and cc-switch multiplex existing AI CLIs, openhuman ships its own runtime end-to-end as one binary. Where Cowork orchestrates multiple Anthropic agents through a hosted dispatcher, openhuman runs a single local agent under one user account on one machine. The buyer trade-off is identical to the one that drove the first local-LLM wave: you trade peak capability for ownership.

How It Composes With the Open Stack

openhuman is not an island. The natural pairing is:

  • cc-switch for provider switching when you want to also call hosted models from the same shell
  • agentmemory as a more sophisticated memory substrate if openhuman’s built-in store is too lightweight
  • AionUi if you want to add other CLIs (Claude Code, Codex, OpenClaw) into the same chat surface

This composability is the open-stack pattern: pick the layers you want, swap any one of them, never get locked into a single vendor’s hosted shell.
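
The "swap any one layer" pattern reduces to coding against traits rather than concrete stores. A minimal sketch, with all names invented for illustration: a `MemoryStore` trait that openhuman's built-in store or an external substrate like agentmemory could both implement, so the shell never depends on which one is behind it.

```rust
// Hypothetical sketch of the swappable-layer pattern: memory behind a trait.
// `MemoryStore` and `BuiltinStore` are illustrative names, not real APIs,
// and the "on-disk" store is kept in memory here for a self-contained example.

use std::collections::HashMap;

trait MemoryStore {
    fn remember(&mut self, key: &str, value: &str);
    fn recall(&self, key: &str) -> Option<String>;
}

// Stand-in for openhuman's lightweight built-in store.
struct BuiltinStore {
    entries: HashMap<String, String>,
}

impl MemoryStore for BuiltinStore {
    fn remember(&mut self, key: &str, value: &str) {
        self.entries.insert(key.to_string(), value.to_string());
    }
    fn recall(&self, key: &str) -> Option<String> {
        self.entries.get(key).cloned()
    }
}

fn main() {
    // A shell that holds its memory as a trait object can swap the backing
    // store (built-in vs. external substrate) without touching other layers.
    let mut store: Box<dyn MemoryStore> =
        Box::new(BuiltinStore { entries: HashMap::new() });
    store.remember("project", "openhuman");
    println!("{:?}", store.recall("project"));
}
```

Replacing `BuiltinStore` with an adapter over another substrate changes one constructor call, which is the whole point of the open-stack composition argument above.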

Who It’s For

Developers and power users who want a local-first agent runtime with no SaaS dependency, no hosted control plane, and no per-seat licensing — and who value a single Rust binary over a docker-compose stack of microservices. Privacy-sensitive operators (legal, medical, financial) who cannot route prompts through hosted inference. Builders who want to experiment with a fully open multi-agent shell as the architectural alternative to Anthropic’s hosted Cowork model.

For the broader stack-level picture — how Cowork’s launch is forcing the open shell layer to consolidate, and where openhuman fits — see Cowork Just One-Shotted a Flight Booking on Opus 4.7 — Anthropic’s Shell-Layer Move and What It Means for the Agent Stack.

Similar Agents