Open Source · MIT · Runs on Your Machine

A personal AI that learns
your workflow

Familiar lives on your machine, trains custom models on real code from merged PRs and open-source datasets, and works autonomously while you sleep. Local-first, always learning.

Your data never leaves your machine. Your models improve for you alone.

Ollama · Forge™ · Hands · Bun · TypeScript · MCP
$ brew tap engindearing-projects/tap && brew install familiar
$ npx familiar-setup

macOS · Linux · Requires git

familiar — ~/my-project
familiar v1.1 · familiar-brain:v2 · Brain: 2,245 RAG chunks · Forge: 6,302 training pairs · Hands: 17 active · Mode: ask
you >
the payment flow has a race condition, can you investigate and fix it?
[search] "payment" "checkout" "race" in src/
[read] src/services/PaymentService.ts
[edit] src/services/PaymentService.ts +87 — added mutex lock around checkout
familiar
Found the race condition in PaymentService.ts. Two concurrent processCheckout() calls could debit the same balance. Added a mutex lock around the critical section. Tests pass. (session logged to Forge)
familiar · agentic: cloud · forge: 5,170 pairs · turn 1 · 1,247 tokens
> ask anything...

Switching from cloud AI?

Familiar gives you a local AI that actually improves. Install it, point it at your repos, and it starts building models trained on real code — merged PRs, open-source datasets, and ground-truth diffs.

1

Install Familiar

One command. The setup wizard walks you through everything.

2

Mine training data

Forge mines real merged code from GitHub and open-source datasets. No cloud model outputs used.

3

Train & deploy

LoRA fine-tuning on your GPU. Models improve based on real code patterns.

The only AI that gets smarter for you

Cloud AI tools ship a frozen model. You use it, they collect your data, and the model never changes for you. Familiar is different.

The Forge pipeline mines ground-truth code from merged PRs and open-source datasets, fine-tunes local models, and deploys them — automatically. The more you train, the more it understands your domain and coding patterns. Your data stays on your machine. Your models improve for you alone.

Cloud AI

  • Static models, same for everyone
  • Your data sent to third parties
  • Sycophantic — agrees with you to keep you happy
  • No memory between sessions
  • Generic, one-size-fits-all

Familiar

  • Models that learn your patterns
  • Your data never leaves your machine
  • Honest, direct, corrects you when you're wrong
  • Persistent memory + RAG knowledge base
  • Adapts to your style and codebase

How Familiar compares

Feature-by-feature breakdown against alternatives. No spin — just what each tool can and can't do.

| Feature | Familiar | OpenClaw | Cloud AI |
| --- | --- | --- | --- |
| Self-training models | Forge pipeline | — | — |
| Autonomy modes | off/ask/on/full | off/ask/on/full | — |
| Multi-model routing | Claude/Gemini/Ollama | Single model | Single model |
| Desktop control | 36 MCP tools | Limited | — |
| Channels | CLI, Telegram, iMessage, WhatsApp, Voice, Mobile | CLI + Web | Web only |
| Plugin marketplace | With security scanning | No scanning | — |
| Sandboxed execution | Docker + allowlist + audit | Docker | N/A |
| Team/multi-user | Role-based (4 tiers) | Shared workspaces | Yes |
| Autonomous tasks | 17 scheduled hands | Basic agents | — |
| Knowledge graph | Graph-boosted RAG | — | — |
| Daily learning cycle | 5-step nightly | — | — |
| Data privacy | 100% local | Local | Cloud |

New in v1.1

Shipped this week. Available now via brew upgrade familiar

Mobile app new

React Native app for iOS and Android. Chat, status dashboard, team management, and autonomy control. Connects to your gateway over WebSocket.

Autonomy modes new

Four graduated levels: off, ask, on, full. 25 tools classified into safe/moderate/dangerous tiers. Set via Telegram, mobile, or gateway API.
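The gating logic behind graduated autonomy can be sketched as a small decision function. This is an illustration only, not Familiar's actual source; the tier-to-decision mapping is an assumption based on the description above.

```typescript
// Illustrative sketch: map an autonomy mode and a tool's risk tier
// to a decision. The exact policy is assumed, not taken from Familiar.
type Mode = "off" | "ask" | "on" | "full";
type Tier = "safe" | "moderate" | "dangerous";
type Decision = "deny" | "confirm" | "allow";

function gateTool(mode: Mode, tier: Tier): Decision {
  if (mode === "off") return "deny";    // nothing runs
  if (mode === "full") return "allow";  // everything runs
  if (mode === "on") {
    // autonomous, but dangerous tools still prompt
    return tier === "dangerous" ? "confirm" : "allow";
  }
  // mode === "ask": only safe tools run without a prompt
  return tier === "safe" ? "allow" : "confirm";
}
```

The point of the tier split is that the prompt frequency scales with risk rather than being all-or-nothing.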

Plugin marketplace new

Install hands and skills from the community registry. Every package is security-scanned for malware, injection, and token exfiltration before install.

Team auth new

Add team members with role-based access: owner, admin, member, viewer. Per-user tokens, tool restrictions, and team management via CLI or mobile.
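A rank-ordered hierarchy is the usual way to implement four-tier roles like these. A minimal sketch, assuming a simple "at least this role" check (the ordering and function name are illustrative, not Familiar's API):

```typescript
// Hypothetical role hierarchy for the four tiers named above.
const ROLE_RANK = { viewer: 0, member: 1, admin: 2, owner: 3 } as const;
type Role = keyof typeof ROLE_RANK;

// A user may act if their role is at least the required tier.
function canPerform(userRole: Role, requiredRole: Role): boolean {
  return ROLE_RANK[userRole] >= ROLE_RANK[requiredRole];
}
```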

Docker sandbox new

Run untrusted commands in isolated Docker containers with memory limits, no network, and read-only filesystems. Full audit logging to JSONL.
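The restrictions described above map directly onto standard `docker run` flags. A sketch of how such an argument list could be built (e.g. for `Bun.spawn`); the specific limit values are assumptions, not Familiar's actual defaults:

```typescript
// Illustrative only: sandbox restrictions expressed as docker run args.
function sandboxArgs(image: string, cmd: string[]): string[] {
  return [
    "run", "--rm",
    "--memory", "512m",   // hard memory limit (assumed value)
    "--network", "none",  // no network access
    "--read-only",        // read-only root filesystem
    "--cap-drop", "ALL",  // drop all Linux capabilities
    image, ...cmd,
  ];
}
```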

Doctor v2 new

11-section diagnostic: ports, tokens, bridges, brain health, security audit, config migration. Auto-fix with --fix.

How Familiar works

A self-improving system that runs entirely on your machine. Cloud APIs for heavy lifting, local models for everything else.

The Brain

Always-on AI companion. Remembers everything, learns from every session, develops knowledge you didn't give it.

~ RAG knowledge base (SQLite + embeddings)
~ Multi-model routing
~ Forge training pipeline
~ Hands autonomous system
~ Learnable skill modules
~ CRAAP source evaluator

Cloud Provider agentic

Configurable cloud model for heavy tasks. Reads your files, writes code, runs tests, iterates. Falls back to local models if the cloud is unavailable.

Ollama local

Local language models that run on your hardware. Handles fast chat, routing, classification, and embeddings with zero cloud dependency. Works offline.

MCP + Channels 69 tools

Rust daemon exposes 36+ MCP tools: memory, filesystem, code analysis, HTTP, and more. Connect via CLI, Telegram, Slack, or any MCP-compatible client.

All local models run via Ollama. Cloud APIs used only for agentic coding and research phases — never as training targets.
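MCP is JSON-RPC 2.0 under the hood, so any client invokes a daemon tool by sending a `tools/call` request. A minimal sketch of that message; the tool name used here ("memory_search") is illustrative, not a documented Familiar tool:

```typescript
// Build an MCP tools/call request (JSON-RPC 2.0 per the MCP spec).
function toolCallMessage(id: number, name: string, args: object) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// Hypothetical invocation of a memory-search tool:
const msg = toolCallMessage(1, "memory_search", { query: "payment race" });
```

Because the wire format is plain JSON-RPC, the same message works from the CLI, a Telegram bridge, or any MCP-compatible client.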

What you get

One install command. A personal AI that grows with you.

Agentic coding cloud

Cloud model reads your files, writes code, runs tests, and iterates until it's right. Configurable provider. Falls back to local models automatically.

Self-improving models free

Forge mines real merged code from GitHub and open-source datasets, then fine-tunes local models on your GPU (or CPU). Each version gets smarter based on real code patterns.

Brain & RAG free

SQLite knowledge base with semantic embeddings, knowledge graph, and CRAAP source evaluation. Familiar builds context from your code, docs, and conversations.
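The retrieval half of a semantic-embedding knowledge base boils down to ranking stored chunks by cosine similarity to a query embedding. A simplified sketch (Familiar's actual schema lives in SQLite; the shapes here are assumptions):

```typescript
// Rank chunks by cosine similarity to a query embedding.
type Chunk = { text: string; embedding: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function topK(query: number[], chunks: Chunk[], k: number): Chunk[] {
  return [...chunks]
    .sort((x, y) => cosine(query, y.embedding) - cosine(query, x.embedding))
    .slice(0, k);
}
```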

Hands — autonomous agents free

Scheduled tasks that run while you sleep. The researcher mines knowledge, the trainer builds models, the coder writes and tests code. Cron-driven, event-triggered, multi-phase.

36+ MCP tools free

Rust daemon with memory search, filesystem operations, code analysis, HTTP requests, system management, and more. Any MCP client connects to the full brain.

100% private free

Local models run on your machine via Ollama. Training happens locally. Your data stays on your network. Cloud APIs only used when you choose agentic mode.

Forge: the training loop

This is what makes Familiar alive. Ground-truth code feeds the pipeline. Your models develop understanding from real, reviewed code.

Mine
Merged PR diffs + open-source datasets
Prepare
Deduplicate, filter, split train/valid
Train
Fine-tune with LoRA on your GPU or CPU
Deploy
Quantize and hot-swap in Ollama
Eval
Benchmark against previous versions

Runs nightly at 2 AM, or on-demand: familiar forge status, familiar forge train, familiar forge eval
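The Prepare stage above can be sketched as a deduplicate-then-split pass over mined training pairs. This is a minimal illustration, assuming exact-content deduplication and a deterministic split; field names are hypothetical:

```typescript
// Sketch of the "Prepare" stage: dedupe pairs, then split train/valid.
type Pair = { prompt: string; completion: string };

function prepare(pairs: Pair[], validFraction = 0.1) {
  // Deduplicate on exact prompt+completion content.
  const seen = new Set<string>();
  const unique = pairs.filter((p) => {
    const key = p.prompt + "\u0000" + p.completion;
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
  // Deterministic split: the tail becomes the validation set.
  const cut = Math.floor(unique.length * (1 - validFraction));
  return { train: unique.slice(0, cut), valid: unique.slice(cut) };
}
```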

Get started in 3 steps

1

Install

One command installs Familiar, pulls Ollama models, and wires everything together. The setup wizard handles configuration.

brew tap engindearing-projects/tap && brew install familiar
2

Initialize

Run familiar init to configure your AI, API keys, channels, and services.

familiar init
3

Start chatting

Just type familiar. It's already connected to your brain, knowledge base, and models. It starts learning from day one.

familiar

Use the trained model

The Forge pipeline produces familiar-brain — a fine-tuned coding and reasoning model trained on real merged code and open-source datasets. Use it standalone, no Familiar install required.

Ollama free

Pull and run locally. Needs ~12GB RAM.

$ ollama pull familiar-run/familiar-brain
$ ollama run familiar-run/familiar-brain

Hosted API api key

OpenAI-compatible endpoint. Works with any SDK.

curl https://api.familiar.run/v1/chat/completions \
  -H "Authorization: Bearer YOUR_KEY" \
  -d '{"model":"familiar-brain","messages":[...]}'
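The same call from TypeScript with `fetch` (Node 18+/Bun), following the OpenAI chat-completions convention the endpoint advertises. Response handling is omitted; this sketch just builds the request:

```typescript
// Build a chat-completions request for the hosted familiar-brain endpoint.
function buildChatRequest(apiKey: string, prompt: string) {
  return {
    url: "https://api.familiar.run/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model: "familiar-brain",
        messages: [{ role: "user", content: prompt }],
      }),
    },
  };
}

// Usage: const req = buildChatRequest(key, "hi");
//        const res = await fetch(req.url, req.init);
```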

OpenCode config

Drop this into .opencode.json in any project:

{
  "model": "familiar/familiar-brain",
  "provider": {
    "familiar": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "https://api.familiar.run/v1",
        "apiKey": "YOUR_KEY"
      },
      "models": {
        "familiar-brain": {
          "name": "Familiar Brain",
          "limit": { "context": 8192, "output": 4096 }
        }
      }
    }
  }
}

GGUF weights also available on HuggingFace for llama.cpp and other runtimes.

Built from scratch. No LangChain, no AutoGen, no third-party agent frameworks. The training pipeline, RAG system, autonomous hands, skill modules, MCP bridge, and model serving layer are all original code by engindearing. We use Ollama for local inference and configurable cloud providers for heavy tasks. We focus on making your AI smarter.