MIKA5
v0.1.0 — Windows · macOS · Linux

AI on your machine.
Under your control.

The only local AI desktop app with built-in RAG, project organization, and multi-provider support — in English and Spanish. No subscriptions. No hidden cloud.

Free · Offline-ready · Windows · macOS · Linux

App preview — MIKA5, Project: Investigación (UI shown in Spanish)
Sidebar — Projects: Investigación · Código Personal · Documentos · Quick Chats · No project
User: "Analyze the document and extract the key points"
RAG: 3 relevant fragments found
Assistant: "Based on your knowledge base, the key points are: 1. Local-first architecture with SQLite 2. RAG with Ollama embeddings..."
Input placeholder: "Type your message..."
Offline-ready · No telemetry · SQLite local memory · Ollama compatible · Vision models · Built-in RAG engine · Cloud optional

Everything in one place.

No extensions. No integrations. No API keys required to get started.

Built-in RAG Engine

Upload documents and MIKA5 automatically chunks, embeds, and retrieves the most relevant fragments for every message. Per-project and per-chat knowledge bases.
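The chunk-embed-retrieve loop described above can be sketched in a few lines of Python. This is an illustrative sketch only: the chunk size, the cosine scoring, and the function names are assumptions for demonstration, not MIKA5's actual parameters or code.

```python
import math

def chunk(text, size=200):
    """Split a document into fixed-size character chunks (size is illustrative)."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def top_k(query_vec, chunk_vecs, k=3):
    """Return indices of the k chunks most similar to the query embedding."""
    ranked = sorted(range(len(chunk_vecs)),
                    key=lambda i: cosine(query_vec, chunk_vecs[i]),
                    reverse=True)
    return ranked[:k]
```

In a real pipeline the vectors would come from an embedding model such as nomic-embed-text; here they are plain lists to keep the sketch self-contained.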

Projects & Chats

Organize work into projects with isolated knowledge, system prompts, and chat history. Quick Chat for ad-hoc conversations without a project.

Vision Models

Attach images to any message. MIKA5 auto-detects vision-capable models (LLaVA, GLM-OCR, Gemma3, Qwen2.5-VL, GPT-4o, Claude) and warns when a model can't process images.

Cloud AI — Optional

Connect OpenAI, Anthropic, Groq, or Moonshot when you need them. API keys stored with AES-256-GCM encryption. A privacy indicator shows when cloud is active.
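Sealing a secret with AES-256-GCM, as described above, looks roughly like this sketch using the third-party `cryptography` package. The function names, the prepended-nonce layout, and the key handling are assumptions for illustration, not MIKA5's implementation.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_api_key(plaintext: str, key: bytes) -> bytes:
    """Seal an API key with AES-256-GCM; a fresh 12-byte nonce is prepended."""
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, plaintext.encode(), None)

def decrypt_api_key(blob: bytes, key: bytes) -> str:
    """Split off the nonce, then authenticate and decrypt in one step."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None).decode()
```

GCM authenticates as well as encrypts, so a tampered blob raises an exception instead of decrypting to garbage.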

Model Tracking

Every message is tagged with the model that generated it. Model switches within a chat are highlighted inline so you always know which AI answered.
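Detecting a model switch within a chat is a simple pass over the tagged messages. The `(model, text)` pair shape below is an assumed schema for illustration, not MIKA5's actual data model.

```python
def annotate_model_switches(messages):
    """Tag each message with its model and flag where the model changed.

    `messages` is a list of (model, text) pairs -- an assumed shape.
    """
    annotated, previous = [], None
    for model, text in messages:
        switched = previous is not None and model != previous
        annotated.append({"model": model, "text": text, "switched": switched})
        previous = model
    return annotated
```

The `switched` flag is what would drive the inline highlight in the chat view.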

Export Anywhere

Export individual messages or full conversations to Markdown, plain text, HTML, Python, or PDF. Smart titles generated automatically from content.
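A Markdown export of a conversation can be as simple as the sketch below; the heading layout and the `(role, text)` pair shape are assumptions for illustration, not MIKA5's export format.

```python
def export_markdown(title, messages):
    """Render a conversation as Markdown with a title heading and bold roles."""
    lines = [f"# {title}", ""]
    for role, text in messages:
        lines.append(f"**{role}:** {text}")
        lines.append("")
    return "\n".join(lines)
```

The same loop, swapped onto an HTML or PDF renderer, covers the other export targets.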

Popular & everyday — local via Ollama

llama3.2:3b qwen2.5:7b qwen2.5-vl:7b deepseek-r1:7b mistral:7b gemma3:9b phi4:14b glm-ocr:latest llava:13b nomic-embed-text

High-end — require 32–64 GB of RAM or a GPU with 24 GB+ VRAM

qwen3.5:72b glm-5:cloud kimi-k2:cloud nemotron-3-super:cloud deepseek-r1:70b llama3.3:70b

⚡ High-end models require significant hardware. Performance varies by PC specs. See Model Guide for detailed hardware requirements.

Local runtimes: Ollama · llama.cpp — Cloud providers: OpenAI · Anthropic · Groq · Moonshot

The cloud isn't neutral.

Subscription lock-in

Pricing and access can change overnight without your consent.

Token billing

You pay per thought. Creativity shouldn't have a meter running.

Data exposure

Prompts pass through servers you don't own or control.

Cloud AI flow

You → Browser → Vendor Cloud → Policies → Model → Response

# Your prompt stored, processed, potentially used for training

MIKA5 flow

You → MIKA5 Desktop → Local Ollama → Done

# Zero network calls. Your hardware. Your data.


Download MIKA5

Free. No account. No license key.

Windows

Windows 10 / 11 · x64

v0.1.0

macOS

Apple Silicon · Intel x64

v0.1.0

Linux

x64 · AppImage · deb · rpm

v0.1.0

macOS: The app is not notarized. On first launch, right-click the app in Applications → Open to bypass Gatekeeper.

# Verify download integrity

PS> Get-FileHash "MIKA5-Setup-0.1.0-win-x64.exe" -Algorithm SHA256 # Windows

$ shasum -a 256 MIKA5-0.1.0-mac-arm64.dmg # macOS

$ sha256sum MIKA5-0.1.0-linux-x64.AppImage # Linux

# Compare with the SHA256 published in the GitHub release notes
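The same verification can be scripted portably with Python's standard library; this sketch assumes only that the release file is readable at the given path.

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so large installers never load fully into RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            h.update(block)
    return h.hexdigest()
```

Compare the returned hex digest against the value in the GitHub release notes; any difference means a corrupted or tampered download.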

All releases → github.com/mika5app/mika5/releases


Up and running in 3 minutes.

MIKA5 requires Ollama running locally. Install Ollama first, then follow these steps.

1

Install Ollama

Download from ollama.com and install it. It runs a local server at 127.0.0.1:11434.
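Before launching MIKA5 you can confirm the Ollama server is up and see which models it has. The sketch below uses Ollama's documented /api/tags endpoint; the helper names are ours, and the parsing assumes the endpoint's published JSON shape.

```python
import json
import urllib.request

def installed_models(base_url="http://127.0.0.1:11434"):
    """Query Ollama's /api/tags endpoint and return installed model names."""
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        return parse_model_names(json.load(resp))

def parse_model_names(payload):
    """Extract names from a /api/tags payload: {"models": [{"name": ...}, ...]}."""
    return [m["name"] for m in payload.get("models", [])]
```

A connection error from `installed_models()` means Ollama isn't running yet.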

2

Pull a model + the embedding model

$ ollama pull llama3.2

$ ollama pull nomic-embed-text

# nomic-embed-text powers the RAG engine

3

Install and launch MIKA5

Run the installer for your platform (on Windows, MIKA5-Setup-0.1.0-win-x64.exe), then open MIKA5 from the Start Menu or Desktop shortcut.

System Requirements

  • Windows 10 or 11 (64-bit)
  • 8 GB RAM minimum (16 GB recommended)
  • 4 GB free disk (+ model files)
  • Ollama installed and running
  • GPU optional — CPU inference supported

Recommended starter models

llama3.2:3b · fast · 2 GB
qwen2.5:7b · balanced · 4.7 GB
qwen2.5-vl:7b · vision · 5.5 GB
nomic-embed-text · required for RAG

Your data never leaves your machine.

No telemetry. No background sync. No account creation. No hidden calls home. All conversations and knowledge are stored in a local SQLite database on your computer.
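A single local SQLite file is all the "backend" such storage needs. The table and column names below are assumptions for illustration, not MIKA5's actual schema.

```python
import sqlite3

def open_db(path=":memory:"):
    """Open (or create) a local conversation store; schema is illustrative."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS messages (
        id      INTEGER PRIMARY KEY,
        project TEXT,
        role    TEXT NOT NULL,
        model   TEXT,
        content TEXT NOT NULL
    )""")
    return db
```

Pointing `path` at a file on disk gives the persistent, fully local history the page describes; deleting that one file deletes everything.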

Zero telemetry by default · No analytics or crash reporting · Cloud providers optional and user-configured · API keys encrypted with AES-256-GCM

Run AI on your terms.

v0.1.0 · Windows · macOS · Linux · Free & Open Source

MIKA5 Pro is coming

Team collaboration, cloud sync, advanced RAG, and more. Get notified when it launches.

No spam. Unsubscribe anytime.