Thuki - WIP

Thuki logo

Thuki  - Floating AI for macOS. Free & Local. No cloud, no API keys. | Product Hunt

A floating AI secretary for macOS. Fully local, completely free, zero data ever leaves your machine.

Beta License CI Platform: macOS

Tauri v2 React 19 TypeScript Rust Tailwind CSS 4 SQLite Ollama


No API keys. No subscriptions. No cloud. No telemetry. Free forever.

Thuki (thư kí - Vietnamese for secretary) is a lightweight macOS overlay powered by local AI models running entirely on your own machine, built for quick, uninterrupted asks without ever leaving what you're doing.

See It in Action

Basic Usage

Double-tap Control βŒƒ to summon Thuki from anywhere. Ask a question, get an answer, and dismiss. Use /screen or the screenshot button to capture your screen and attach it as context.

thuki_demo_1.mp4

Overlay Mode

Thuki floats above every app, including fullscreen ones. Highlight text anywhere, double-tap Control βŒƒ, and Thuki opens with your selection pre-filled as a quote, ready to ask about.

thuki_demo_2.mp4

Why Thuki?

Most AI tools require accounts, API keys, or subscriptions that bill you per token. Thuki is different:

  • 100% free AI interactions: you run the model locally, there is no per-query cost, ever
  • Zero trust by design: no remote server, no cloud backend, no analytics, no telemetry
  • Works completely offline: once your model is pulled, Thuki runs without an internet connection
  • Your data is yours: conversations are stored in a local SQLite database on your machine and nowhere else
  • Most importantly: it works everywhere. Double-tap Control βŒƒ and Thuki appears on your desktop, inside a browser, inside a terminal, and yes, even in fullscreen apps. Your favorite AI chat apps can't do that!

Features

  • Always available: double-tap Control βŒƒ to summon the overlay from any app, including fullscreen apps
  • Context-aware quotes: highlight any text, then double-tap Control βŒƒ to open Thuki with the selected text pre-filled as a quote
  • Throwaway conversations: fast, lightweight interactions without the overhead of a full chat app
  • Conversation history: persist and revisit past conversations across sessions
  • Fully local LLM: powered by Ollama; no API keys, no accounts, no cost per query
  • Isolated sandbox: optionally run models in a hardened Docker container with capability dropping, read-only volumes, and localhost-only networking
  • Image input: paste or drag images and screenshots directly into the chat
  • Screen capture: type /screen to instantly capture your entire screen and attach it to your question as context
  • Agentic search: type /search to run a fully local, multi-step search pipeline (SearXNG + Trafilatura reader) with a live trace of every query, fetch, and judgement step
  • Slash commands: built-in commands for live search and prompt shortcuts: /search, /translate, /rewrite, /tldr, /refine, /bullets, /todos. Highlight text anywhere, summon Thuki, type a command, and hit Enter
  • Extended reasoning: type /think to have the model reason through a problem step by step before answering
  • In-app model picker: browse the models installed in your local Ollama and switch the active model from the ask bar without ever opening a config file
  • Cross-model continuity: swap models mid-conversation and Thuki sanitizes history and filters capabilities (vision, thinking) to whatever the new model supports
  • Settings panel: a four-tab native window (⌘,) for inference, prompt, window, and search settings, including a log-scale context-window slider and a tunable image-attachment cap (up to 20)
  • Contextual tip bar: lightweight in-overlay hints surface the right shortcut or command at the right moment
  • Privacy-first: zero-trust architecture, all data stays on your device

Getting Started

Step 1: Set Up Your AI Engine

Default model: Thuki ships with gemma4:e2b by default, an effective 2B parameter edge model from Google. It runs comfortably on most modern Macs with 8 GB of RAM and delivers strong performance on reasoning, coding, and vision tasks. The ask-bar model picker lists the models currently installed in your local Ollama and lets you switch the active model without leaving the overlay. To change the bootstrap default itself, edit ~/Library/Application Support/com.quietnode.thuki/config.toml and reorder the [model] available list so your preferred model is first. See Configurations for the full schema.
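As described above, the bootstrap default is simply the first entry in the `[model]` available list. A hedged sketch of what reordering that section might look like (exact key names and the alternative model tag are illustrative, not the repo's actual schema; see docs/configurations.md for the real one):

```toml
# Hypothetical excerpt of config.toml; exact keys may differ from the real schema.
[model]
available = [
  "gemma4:e2b",   # first entry = bootstrap default
  "llama3.2:3b",  # example alternative; any model pulled into local Ollama
]
```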

Choose one of the two options below to set up your AI engine before installing Thuki.

Option A: Local Ollama (Recommended for most users)

Ollama runs AI models directly on your Mac. It's free, open-source, and takes about 5 minutes to set up.

  1. Install Ollama

    Download and install from ollama.com, or via Homebrew:

    brew install ollama
  2. Pull a model

    ollama pull gemma4:e2b

    Note: Model files are large (typically 2–8 GB). This step can take several minutes depending on your internet connection. You only need to do it once.

  3. Verify the model is ready

    ollama list

    You should see your model listed. Once it appears, Ollama is ready and Thuki will connect to it automatically at http://127.0.0.1:11434.
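If you prefer a scriptable check, Ollama also exposes the installed models over HTTP at `/api/tags` on the same address; a small sketch (the endpoint is Ollama's documented API, the helper name is ours):

```shell
# check_ollama: print "up" or "down" depending on whether an Ollama server
# answers at the given base URL. Helper name is ours; /api/tags is Ollama's
# standard "list installed models" endpoint.
check_ollama() {
  if curl -fsS --max-time 3 "$1/api/tags" >/dev/null 2>&1; then
    echo "up"
  else
    echo "down"
  fi
}

check_ollama "http://127.0.0.1:11434"
```

If this prints `up`, Thuki will be able to connect as well.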

Option B: Docker Sandbox (For security-conscious users)

Prerequisites: Install Docker Desktop

The Docker sandbox is for users who want the strongest possible isolation between the AI model and their host system, ideal if you work in regulated environments, are security-conscious about what runs on your machine, or simply want peace of mind. The model runs in a hardened container that cannot reach the internet, cannot write to your filesystem, and leaves no trace when stopped.

Start the sandbox:

bun run sandbox:start

First run: The sandbox will pull the model inside the container; this may take several minutes depending on your connection. Subsequent starts are instant.

When you're done, stop and wipe all model data:

bun run sandbox:stop

For the full architecture and security philosophy behind the sandbox, see sandbox/README.md.

Step 2: Set up the search sandbox (Optional, required for /search)

The /search command uses an agentic search pipeline that depends on two local Docker containers: a SearXNG meta-search engine and a Trafilatura reader. This setup ensures that your search queries and the content you read remain entirely local.

Prerequisite: Docker Desktop must be running.

  1. Start the search services

    bun run search-box:start
  2. Verify services (Optional)

    # Search Engine check:
    curl "http://127.0.0.1:25017/search?q=thuki&format=json"

    Without this service running, the /search command will be disabled in the chat, but all other features will remain available.

    For more details on the agentic search pipeline, see docs/agentic-search.md.

Step 3: Install Thuki

Download (Recommended)

  1. Download Thuki.dmg from the latest stable release, or grab the bleeding-edge build from the nightly channel, which is rebuilt automatically from main.

  2. Double-click Thuki.dmg to open it. A window appears showing the Thuki app icon next to an Applications folder shortcut.

  3. Drag Thuki onto the Applications folder shortcut.

  4. Eject the disk image (drag it to Trash in the Finder sidebar, or right-click and choose Eject).

  5. Before opening Thuki for the first time, run this command in Terminal:

    xattr -rd com.apple.quarantine /Applications/Thuki.app

    Why is this needed? Thuki is a free, non-profit, open-source app distributed directly and not through the Mac App Store. Apple's Gatekeeper automatically blocks any app downloaded from the internet that has not gone through Apple's paid notarization process. This one-time command removes that block. It is safe and officially documented by Apple.

  6. Open Thuki. It will appear in your menu bar.

First launch: macOS will ask for Accessibility permission. This is required for the global keyboard shortcut that lets you summon Thuki from any app. Grant it once; it persists across restarts.

Build from Source

Prerequisites: Bun, Rust, and optionally Docker

# Clone and install dependencies
git clone https://github.com/quiet-node/thuki.git
cd thuki
bun install

# Launch in development mode
bun run dev

See CONTRIBUTING.md for the full development setup guide.

Architecture & Security

Click to expand

Thuki is a Tauri v2 app (Rust backend + React/TypeScript frontend) that interfaces with a locally running Ollama instance at http://127.0.0.1:11434.

Dual-Layer Isolation

  1. Frontend (Tauri/React): Operates within a secure system webview with restricted IPC. Streaming uses Tauri's Channel API; the Rust backend sends typed StreamChunk enum variants, and the frontend hook accumulates tokens into React state.

  2. Generative Engine (Docker Sandbox):

    • Ingress Isolation: The API is bound to 127.0.0.1 only, blocking all external network access
    • Privilege Dropping: All Linux kernel capabilities are dropped (cap_drop: ALL)
    • Model Integrity: Model weights are mounted read-only (:ro) to prevent tampering
    • Ephemeral State: All model data is purged on shutdown via docker compose down -v
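The four bullets above map onto standard Docker Compose hardening options; a hypothetical sketch of how they might be expressed (service name, image tag, and mount path are assumptions, not the repo's actual compose file - see sandbox/README.md for that):

```yaml
# Hypothetical sketch of the hardening described above; not the actual compose file.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "127.0.0.1:11434:11434"  # ingress isolation: bound to localhost only
    cap_drop:
      - ALL                      # privilege dropping: no Linux capabilities
    volumes:
      - models:/root/.ollama:ro  # model integrity: weights mounted read-only

volumes:
  models:                        # purged by `docker compose down -v` (ephemeral state)
```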

Window Lifecycle

The app starts hidden; the global hotkey or the tray menu shows it. The window close button hides the window rather than quitting; Quit is available only from the tray menu. ActivationPolicy::Accessory hides the Dock icon, and macOSPrivateApi: true enables the NSPanel used to overlay fullscreen apps.

Configuration

Thuki reads a single typed TOML file at ~/Library/Application Support/com.quietnode.thuki/config.toml, seeded with sensible defaults on first launch. The in-app Settings panel (⌘,) writes to the same file, so you can edit by hand or click through tabs, whichever you prefer.
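Because hand edits and the Settings panel write to the same file, it can be worth snapshotting the config before experimenting; a small sketch (the path comes from the paragraph above, the helper itself is hypothetical):

```shell
# backup_config: copy a config file to <file>.bak before hand-editing.
# The path passed below is Thuki's documented config location; the helper
# function is a hypothetical convenience, not part of Thuki.
backup_config() {
  if [ -f "$1" ]; then
    cp "$1" "$1.bak" && echo "backed up to $1.bak"
  else
    echo "not found: $1"
  fi
}

backup_config "$HOME/Library/Application Support/com.quietnode.thuki/config.toml"
```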

See docs/configurations.md for the full schema covering the [inference], [prompt], [window], [quote], and [search] sections (Ollama URL, system prompt, context window, image cap, agentic-search timeouts, and more).

See docs/commands.md for the full slash command reference, and docs/tuning-context-window.md for guidance on picking a num_ctx value.

Contributing

Contributions are welcome! Read CONTRIBUTING.md to get started. Please follow the Code of Conduct.

Community Ports

Thuki is macOS-only, but the community has been busy bringing it to other platforms. Huge shoutout to these contributors πŸŽŠπŸš€!

| Platform | Repo | Author |
| --- | --- | --- |
| Windows 10/11 | ThukiWin | @ayzekhdawy |

Each port is independently maintained by its author. For issues or questions about a specific port, head to that repo directly.

Author

Reach out to Logan on X with questions or feedback.

What's next for Thuki

Thuki is just getting started. Here's where it's headed:

Secretary Superpowers

The big leap: from answering questions to taking action.

  • Tool integrations via MCP: connect Thuki to Gmail, Slack, Discord, Google Calendar, and any other MCP-compatible service; ask it to draft a reply, summarize a thread, or schedule a meeting without ever leaving your current app
  • More slash commands: more domain-specific commands on top of the existing /search, /screen, /think, /translate, /rewrite, /tldr, /refine, /bullets, and /todos

Better AI Control

More flexibility over the model powering Thuki.

  • Multiple provider support: opt in to OpenAI, Anthropic, or any OpenAI-compatible endpoint as an alternative to local Ollama
  • Custom activation shortcut: change the double-tap trigger to any key or combo you prefer

Richer Context

Give Thuki more to work with.

  • Voice input: dictate your question instead of typing
  • Auto-capture screen context: activate Thuki and have it automatically read the active window or selected region as context (partial: /screen captures the full screen today; targeted region capture is next)
  • File and document drop: drag a PDF, image, or text file directly into Thuki as context for your question

Have a feature idea? Open an issue and let's talk about it.

License

Copyright 2026 Logan Nguyen. Licensed under the Apache License, Version 2.0.
