
Development Setup

| Tool | Version | Purpose |
| --- | --- | --- |
| Node.js | 22+ | Frontend build |
| pnpm | 10+ | Package manager |
| Rust | stable (1.75+) | Tauri backend |
| Tauri CLI | 2.x | Desktop app build |

On Windows, you also need the Visual Studio C++ Build Tools and WebView2 runtime (included in Windows 11).

```sh
git clone https://github.com/mcp-tool-shop-org/commandui.git
cd commandui
pnpm install
```

```sh
pnpm dev
```

The app opens at http://localhost:5176 and uses the mock bridge, so all backend operations are simulated. No Rust compilation is needed, which makes this mode ideal for UI development.
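To illustrate what "mock bridge" means here, the sketch below shows the general pattern: UI code invokes named commands, but in the browser preview they resolve against canned responses instead of the Tauri IPC layer. All names and response shapes are illustrative, not the project's actual API.

```typescript
// Hypothetical mock bridge: commands resolve against canned data
// instead of crossing into the Rust backend.
type Bridge = (command: string, payload?: unknown) => Promise<unknown>;

const mockResponses: Record<string, unknown> = {
  list_sessions: [{ id: "mock-1", title: "Demo session" }],
  run_command: { stdout: "simulated output\n", exitCode: 0 },
};

const mockBridge: Bridge = async (command) => {
  if (!(command in mockResponses)) {
    throw new Error(`no mock registered for command: ${command}`);
  }
  return mockResponses[command];
};

export { mockBridge };
```

In the desktop build, the same call sites would be wired to Tauri's real `invoke` instead, so UI components never need to know which environment they are running in.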

```sh
cd apps/desktop
pnpm tauri:dev
```

Compiles the Rust backend and launches the desktop app with hot reload. The first build takes several minutes (Rust compilation); subsequent builds are fast.

| Command | Scope | What it does |
| --- | --- | --- |
| `pnpm dev` | root | Vite dev server (browser preview) |
| `pnpm typecheck` | root | TypeScript check across all packages |
| `pnpm test` | root | Run all Vitest tests |
| `pnpm build` | root | Production build (TypeScript + Vite) |
| `cd apps/desktop && pnpm tauri:dev` | desktop | Full Tauri dev app |
| `cd apps/desktop && pnpm tauri:build` | desktop | Production desktop build |
| `cd apps/desktop/src-tauri && cargo test` | backend | Rust unit tests |

The app detects your shell:

- **Windows:** uses the `COMMANDUI_WINDOWS_SHELL` env var if set, otherwise falls back to PowerShell (`pwsh` or `powershell.exe`)
- **Unix:** reads the `$SHELL` environment variable
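The resolution order above can be sketched as a small function. The real logic lives in the Rust backend; the function name and default values here are illustrative assumptions.

```typescript
// Illustrative shell resolution, mirroring the documented order:
// explicit override on Windows, $SHELL on Unix, conservative fallbacks.
function resolveShell(
  platform: string,
  env: Record<string, string | undefined>,
): string {
  if (platform === "win32") {
    // Explicit override wins, then PowerShell (pwsh / powershell.exe).
    return env.COMMANDUI_WINDOWS_SHELL ?? "pwsh";
  }
  // Unix-likes: honor $SHELL if present.
  return env.SHELL ?? "/bin/sh";
}

export { resolveShell };
```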
| OS | Path |
| --- | --- |
| Windows | `%APPDATA%/com.commandui.desktop/` |
| macOS | `~/Library/Application Support/com.commandui.desktop/` |
| Linux | `~/.local/share/com.commandui.desktop/` |

The SQLite database is created automatically on first launch.
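For reference, the per-OS paths in the table line up as follows. Tauri resolves this directory natively through its path API, so this helper is purely illustrative.

```typescript
// Illustrative mapping of platform to app-data directory, matching the
// table above. The %APPDATA% default shown here is an assumption.
function appDataDir(
  platform: string,
  home: string,
  appData = `${home}/AppData/Roaming`, // typical %APPDATA% location
): string {
  const id = "com.commandui.desktop";
  if (platform === "win32") return `${appData}/${id}/`;
  if (platform === "darwin") return `${home}/Library/Application Support/${id}/`;
  return `${home}/.local/share/${id}/`; // Linux and other Unix-likes
}

export { appDataDir };
```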

```
commandui/
  apps/desktop/
    src/                    — React frontend
      app/AppShell.tsx      — Central orchestrator
      components/           — UI components
      features/             — Terminal/planner clients
      hooks/                — Custom React hooks
      lib/                  — Utilities, mock bridge, shortcuts
      styles/globals.css    — All CSS
    src-tauri/
      src/                  — Rust backend
        commands/           — Tauri command handlers
        ollama.rs           — LLM integration
        state.rs            — Shared app state
  packages/
    domain/                 — Pure TypeScript types
    api-contract/           — Request/response contracts
    state/                  — Zustand stores
    ui/                     — Shared UI primitives (future)
```

For real LLM-powered planning, install Ollama and pull a model:

```sh
ollama pull llama3.2
```

The backend connects to Ollama at http://localhost:11434 by default. If Ollama is unavailable, the planner falls back to mock responses.
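The fallback behavior can be sketched like this: try the LLM first, and degrade to a canned plan if the call fails. The function name, plan shape, and mock output are illustrative assumptions, not the project's actual contract.

```typescript
// Illustrative planner fallback: use the injected generate() function
// (e.g. an Ollama client) and return a mock plan if it is unreachable.
type Plan = { steps: string[]; source: "ollama" | "mock" };

async function planCommand(
  prompt: string,
  generate: (prompt: string) => Promise<string>,
): Promise<Plan> {
  try {
    const text = await generate(prompt);
    return { steps: text.split("\n").filter(Boolean), source: "ollama" };
  } catch {
    // Ollama unavailable: degrade to a simulated response.
    return { steps: [`echo "mock plan for: ${prompt}"`], source: "mock" };
  }
}

export { planCommand };
```

Injecting `generate` keeps the planner testable without a running Ollama instance.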