InControl
Open source

Your AI. Your machine.

A privacy-first, GPU-accelerated chat application that runs large language models entirely on your Windows PC. No cloud required.

Install

Download MSIX from Releases → double-click → launch

Build

git clone … && dotnet restore && dotnet build

Run

dotnet run --project src/InControl.App

Features

Local AI chat that respects your privacy.

Private by default

Your conversations never leave your computer. All data stored locally — no cloud, no telemetry.

RTX-accelerated

Built for NVIDIA GPUs with CUDA acceleration. Targets RTX 3060+ with 8–16GB VRAM.

Native Windows

WinUI 3 with Fluent Design. Looks and feels like a real Windows app, not an Electron wrapper.

Multi-backend

Ollama, llama.cpp, or bring your own. Swap backends without changing your workflow.
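Since the app is built on Microsoft.Extensions.Configuration, backend selection is the kind of setting that would naturally live in an appsettings.json. A minimal sketch, assuming hypothetical key names ("Inference", "Backend", and "Endpoint" are illustrative, not InControl's actual schema):

```json
{
  "Inference": {
    "Backend": "ollama",
    "Endpoint": "http://localhost:11434"
  }
}
```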

Markdown rendering

Rich text, code blocks, and syntax highlighting in every response.

NuGet libraries

Core and Inference packages available on NuGet for building your own local AI integrations.

Installation

From Release (recommended)

# 1. Download latest MSIX from GitHub Releases
# 2. Double-click to install
# 3. Launch from Start Menu

# Prerequisite: Ollama
# https://ollama.ai/download
ollama pull llama3.2
ollama serve
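Before launching the app, you can confirm the Ollama server is actually reachable on its default port. A small shell check (assumes curl is installed; /api/tags is Ollama's model-listing endpoint, so it doubles as a liveness probe):

```shell
# Probe Ollama's HTTP API on its default port (11434).
# /api/tags lists installed models, so it doubles as a liveness check.
if curl -s --max-time 2 http://localhost:11434/api/tags > /dev/null; then
  status="Ollama is running"
else
  status="Ollama is not reachable; start it with 'ollama serve'"
fi
echo "$status"
```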

From Source

git clone https://github.com/mcp-tool-shop-org/InControl-Desktop.git
cd InControl-Desktop
dotnet restore
dotnet build

# Run (requires Ollama running locally)
dotnet run --project src/InControl.App

NuGet Packages

Add packages

dotnet add package InControl.Core
dotnet add package InControl.Inference
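Equivalently, the packages can be referenced directly in your .csproj. The wildcard version below is a placeholder; pin to the latest release on NuGet:

```xml
<ItemGroup>
  <PackageReference Include="InControl.Core" Version="*" />
  <PackageReference Include="InControl.Inference" Version="*" />
</ItemGroup>
```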

Use in your app

// Stream chat with a local LLM.
// inferenceClientFactory is resolved from your DI container;
// messages is your chat history.
var client = inferenceClientFactory.Create("ollama");
await foreach (var token in client.StreamChatAsync(messages))
{
    Console.Write(token);
}

Target Hardware

Component | Minimum          | Recommended
GPU       | RTX 3060 (8GB)   | RTX 4080/5080 (16GB)
RAM       | 16GB             | 32GB
OS        | Windows 10 1809+ | Windows 11
.NET      | 9.0              | 9.0
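To check your machine against the GPU row, nvidia-smi (shipped with the NVIDIA driver) can report the device name and total VRAM. A quick sketch that degrades gracefully when no driver is present:

```shell
# Report GPU name and total VRAM; compare against the 8GB minimum.
if command -v nvidia-smi > /dev/null; then
  gpu_info=$(nvidia-smi --query-gpu=name,memory.total --format=csv,noheader)
else
  gpu_info="nvidia-smi not found; install the NVIDIA driver first"
fi
echo "$gpu_info"
```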

Architecture

Layer           | Technology
UI Framework    | WinUI 3 (Windows App SDK 1.6)
Architecture    | MVVM with CommunityToolkit.Mvvm
LLM Integration | OllamaSharp, Microsoft.Extensions.AI
DI Container    | Microsoft.Extensions.DependencyInjection
Configuration   | Microsoft.Extensions.Configuration
Logging         | Microsoft.Extensions.Logging + Serilog