
Getting Started

This guide walks you through installing Integradio, configuring Ollama for local embeddings, and building your first semantically searchable Gradio app.

  • Python 3.10+
  • Gradio 4.0+ (compatible with Gradio 5.x and 6.x)
  • Ollama installed and running locally
# Basic installation
pip install integradio
# With all optional dependencies (visualization, FastAPI routes, etc.)
pip install "integradio[all]"
# Development installation (from source)
pip install -e ".[dev]"

Integradio uses Ollama to generate vector embeddings locally. No cloud APIs required.

# Install Ollama from https://ollama.ai/
# Then pull the embedding model:
ollama pull nomic-embed-text
# Start the Ollama server (if not already running):
ollama serve
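
Before continuing, you can confirm that the server is reachable and the model is pulled. A minimal sketch using only the standard library; it queries Ollama's /api/tags endpoint, which lists locally available models (check_ollama is a hypothetical helper name, not part of Integradio):

import json
import urllib.request
import urllib.error

def check_ollama(url="http://localhost:11434", timeout=2.0):
    """Return the list of locally available model names, or None if unreachable."""
    try:
        with urllib.request.urlopen(f"{url}/api/tags", timeout=timeout) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return None

models = check_ollama()
if models is None:
    print("Ollama is not running - start it with `ollama serve`")
elif not any(name.startswith("nomic-embed-text") for name in models):
    print("Model missing - run `ollama pull nomic-embed-text`")
else:
    print("Ollama is ready")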

The default embedding model is nomic-embed-text. It runs locally through Ollama (using your GPU when available) and produces 768-dimensional vectors suitable for semantic search.

The semantic() function wraps any Gradio component with an intent string. This intent gets embedded into a vector, making the component discoverable by meaning.

import gradio as gr
from integradio import SemanticBlocks, semantic

with SemanticBlocks() as demo:
    # Wrap a Textbox with semantic intent
    query = semantic(
        gr.Textbox(label="Search Query"),
        intent="user enters search terms",
    )

    # Wrap a Button
    search_btn = semantic(
        gr.Button("Search"),
        intent="triggers the search operation",
    )

    # Wrap an output area
    results = semantic(
        gr.Markdown(),
        intent="displays search results",
    )

    # Wire up events as usual
    search_btn.click(fn=lambda q: f"Results for: {q}", inputs=query, outputs=results)

demo.launch()

Once components are wrapped, you can search for them semantically:

# Find all components related to "user input"
matches = demo.search("user input", k=5)
for match in matches:
    print(f"{match.label} (score: {match.score:.3f})")

# Get the single best match
best = demo.find("where does the user type?")
print(best.label)  # "Search Query"

# Print a summary of all registered components
print(demo.summary())
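
Under the hood, a search compares the embedding of your query against the embeddings of each component's intent string. The score shown above is typically a cosine similarity; the sketch below illustrates the idea with hypothetical 4-dimensional vectors standing in for the real 768-dimensional embeddings (the vector values and variable names are illustrative only):

import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy stand-ins for intent embeddings
query_vec   = [0.9, 0.1, 0.0, 0.2]  # "user input"
textbox_vec = [0.8, 0.2, 0.1, 0.3]  # "user enters search terms"
button_vec  = [0.1, 0.9, 0.3, 0.0]  # "triggers the search operation"

print(cosine_similarity(query_vec, textbox_vec))  # high: the Textbox matches
print(cosine_similarity(query_vec, button_vec))   # low: the Button does not

Components whose intents are semantically close to the query score near 1.0 and rank first.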

SemanticBlocks accepts several configuration parameters:

with SemanticBlocks(
    db_path=None,                         # SQLite path for persistent storage (None = in-memory)
    cache_dir=None,                       # Directory for embedding cache
    ollama_url="http://localhost:11434",  # Ollama server URL
    embed_model="nomic-embed-text",       # Embedding model name
) as demo:
    ...
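
For example, a persistent setup might look like the sketch below. The paths are illustrative; based on the parameters above, setting db_path and cache_dir should let stored embeddings be reused across restarts instead of being recomputed on every launch:

from integradio import SemanticBlocks

# Illustrative values only - adjust paths to your project layout
with SemanticBlocks(
    db_path="app_components.db",   # keep the component registry in SQLite
    cache_dir=".embed_cache",      # reuse cached embeddings between runs
    ollama_url="http://localhost:11434",
    embed_model="nomic-embed-text",
) as demo:
    ...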