# Getting Started
This guide walks you through installing Integradio, configuring Ollama for local embeddings, and building your first semantically searchable Gradio app.
## Prerequisites

- Python 3.10+
- Gradio 4.0+ (compatible with Gradio 5.x and 6.x)
- Ollama installed and running locally
## Install Integradio

```bash
# Basic installation
pip install integradio

# With all optional dependencies (visualization, FastAPI routes, etc.)
pip install "integradio[all]"

# Development installation (from source)
pip install -e ".[dev]"
```

## Set up Ollama

Integradio uses Ollama to generate vector embeddings locally. No cloud APIs are required.

```bash
# Install Ollama from https://ollama.ai/
# Then pull the embedding model:
ollama pull nomic-embed-text

# Start the Ollama server (if not already running):
ollama serve
```

The default embedding model is `nomic-embed-text`. It runs locally (GPU-accelerated when one is available) and produces 768-dimensional vectors suitable for semantic search.
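Embedding-based search works by comparing the angle between vectors: texts with similar meaning get vectors pointing in similar directions. As a self-contained illustration (independent of Ollama and Integradio, using toy 4-dimensional vectors in place of real 768-dimensional embeddings), here is the cosine-similarity measure typically used for this comparison:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy stand-ins for real embedding vectors.
query = [0.9, 0.1, 0.0, 0.2]
doc_a = [0.8, 0.2, 0.1, 0.3]   # similar direction -> high score
doc_b = [0.0, 0.9, 0.8, 0.0]   # different direction -> low score

print(round(cosine_similarity(query, doc_a), 3))
print(round(cosine_similarity(query, doc_b), 3))
```

With real embeddings, the vectors come from the model rather than being written by hand, but the ranking step is the same idea.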
## Wrap your first component

The semantic() function wraps any Gradio component with an intent string. This intent gets embedded into a vector, making the component discoverable by meaning.

```python
import gradio as gr
from integradio import SemanticBlocks, semantic

with SemanticBlocks() as demo:
    # Wrap a Textbox with semantic intent
    query = semantic(
        gr.Textbox(label="Search Query"),
        intent="user enters search terms",
    )

    # Wrap a Button
    search_btn = semantic(
        gr.Button("Search"),
        intent="triggers the search operation",
    )

    # Wrap an output area
    results = semantic(
        gr.Markdown(),
        intent="displays search results",
    )

    # Wire up events as usual
    search_btn.click(fn=lambda q: f"Results for: {q}", inputs=query, outputs=results)

demo.launch()
```
## Search by meaning

Once components are wrapped, you can search for them semantically:

```python
# Find all components related to "user input"
matches = demo.search("user input", k=5)
for match in matches:
    print(f"{match.label} (score: {match.score:.3f})")

# Get the single best match
best = demo.find("where does the user type?")
print(best.label)  # "Search Query"

# Print a summary of all registered components
print(demo.summary())
```
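Conceptually, search works by embedding your query and ranking every registered intent by vector similarity. Integradio's actual implementation is not shown here; the following standalone sketch mimics the idea with a crude bag-of-words "embedding" (the registry labels and intents are taken from the example above) so you can see the ranking mechanics:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Crude stand-in for a real embedding model: bag-of-words counts.
    return Counter(text.lower().split())

def similarity(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical registry mirroring the intents from the example above.
registry = {
    "Search Query": "user enters search terms",
    "Search": "triggers the search operation",
    "Results": "displays search results",
}

def search(query: str, k: int = 5) -> list[tuple[str, float]]:
    q = embed(query)
    ranked = sorted(
        ((label, similarity(q, embed(intent))) for label, intent in registry.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )
    return ranked[:k]

print(search("user search terms", k=2))
```

A real embedding model replaces `embed()` and captures meaning rather than word overlap, which is what lets a query like "where does the user type?" match an intent that shares no words with it.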
## Configuration options

SemanticBlocks accepts several configuration parameters:

```python
with SemanticBlocks(
    db_path=None,        # SQLite path for persistent storage (None = in-memory)
    cache_dir=None,      # Directory for embedding cache
    ollama_url="http://localhost:11434",  # Ollama server URL
    embed_model="nomic-embed-text",       # Embedding model name
) as demo:
    ...
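Persistent storage and an embedding cache matter because embedding the same intent strings on every restart is wasted work. Integradio's actual storage schema is not documented here; the following is a toy sketch of the general pattern, a SQLite cache keyed by model and text, using only the Python standard library:

```python
import hashlib
import json
import sqlite3

class EmbeddingCache:
    """Toy persistent embedding cache; illustrative only, not Integradio's schema."""

    def __init__(self, db_path: str = ":memory:") -> None:
        self.conn = sqlite3.connect(db_path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS embeddings (key TEXT PRIMARY KEY, vector TEXT)"
        )

    def _key(self, model: str, text: str) -> str:
        # Same model + same text -> same key, so re-embedding is skipped.
        return hashlib.sha256(f"{model}:{text}".encode()).hexdigest()

    def get(self, model: str, text: str):
        row = self.conn.execute(
            "SELECT vector FROM embeddings WHERE key = ?", (self._key(model, text),)
        ).fetchone()
        return json.loads(row[0]) if row else None

    def put(self, model: str, text: str, vector) -> None:
        self.conn.execute(
            "INSERT OR REPLACE INTO embeddings VALUES (?, ?)",
            (self._key(model, text), json.dumps(vector)),
        )
        self.conn.commit()

cache = EmbeddingCache()  # pass a file path instead of ":memory:" to persist
cache.put("nomic-embed-text", "user enters search terms", [0.1, 0.2, 0.3])
print(cache.get("nomic-embed-text", "user enters search terms"))
```

Keying on both the model name and the text ensures that switching `embed_model` never serves stale vectors from a different model.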
## Next steps

- Learn about Semantic Wrappers for complex components like Chatbot, ImageEditor, and plots
- Explore Page Templates for ready-to-use layouts
- Set up Visualization to see your component graphs