HomeGenie
SERVER 2.0 — Documentation

Local AI

HomeGenie 2.0 introduces the era of fully local Agentic AI. By combining Generative Intelligence with a Programmable UI, your server evolves into an autonomous coordinator. It doesn't just respond to commands: it reasons, suggests, and acts, all while running 100% offline.

Meet Lailama

Lailama is HomeGenie’s Neural Core. It runs state-of-the-art GGUF models locally, transforming a simple chatbot into an Autonomous Agent capable of managing your entire Intelligent System.

Edge Intelligence

The heart of this system is the HomeGenie Fluent API Bus, a high-level abstraction layer that allows the Lailama Reasoning Core to interact with the physical and digital world through a unified language.

Workflow:

  1. Intent Capture: Human language intents are received via chat, voice, or the Scheduler (recurring Agentic Tasks).
  2. Autonomous Reasoning: Lailama processes the intent locally, considering the real-time system context (weather, power, sensors).
  3. Fluent API Mapping: The AI's decision is translated into Universal Fluent API commands. This bus acts as a bridge between high-level logic and low-level execution.
  4. Hybrid Execution Layer:
    • Physical Nodes / Robots: Direct control of hardware protocols (ZigBee, MQTT, Z-Wave, GPIO) and kinetic robotic systems.
    • Digital Services: The Fluent API seamlessly manages WebServices, System Programs, and Automation Scripts.
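The four stages above can be sketched as a simple pipeline. This is an illustrative Python sketch, not HomeGenie source code: every function and command name here is hypothetical, and the real mapping runs through the C# Fluent API.

```python
# Hypothetical sketch of the intent -> reasoning -> mapping -> execution flow.
# All names are illustrative; HomeGenie's real pipeline is implemented in C#.

def capture_intent(text: str) -> dict:
    """Stage 1: wrap a natural-language request with its source channel."""
    return {"text": text, "channel": "chat"}

def reason(intent: dict, context: dict) -> str:
    """Stage 2: a stand-in for Lailama's local inference, which weighs the
    intent against real-time context (weather, power, sensors)."""
    if "light" in intent["text"] and context.get("lux", 100) < 50:
        return "turn_on_lights"
    return "no_op"

def map_to_fluent_api(decision: str) -> str:
    """Stage 3: translate the decision into a Fluent API command string."""
    commands = {"turn_on_lights": "Modules.OfDeviceType('Light').On()"}
    return commands.get(decision, "")

def execute(command: str) -> bool:
    """Stage 4: hand off to the hybrid execution layer (hardware or services)."""
    return bool(command)

intent = capture_intent("it's getting dark, light up the living room")
decision = reason(intent, {"lux": 12})
print(execute(map_to_fluent_api(decision)))  # True
```

The point of the sketch is the separation of concerns: the reasoning stage never talks to hardware directly; it only emits a decision that the bus can translate.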
graph TB
    %% Style definitions (Light Blueprint palette)
    classDef default font-family:sans-serif, color:#334155, white-space: nowrap;
    classDef edgeSystem fill:#f8fafc,stroke:#0ea5e9,stroke-width:2px,stroke-dasharray: 8 4;
    classDef reasoning fill:#fffbeb,stroke:#d97706,stroke-width:3px,color:#92400e;
    classDef execution fill:#f0f9ff,stroke:#0284c7,stroke-width:2px,color:#0369a1;
    classDef hardware fill:#ffffff,stroke:#0ea5e9,stroke-width:1px,stroke-dasharray: 4;
    classDef services fill:#ffffff,stroke:#7c3aed,stroke-width:1px,stroke-dasharray: 4;
    classDef privacy fill:#f0fdf4,stroke:#16a34a,stroke-width:2px,color:#15803d;
    classDef cloud fill:#fef2f2,stroke:#dc2626,stroke-width:2px,stroke-dasharray: 4,color:#b91c1c;
    classDef input fill:#ffffff,stroke:#64748b,stroke-width:1.5px;

    subgraph GLOBAL_TITLE [HomeGenie Agentic AI System v2.0]
        direction TB

        subgraph LOCAL_STACK [AI EDGE STACK]
            direction TB

            %% Input layer
            subgraph INPUT_LAYER [ ]
                direction LR
                USER[USER INTENT<br>VOICE/TEXT/UI]
                SCHED[SCHEDULER<br>GENIE COMMAND]
            end

            %% Core
            LAILAMA{{LAILAMA<br>AGENTIC REASONING}}

            %% Orchestrator
            BUS(FLUENT API BUS<br>HOMEGENIE API)

            %% Execution layer
            subgraph HYBRID_EXECUTION [HYBRID EXECUTION LAYER]
                direction LR
                subgraph HW [HARDWARE / ROBOTS]
                    direction TB
                    H1[● LIGHTING SYSTEMS]
                    H2[■ CLIMATE CONTROL]
                    H3[▲ KINETIC NODES]
                end
                subgraph DS [DIGITAL SERVICES]
                    direction TB
                    S1["{} CUSTOM SCRIPTING"]
                    S2["[] SYSTEM PROGRAMS"]
                    S3["<> EXTERNAL WEB APIs"]
                end
            end
        end

        %% Security wall
        WALL{{PRIVACY ISOLATION WALL}}

        %% External
        subgraph EXTERNAL [PUBLIC CLOUD]
            ZONE[UNTRUSTED ZONE<br>DATA HARVESTING<br>LATENCY RISKS]
        end
    end

    %% Connections
    USER --> LAILAMA
    SCHED --> LAILAMA
    LAILAMA ==>| Intent Mapping | BUS
    BUS --> HW
    BUS --> DS
    LOCAL_STACK -.-x| - | WALL
    WALL -.-x EXTERNAL

    %% Class assignments
    class LOCAL_STACK edgeSystem;
    class LAILAMA reasoning;
    class BUS execution;
    class HW hardware;
    class DS services;
    class USER,SCHED input;
    class WALL privacy;
    class EXTERNAL cloud;
    class H1,H2,H3,S1,S2,S3 default;

From Chat to Action

Agentic Tasks & Scheduling

The real power of Lailama lies in its ability to perform complex tasks autonomously.

Genie Command

In the HomeGenie Scheduler, set your trigger (time, date, or recurrence) using the standard calendar UI. Then, select the Genie Command action and simply type the goal you want to achieve in the text field.

Agentic-Widgets

The Lailama module emits tokens via the LLM.TokenStream property during inference. This token stream can be read by widgets (UI side) and by programs (backend), and can be used to trigger actions and create automations based on AI responses.
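As a sketch of the idea, a backend consumer could watch the token stream and fire an automation as soon as a keyword appears in the accumulated output. The class and trigger word below are hypothetical; the real mechanism is a subscription to the LLM.TokenStream property.

```python
# Hypothetical sketch of a backend token-stream consumer: tokens arrive one
# at a time during inference, are accumulated, and an automation fires the
# first time a trigger word shows up in the buffer.

class TokenStreamWatcher:
    def __init__(self, trigger_word: str):
        self.trigger_word = trigger_word
        self.buffer = ""
        self.triggered = False

    def on_token(self, token: str) -> None:
        """Called once per token emitted during inference."""
        self.buffer += token
        if not self.triggered and self.trigger_word in self.buffer:
            self.triggered = True
            self.run_automation()

    def run_automation(self) -> None:
        # Placeholder for a real action, e.g. a Fluent API call.
        print("automation fired")

watcher = TokenStreamWatcher("ALERT")
for tok in ["All ", "clear... ", "wait, ", "AL", "ERT", " in zone 2"]:
    watcher.on_token(tok)
```

Note that matching happens on the accumulated buffer, not on individual tokens, since a keyword can be split across token boundaries.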

Developer API

Lailama exposes several endpoints for deep integration:

| Endpoint | Description |
| --- | --- |
| `AI.IntentHandlers/Lailama/Prompt.Submit/{text}` | Stateful conversation, added to persistent history. |
| `AI.IntentHandlers/Lailama/Prompt.Schedule/{text}` | Background Agentic Task (ephemeral session). |
| `AI.IntentHandlers/Lailama/Prompt.Cancel` | Aborts the current AI generation. |
| `AI.IntentHandlers/Lailama/Extract/{text}` | Parses text to execute embedded API commands. |
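A client would typically URL-encode the `{text}` argument before calling these endpoints. The sketch below only builds request URLs; the base address is an assumption (your server's host, port, and API prefix may differ), and only the endpoint paths come from the table above.

```python
# Sketch of building request URLs for the Lailama endpoints.
# BASE is an assumption; the endpoint paths come from the documentation table.
from urllib.parse import quote

BASE = "http://localhost:8080/api"  # assumed HomeGenie base URL

def lailama_url(action, text=None):
    """Return the full URL for a Lailama action, URL-encoding the text."""
    url = f"{BASE}/AI.IntentHandlers/Lailama/{action}"
    if text is not None:
        url += "/" + quote(text, safe="")
    return url

print(lailama_url("Prompt.Submit", "dim the lights to 20%"))
print(lailama_url("Prompt.Cancel"))
```

Encoding with `safe=""` ensures that slashes and percent signs inside the prompt text cannot be mistaken for path separators.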

Configuration

Lailama provides granular control over its runtime settings to maintain speed and stability.

Architecture

HomeGenie implements a modular Agentic AI Stack. Intelligence is not a "black box" but a distributed set of specialized system services. This architecture leverages HomeGenie’s core philosophy: treating every device, service, or script as a unified Module.

1. Everything is a Module

In HomeGenie, the Module is the universal abstraction. Whether it’s a physical ZigBee bulb, a DHT-22 sensor on a GPIO, or the Lailama Neural Core, they all exist as modules.
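A minimal sketch of that abstraction, in Python rather than HomeGenie's C#: the field names below mirror the general domain/address/properties shape but are illustrative, not HomeGenie's actual data model.

```python
# Illustrative sketch of "everything is a module": a ZigBee bulb, a GPIO
# sensor, and the Lailama core all share one uniform shape, so one query
# works across all of them. Field and property names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Module:
    domain: str                       # protocol or subsystem
    address: str                      # unique id within the domain
    properties: dict = field(default_factory=dict)

modules = [
    Module("HomeAutomation.ZigBee", "0x1A2B", {"Status.Level": 0.8}),
    Module("Components.GPIO", "DHT-22", {"Sensor.Temperature": 21.5}),
    Module("AI.IntentHandlers", "Lailama", {"LLM.TokenStream": ""}),
]

# One uniform query across hardware, sensors, and the AI core alike:
dimmables = [m for m in modules if "Status.Level" in m.properties]
```

Because the AI core is itself a module, the same discovery and property mechanisms that manage a bulb also manage Lailama.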

2. The Service Stack

  1. Intent Handlers (@AI:IntentHandlers): This is the routing brain. It provides a Unified Intent Interface that abstracts the AI provider. It receives natural language and routes it to the active engine (Lailama for local edge processing or Gemini for cloud-based reasoning). This ensures your automations stay valid even if you swap the underlying LLM.
  2. Context Engine (The AI's Senses): An AI is only as smart as its data. The Context Engine is a specialized system utility that performs a Real-time System Scan. It aggregates the state of all modules into a structured Markdown briefing. It populates templates like %%AVAILABLE_DEVICES%% and %%SYSTEM_STATUS%% so the AI "sees" the house (weather, energy loads, security) before it thinks.
  3. Chat History Service (Persistent Memory): This is a centralized, multi-store service for conversation logs. It decouples memory from the AI engine, allowing for Session Persistence. You can restart the Lailama program or switch models without the Agent "forgetting" the previous context of the conversation.
  4. Async Download Manager (Lifecycle Control): Handling multi-gigabyte GGUF models requires resilience. The Download Manager service handles asynchronous transfers with support for HTTP Range headers. This allows for pausing, resuming, and real-time progress tracking of neural weights directly within the HomeGenie UI.
  5. Lailama Neural Core (The Reasoning Engine): Built on the LLamaSharp framework, this program bridges HomeGenie to local LLMs. It manages the KV Cache and implements Intelligent Memory Pruning. By using session.LoadSession(resetState), Lailama maintains sub-5s response times by performing "Hard Resets" when the history exceeds the MaxTurns limit, preventing performance degradation.
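The memory-pruning idea in point 5 can be sketched as follows. This is not LLamaSharp code; the limits and the keep-recent policy are assumptions used only to illustrate the "hard reset when history exceeds MaxTurns" behavior.

```python
# Illustrative sketch of Intelligent Memory Pruning: when the turn count
# exceeds MaxTurns, perform a hard reset of the session state while keeping
# only the most recent exchanges. Limits below are assumptions.

MAX_TURNS = 6    # assumed MaxTurns limit
KEEP_RECENT = 2  # turns preserved across a hard reset (assumption)

def prune_history(history):
    """Return the (possibly pruned) history and whether a hard reset occurred."""
    if len(history) <= MAX_TURNS:
        return history, False
    # Hard reset: drop the KV-cache-backed context, keep only recent turns.
    return history[-KEEP_RECENT:], True

history = [(f"user {i}", f"reply {i}") for i in range(8)]
history, reset = prune_history(history)
print(reset, len(history))  # True 2
```

Bounding the history this way is what keeps inference latency flat instead of growing with every conversation turn.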

3. Programmable Agentic Logic

Lailama isn't hardcoded; it's a HomeGenie Automation Program. This means it uses the same APIs you use for simple tasks:
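To show the flavor of that shared fluent style, here is a tiny Python mimic of method chaining over a module collection. HomeGenie's real Fluent API is exposed to C# (and scripting) programs, and its actual method names may differ; everything below is illustrative only.

```python
# Hypothetical mimic of fluent-style chaining over modules: filter a
# selection, then issue a command against it. Method names are invented.

class ModuleSet:
    def __init__(self, items):
        self._items = items

    def with_feature(self, key):
        """Narrow the selection to modules exposing a property (chainable)."""
        return ModuleSet([m for m in self._items if key in m])

    def command(self, name):
        """Record a command against every module in the selection (chainable)."""
        for m in self._items:
            m["last_command"] = name
        return self

all_modules = [{"Status.Level": 0.0}, {"Sensor.Temperature": 20.1}]
ModuleSet(all_modules).with_feature("Status.Level").command("Control.On")
```

The same chain a user writes for a simple schedule is the chain the Agent emits after reasoning, which is why swapping the LLM does not invalidate existing automations.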

System Logic Flow

sequenceDiagram
    participant UI as UI / Scheduler
    participant L as Lailama Program
    participant CH as Chat History Service
    participant CE as Context Engine
    participant LS as LLamaSharp (Neural Core)

    UI->>L: Invoke Prompt.Submit / Schedule
    L->>CE: Request Rendered Context
    CE-->>L: Return Home Briefing
    L->>CH: Fetch/Sync History
    L->>LS: Run Inference (Briefing + History + Intent)
    LS-->>L: Stream Tokens / Response
    L->>L: Extract & Execute API Commands
    L->>CH: Update Persistent History
    L-->>UI: Final Response

Since Lailama is a native automation program, its core logic is fully accessible through the Integrated Code Editor directly within the HomeGenie UI.

You can refine how the Context Engine perceives the environment, tweak neural parameters, or define new Programmable Features for the AI to command without ever leaving the dashboard or using external development tools.

1. Full System Customization

Access and edit any automation program—including core system logic—directly from the UI for a truly tailored intelligent environment.



2. AI-Assisted Neural Tuning

Refine the Lailama C# engine in the integrated editor while collaborating with a dedicated AI developer assistant for real-time logic updates.



3. Real-Time UI Engineering

Build and preview bespoke AI widgets instantly with HTML, CSS, and zuix.js, bridging your neural core and user experience without external tools.



Hardware Requirements

