Data Flow

Message Processing Flow

  1. User Message Submission:

    • A user sends a message via the chat interface.

  2. Routing via API Gateway:

    • The API Gateway routes the message to the backend server (see the gateway sketch after this list).

  3. Hybrid Inference Processing:

    • The Hybrid Inference Engine selects the appropriate inference backend (OpenAI, Gemini, Anthropic API, local LLM, RAG, or external connector) and processes the message (a backend-selection sketch follows this list).

  4. Quality Scoring:

    • A quality scoring pass evaluates the message concurrently with response generation (see the scoring-and-reward sketch after this list).

  5. Response Delivery:

    • The agent’s response is returned to the chat interface.

  6. Memory Synchronization:

    • For imported agents, the memory sharing protocol updates the agent’s memory state (see the memory synchronization sketch after this list).

  7. Reward Trigger:

    • High-quality messages trigger token rewards via the Reward Distribution Module.
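
Steps 1, 2, and 5 describe the transport path: the chat interface submits a message, the API Gateway forwards it to the backend server, and the backend’s response is relayed back to the interface. The following is a minimal TypeScript sketch of that path, assuming a Node runtime with a global fetch and a hypothetical BACKEND_URL; it is illustrative only, not the project’s actual gateway code.

  import { createServer } from "node:http";

  // Assumed backend address; the real deployment topology is not specified here.
  const BACKEND_URL = process.env.BACKEND_URL ?? "http://localhost:8080/chat";

  // Step 2: the gateway forwards chat messages to the backend server
  // and relays the response back to the chat interface (step 5).
  createServer(async (req, res) => {
    if (req.method !== "POST" || req.url !== "/chat") {
      res.writeHead(404).end();
      return;
    }
    const chunks: Buffer[] = [];
    for await (const chunk of req) chunks.push(chunk as Buffer);

    const upstream = await fetch(BACKEND_URL, {
      method: "POST",
      headers: { "content-type": "application/json" },
      body: Buffer.concat(chunks),
    });

    res.writeHead(upstream.status, { "content-type": "application/json" });
    res.end(await upstream.text());
  }).listen(3000);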
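
Step 3 can be pictured as a dispatch over the configured providers. The sketch below is an assumption-laden illustration: the InferenceBackend interface, the selectBackend function, and the routing heuristics are placeholders, since this document does not describe how the Hybrid Inference Engine actually chooses a backend.

  // Hypothetical types; the engine's real interfaces are not shown in this document.
  type BackendKind = "openai" | "gemini" | "anthropic" | "local-llm" | "rag" | "connector";

  interface InferenceBackend {
    kind: BackendKind;
    complete(prompt: string): Promise<string>;
  }

  interface RoutingContext {
    requiresRetrieval: boolean;   // message needs document grounding (RAG)
    requiresConnector: boolean;   // message targets an external system
    offlineOnly: boolean;         // deployment restricted to the local LLM
  }

  // Pick one backend per message; the order of checks is an assumed heuristic.
  function selectBackend(
    ctx: RoutingContext,
    backends: Map<BackendKind, InferenceBackend>,
  ): InferenceBackend {
    if (ctx.requiresConnector && backends.has("connector")) return backends.get("connector")!;
    if (ctx.requiresRetrieval && backends.has("rag")) return backends.get("rag")!;
    if (ctx.offlineOnly && backends.has("local-llm")) return backends.get("local-llm")!;
    // Fall back to the first configured hosted API (OpenAI, Gemini, or Anthropic).
    for (const kind of ["openai", "gemini", "anthropic"] as const) {
      const backend = backends.get(kind);
      if (backend) return backend;
    }
    throw new Error("No inference backend configured");
  }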
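
Steps 4 and 7 together imply that quality scoring runs alongside inference and that the resulting score gates reward distribution. A minimal sketch under those assumptions follows; scoreQuality, distributeReward, and the 0.8 threshold are hypothetical placeholders rather than the actual scoring or reward APIs.

  // All names and the 0.8 threshold below are illustrative assumptions.
  interface ScoredExchange {
    response: string;
    qualityScore: number; // assumed to be normalized to [0, 1]
  }

  async function processMessage(
    message: string,
    backend: { complete(prompt: string): Promise<string> },
    scoreQuality: (message: string) => Promise<number>,
    distributeReward: (message: string, score: number) => Promise<void>,
  ): Promise<ScoredExchange> {
    // Step 4: run inference and quality scoring concurrently.
    const [response, qualityScore] = await Promise.all([
      backend.complete(message),
      scoreQuality(message),
    ]);

    // Step 7: high-quality messages trigger token rewards.
    const REWARD_THRESHOLD = 0.8; // assumed cutoff, not specified in this document
    if (qualityScore >= REWARD_THRESHOLD) {
      await distributeReward(message, qualityScore);
    }

    return { response, qualityScore };
  }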
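
Step 6 applies only to imported agents. The memory synchronization sketch below assumes a simple append-then-push shape for the memory sharing protocol; MemoryStore, pushSharedState, and the entry format are hypothetical, since the protocol itself is not specified in this section.

  // Hypothetical memory-sharing shapes; the actual protocol is not detailed here.
  interface MemoryEntry {
    role: "user" | "agent";
    content: string;
    timestamp: number;
  }

  interface MemoryStore {
    append(agentId: string, entry: MemoryEntry): Promise<void>;
    snapshot(agentId: string): Promise<MemoryEntry[]>;
  }

  // After an exchange, record it locally and, for imported agents,
  // push the updated state through the memory sharing protocol (step 6).
  async function synchronizeMemory(
    agentId: string,
    isImported: boolean,
    exchange: { message: string; response: string },
    store: MemoryStore,
    pushSharedState: (agentId: string, entries: MemoryEntry[]) => Promise<void>,
  ): Promise<void> {
    const now = Date.now();
    await store.append(agentId, { role: "user", content: exchange.message, timestamp: now });
    await store.append(agentId, { role: "agent", content: exchange.response, timestamp: now });

    if (isImported) {
      const entries = await store.snapshot(agentId);
      await pushSharedState(agentId, entries);
    }
  }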