Wegent

Wegent is an open-source AI-native operating system for defining, organizing, and running agentic AI. It provides declarative YAML-driven agent teams, multiple execution engines, isolated sandboxed workspaces, advanced collaboration modes, and integrations for coding workflows and web search.

Introduction

Overview

Wegent is an open-source platform designed to enable developers and teams to define, organize, and run intelligent agents at scale. Built with a Kubernetes-style declarative API and CRD-inspired resource model, Wegent treats agents and their components (Ghost, Model, Shell, Bot, Team) as first-class resources that can be configured via YAML and managed through a web UI.
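
To make the resource model concrete, the sketch below shows how the five resource kinds might compose into a single Team. It is a minimal, hypothetical illustration: the apiVersion, field names, and values are assumptions chosen for readability, not Wegent's documented schema.

# Hypothetical sketch of Wegent's CRD-inspired resources.
# The apiVersion, field names, and values here are illustrative assumptions.
apiVersion: wegent.example/v1        # assumed API group/version
kind: Ghost                          # personality / system prompt
metadata:
  name: reviewer-ghost
spec:
  systemPrompt: "You are a meticulous code reviewer."
---
apiVersion: wegent.example/v1
kind: Model                          # model configuration
metadata:
  name: claude-model
spec:
  provider: anthropic
  model: claude-sonnet               # placeholder model name
---
apiVersion: wegent.example/v1
kind: Shell                          # executable agent program / runtime
metadata:
  name: claude-code-shell
spec:
  runtime: claude-code
---
apiVersion: wegent.example/v1
kind: Bot                            # Ghost + Model + Shell
metadata:
  name: reviewer-bot
spec:
  ghostRef: reviewer-ghost
  modelRef: claude-model
  shellRef: claude-code-shell
---
apiVersion: wegent.example/v1
kind: Team                           # user-facing agent composed of Bots
metadata:
  name: code-review-team
spec:
  collaboration: leader              # e.g. parallel | leader | solo
  bots:
    - reviewer-bot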

Core Capabilities
  • Configuration-driven agent teams: Define Teams (user-facing agents) using YAML; Teams are composed of Bots, which in turn combine a Ghost (personality), a Model (model configuration), and a Shell (executable agent program).
  • Multiple execution engines: Supports multiple runtimes/executors, including Claude Code, Agno, Dify, and a Chat Shell that can call LLM APIs directly (OpenAI, Claude, Gemini).
  • Isolated sandboxes & workspaces: Each agent team runs inside an isolated workspace so multiple teams can run concurrently without interference.
  • Advanced collaboration modes: Provides collaboration models such as parallel, leader-based, and solo modes to orchestrate multi-agent workflows for tasks like news analysis or complex data processing.
  • AI coding integration: Integrates with GitHub/GitLab to provide browser-based coding assistants, isolated development environments, and multi-agent coding workflows (e.g., automated code review, implementation tasks).
  • Web search & retrieval: Optional web search adapter supporting SearXNG, Google Custom Search, Bing, Brave, etc., for real-time information retrieval in Chat Shell teams.
Architecture Highlights

Wegent separates concerns into management, data, execution, and agent layers:

  • Frontend: Next.js web UI for team/agent management and workspace interactions.
  • Backend: FastAPI declarative API and orchestration layer.
  • Data layer: MySQL for persistent storage.
  • Execution layer: ExecutorManager and multiple Executors that run agent programs and communicate with model runtimes.
  • Agent runtimes: Integrations with Claude Code, Agno, Dify, and direct LLM API access.

The project also ships a kubectl-style CLI (wectl) and provides Docker/Docker Compose configs for quick local launches.

Quick Start (summary)
  1. Clone the repository and start the stack with Docker Compose.
  2. Open the web UI (default: http://localhost:3000).
  3. Configure provider API keys (OpenAI, Anthropic/Claude, Google for Gemini) via environment variables; a hedged example follows this list.
  4. Use the built-in default chat Team to try an instant chat or configure custom YAML Teams for production use.
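
For step 3, provider keys are usually injected through the container environment. The snippet below is a minimal Docker Compose override sketch that assumes conventional variable names (OPENAI_API_KEY, ANTHROPIC_API_KEY, GOOGLE_API_KEY) and a hypothetical executor service name; the repository's own compose files and environment examples are authoritative.

# docker-compose.override.yml -- hypothetical sketch for supplying provider keys.
# The service name and variable names are assumptions; check the repository's
# compose files and .env examples for the exact names it expects.
services:
  executor:
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}   # Claude models
      - GOOGLE_API_KEY=${GOOGLE_API_KEY}         # Gemini models
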
Use Cases
  • Instant AI chat and multi-agent conversational systems.
  • Browser-based coding assistants integrated with GitHub for automated implementation and reviews.
  • News aggregation and multi-agent analysis pipelines.
  • DevOps automation and CI/CD tasks using agent-driven workflows.
  • Research assistants for literature review and synthesis using collaboration patterns.
Extensibility & Developer Experience

Wegent is designed to be extended via new Shell executors and collaboration models. The codebase is split into backend (FastAPI), frontend (Next.js), executor components, and shared utilities, with a development guide and tests provided. It supports environment-variable-driven model authentication and documents the differences between runtimes (e.g., Claude vs. Agno vs. Chat Shell variable names).

Contribution & Community

The repository includes contributor guidance (CONTRIBUTING.md), issue tracker, and a list of active contributors. The project is structured for contributions across backend, frontend, and executor modules.

Summary

Wegent provides a declarative, extensible framework for orchestrating agent-based AI applications, focusing on reproducible configuration, sandboxed execution, multi-runtime support, and developer-friendly integrations for coding and information retrieval.

Information

  • Website: github.com
  • Authors: WeCode-AI Team
  • Published date: 2025/09/04