Fam Zheng e4ba385112 refactor: worker mode — server offloads all LLM/exec to worker
- Split into `tori server` / `tori worker` subcommands (clap derive)
- Extract lib.rs for shared crate (agent, llm, exec, state, etc.)
- Introduce AgentUpdate channel to decouple agent loop from DB/broadcast
- New sink.rs: AgentUpdate enum + ServiceManager + handle_agent_updates
- New worker_runner.rs: connects to server WS, runs full agent loop
- Expand worker protocol: ServerToWorker (workflow_assign, comment)
  and WorkerToServer (register, result, update)
- Remove LLM from title generation (heuristic) and template selection
  (must be explicit)
- Remove KB tools (kb_search, kb_read) and remote worker tools
  (list_workers, execute_on_worker) from agent loop
- run_agent_loop/run_step_loop now take mpsc::Sender<AgentUpdate>
  instead of direct DB pool + broadcast sender
2026-04-06 12:54:57 +01:00

Tori — AI Agent Workflow Manager

A workflow-management web app driven by AI agents. Describe a requirement, the AI plans it, agents execute it, and you can give feedback at any point via comments.

Quick Start

# Development mode (starts frontend and backend together)
make dev

# Build the production release
make build

# Deploy to the OCI ARM server
make deploy

Configuration

cp config.yaml.example config.yaml
# Edit config.yaml and fill in the LLM API key, etc.
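The example file itself is not shown here; a minimal sketch of what `config.yaml` plausibly contains, where every key name is an assumption (the shipped `config.yaml.example` is authoritative):

```yaml
# Hypothetical layout -- check config.yaml.example for the real keys.
llm:
  api_key: "sk-..."                    # your LLM API key
  base_url: "https://example-gateway/v1"  # an OpenAI-compatible endpoint
```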

Tech Stack

  • Backend: Rust (Axum) + SQLite
  • Frontend: Vite + Vue 3 + TypeScript
  • LLM: OpenAI-compatible API (via the Requesty.ai gateway)
  • Real-time communication: WebSocket
  • Remote execution: SSH