- gen_voice: IndexTTS2 voice cloning via tools/gen_voice script, ref audio
cached on server to avoid re-upload
- Message timestamps: created_at column in messages table, prepended to
content in API calls so LLM sees message times
- Image understanding: photos converted to base64 multimodal content
for vision-capable models
- Group chat: independent session contexts per chat_id, sendMessageDraft
disabled in groups (private chat only)
- Voice transcription: whisper service integration; transcribed text
  injected with a [语音消息] ("voice message") prefix
- Integration tests marked #[ignore] (require external services)
- Reference voice asset: assets/ref_voice.mp3
- .gitignore: target/, noc.service, config/state/db files
- "cc" prefix messages bypass LLM backend and history, directly invoke claude -p
- diag command now dumps all registered tools and sends them as a .md file
- system prompt instructs LLM to use spawn_agent for search tasks
- spawn_agent tool description updated to mention search/browser capabilities
- Configurable backend: claude (CLI) or openai (API), selected in config.yaml
- OpenAI streaming via SSE with conversation history in memory
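The backend switch above can be sketched as a plain enum dispatch; a minimal, dependency-free illustration (the `Backend` type and `from_config` name are hypothetical, not the bot's actual API):

```rust
/// Sketch of selecting the LLM backend from the config.yaml value.
/// Names here are illustrative; only the "claude"/"openai" keys come
/// from the notes above.
#[derive(Debug, PartialEq)]
enum Backend {
    Claude, // invoke the claude CLI
    OpenAi, // call the OpenAI API, streaming via SSE
}

impl Backend {
    fn from_config(name: &str) -> Result<Backend, String> {
        match name {
            "claude" => Ok(Backend::Claude),
            "openai" => Ok(Backend::OpenAi),
            other => Err(format!("unknown backend: {other}")),
        }
    }
}

fn main() {
    assert_eq!(Backend::from_config("claude"), Ok(Backend::Claude));
    assert_eq!(Backend::from_config("openai"), Ok(Backend::OpenAi));
    assert!(Backend::from_config("gemini").is_err());
    println!("ok");
}
```

An enum (rather than a string checked at call sites) makes the claude/openai split exhaustive at compile time.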
- Session isolation: config name included in session UUID
- Markdown to Telegram HTML conversion (pulldown-cmark) for final messages
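The conversion itself runs through pulldown-cmark; one step any Markdown-to-Telegram-HTML pass needs is escaping text content, since Telegram's HTML parse mode requires `<`, `>`, and `&` outside tags to be entity-escaped. A std-only sketch of that escaping step (function name is illustrative):

```rust
/// Escape plain text for Telegram's HTML parse mode, which requires
/// `<`, `>`, and `&` to be replaced with entities. A sketch of one
/// step of the pulldown-cmark based conversion described above.
fn escape_telegram_html(text: &str) -> String {
    let mut out = String::with_capacity(text.len());
    for c in text.chars() {
        match c {
            '&' => out.push_str("&amp;"),
            '<' => out.push_str("&lt;"),
            '>' => out.push_str("&gt;"),
            _ => out.push(c),
        }
    }
    out
}

fn main() {
    assert_eq!(escape_telegram_html("a < b && c"), "a &lt; b &amp;&amp; c");
    println!("ok");
}
```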
- Fix sendMessageDraft: omit the ◎ cursor from draft text so it grows
  monotonically; skip empty content chunks from the SSE stream
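The chunk-handling part of that fix can be sketched with a small accumulator that drops empty SSE chunks and only reports text that has grown (type and method names here are illustrative, not the bot's actual code):

```rust
/// Accumulates streamed chunks and yields text only when it has grown.
/// A sketch of the sendMessageDraft fix: empty SSE chunks are dropped
/// so they never trigger a redundant (or shrinking) draft update.
struct DraftState {
    text: String,
    last_sent_len: usize,
}

impl DraftState {
    fn new() -> Self {
        DraftState { text: String::new(), last_sent_len: 0 }
    }

    /// Returns Some(full text) only when there is new, non-empty content.
    fn push_chunk(&mut self, chunk: &str) -> Option<&str> {
        if chunk.is_empty() {
            return None; // skip empty content chunks from the stream
        }
        self.text.push_str(chunk);
        if self.text.len() > self.last_sent_len {
            self.last_sent_len = self.text.len();
            Some(&self.text)
        } else {
            None
        }
    }
}

fn main() {
    let mut s = DraftState::new();
    assert_eq!(s.push_chunk(""), None);
    assert_eq!(s.push_chunk("Hel"), Some("Hel"));
    assert_eq!(s.push_chunk("lo"), Some("Hello"));
    println!("ok");
}
```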
- Simplify Makefile: single deploy target
Telegram Bot API 9.3+ sendMessageDraft provides smooth streaming text
rendering without the flickering of repeated edits. Falls back to
editMessageText automatically if the API is unavailable (e.g. older
clients or group chats). Also reduces edit interval from 5s to 3s and
uses 1s interval for draft mode.
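The cadence choice above reduces to a small decision: 1s drafts in private chats that support sendMessageDraft, 3s edits otherwise. A sketch (detection of draft support is simplified to a boolean here; the real fallback is automatic):

```rust
use std::time::Duration;

/// Pick the message-update cadence: 1s when sendMessageDraft is usable,
/// 3s for the editMessageText fallback (group chats, older clients).
fn update_interval(draft_supported: bool, is_group: bool) -> Duration {
    if draft_supported && !is_group {
        Duration::from_secs(1) // smooth draft streaming
    } else {
        Duration::from_secs(3) // periodic message edits
    }
}

fn main() {
    assert_eq!(update_interval(true, false), Duration::from_secs(1));
    assert_eq!(update_interval(true, true), Duration::from_secs(3));
    assert_eq!(update_interval(false, false), Duration::from_secs(3));
    println!("ok");
}
```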
- Streaming: use claude --output-format stream-json, edit TG message
every 5s with progress, show tool use status during execution,
◎ cursor indicator while processing
- File transfer: download user uploads to ~/incoming/, scan
~/outgoing/{sid}/ for new files after claude completes
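Detecting "new files after claude completes" amounts to diffing a directory listing against a snapshot taken before the turn. A dependency-free sketch (the function name and demo directory are illustrative):

```rust
use std::collections::HashSet;
use std::fs;
use std::path::{Path, PathBuf};

/// Files present in `dir` that were not in the `before` snapshot —
/// a sketch of scanning the per-session outgoing directory for files
/// claude created during a turn.
fn new_files(dir: &Path, before: &HashSet<PathBuf>) -> std::io::Result<Vec<PathBuf>> {
    let mut found = Vec::new();
    for entry in fs::read_dir(dir)? {
        let path = entry?.path();
        if path.is_file() && !before.contains(&path) {
            found.push(path);
        }
    }
    Ok(found)
}

fn main() -> std::io::Result<()> {
    let dir = std::env::temp_dir().join("noc_outgoing_demo");
    fs::create_dir_all(&dir)?;
    let before: HashSet<PathBuf> = HashSet::new(); // snapshot before the turn
    fs::write(dir.join("report.txt"), b"hi")?;
    let fresh = new_files(&dir, &before)?;
    assert!(fresh.iter().any(|p| p.ends_with("report.txt")));
    fs::remove_dir_all(&dir)?;
    println!("ok");
    Ok(())
}
```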
- Error handling: wrap post-auth logic in handle_inner, all errors
reply to user instead of silently failing
- Remote deploy: make deploy-hera via SSH, generate service from
template with dynamic PATH/REPO
- Service: binary installed to ~/bin/noc, WorkingDirectory=%h
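The generated unit might look roughly like the following; only the ExecStart path and WorkingDirectory=%h are documented above, everything else (description, placeholder syntax, install target) is illustrative of what the deploy template could emit:

```ini
# Sketch of the generated systemd --user unit.
[Unit]
Description=noc telegram bot

[Service]
# ~/bin/noc; %h expands to the user's home directory
ExecStart=%h/bin/noc
WorkingDirectory=%h
# Filled in by the deploy template ({{PATH}} is a placeholder here)
Environment=PATH={{PATH}}

[Install]
WantedBy=default.target
```

Using `%h` keeps the unit portable across deploy targets instead of hard-coding a home directory.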
- Invoke `claude` directly instead of the `ms` wrapper
- Session state persisted to disk across restarts
Async Rust bot (teloxide + tokio) that:
- Authenticates users per chat with a passphrase (resets daily at 5am)
- Generates deterministic UUID v5 session IDs from chat_id + date
- Pipes messages to `claude -p --session-id/--resume <uuid>`
- Persists auth and session state to disk across restarts
- Deploys as systemd --user service via `make deploy`
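The daily-reset and deterministic-session-ID pieces combine into one derivation: compute a "day number" whose boundary falls at 5am, then hash it together with the chat (and, per the session-isolation note, the config name). The real bot emits a UUID v5 for `claude --session-id`; the sketch below substitutes std's `DefaultHasher` so it stays dependency-free, and it ignores time zones for simplicity:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Day number since the Unix epoch, with the day boundary shifted to
/// 05:00 so sessions (and auth) roll over at 5am, not midnight.
/// Time zones are ignored in this sketch.
fn auth_day(epoch_secs: u64) -> u64 {
    epoch_secs.saturating_sub(5 * 3600) / 86_400
}

/// Deterministic session key from config name + chat_id + day.
/// Stand-in for the UUID v5 derivation described above.
fn session_key(config: &str, chat_id: i64, epoch_secs: u64) -> u64 {
    let mut h = DefaultHasher::new();
    (config, chat_id, auth_day(epoch_secs)).hash(&mut h);
    h.finish()
}

fn main() {
    // 104_399s = day 1, 04:59:59; 104_400s = day 1, 05:00:00
    assert_eq!(auth_day(104_399), 0);
    assert_eq!(auth_day(104_400), 1);
    // Same inputs -> same session; crossing 5am -> new session
    assert_eq!(session_key("hera", 42, 104_400), session_key("hera", 42, 104_400));
    assert_ne!(session_key("hera", 42, 104_400), session_key("hera", 42, 104_399));
    println!("ok");
}
```

Because the key is a pure function of (config, chat, day), a restarted bot re-derives the same ID and can `--resume` the existing claude session.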