6 Commits

Author SHA1 Message Date
Nico
b6ca02f864 v0.9.2: dedicated UI node, strict node roles, markdown rendering
6-node pipeline: Input -> Thinker -> Output (voice) + UI (screen) in parallel (plus Memorizer and Sensor)

- Output: text only (markdown, emoji). Never emits HTML or controls.
- UI: dedicated node for labels, buttons, tables. Tracks workspace state.
  Replaces entire workspace on each update. Runs parallel with Output.
- Input: strict one-sentence perception. No more hallucinating responses.
- Thinker: controls removed from prompt, focuses on reasoning + tools.
- Frontend: markdown rendered in chat (bold, italic, code blocks, lists).
  Label control type added. UI node meter in top bar.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-28 14:12:15 +01:00
Nico
f6939d47f5 v0.8.5: smart Output renderer + awareness panel
Output node upgraded from dumb echo to device-aware renderer:
- Knows it's rendering to HTML/browser, uses markdown formatting
- Receives full ThoughtResult (response + tool output + controls)
- Always in pipeline: Input perceives, Thinker reasons, Output renders
- Keeps user's language, weaves tool results into natural responses

Awareness panel (3-column layout):
- State: mood, topic, language, facts from Memorizer
- Sensors: clock, idle, memo deltas from Sensor ticks
- Processes: live cards with cancel during tool execution
- Workspace: docked controls (tables/buttons) persist across messages

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-28 02:02:41 +01:00
Nico
8b69e6dd0d v0.6.2: Thinker node with python tool execution (S3 Control)
- ThinkerNode: reasons about perception, decides tool use vs direct answer
- Python tool: subprocess execution with 10s timeout
- Auto-detects python code blocks in LLM output and executes them
- Tool call/result visible in trace + HUD
- Thinker meter in frontend (token budget: 4K)
- Flow: Input (perceive) -> Thinker (reason + tools) -> Output (speak)
- Tested: math (42*137=5754), SQLite (create+query), time, greetings
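The python tool above can be sketched as: detect fenced python blocks in the LLM output and run each in a subprocess with a hard timeout (10s in the actual node). Function and variable names are illustrative, not the ThinkerNode's real API.

```python
import re
import subprocess
import sys

FENCE = "`" * 3  # avoids literal triple backticks inside this example
CODE_BLOCK = re.compile(FENCE + r"python\n(.*?)" + FENCE, re.DOTALL)

def run_python_blocks(llm_output: str, timeout: float = 10.0) -> list[str]:
    # Auto-detect python code blocks and execute each in a subprocess;
    # a hung script is killed when the timeout expires.
    results = []
    for code in CODE_BLOCK.findall(llm_output):
        try:
            proc = subprocess.run([sys.executable, "-c", code],
                                  capture_output=True, text=True,
                                  timeout=timeout)
            results.append(proc.stdout or proc.stderr)
        except subprocess.TimeoutExpired:
            results.append(f"<tool timed out after {timeout}s>")
    return results
```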

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-28 01:04:22 +01:00
Nico
5c7aece397 v0.5.5: node token meters in frontend
- Per-node context fill bars (input/output/memorizer/sensor)
- Color-coded: green <50%, amber 50-80%, red >80%
- Sensor meter shows tick count + latest deltas
- Token info in trace context events
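The color thresholds above reduce to a small mapping; this sketch assumes the frontend receives fill as a 0.0-1.0 ratio (the function name is illustrative).

```python
def meter_color(fill: float) -> str:
    # Context-fill ratio -> bar color: green <50%, amber 50-80%, red >80%.
    if fill < 0.5:
        return "green"
    if fill <= 0.8:
        return "amber"
    return "red"
```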

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-28 00:51:43 +01:00
Nico
ab661775ef v0.5.4: sensor node, perceiver model, context budgets, API send
- SensorNode: 5s tick loop with delta-only emissions (clock, idle, memo changes)
- Input reframed as perceiver (describes what it heard, not commands)
- Output reframed as voice (acts on perception, never echoes it)
- Per-node token budgets: Input 2K, Output 4K, Memorizer 3K
- fit_context() trims oldest messages to stay within budget
- History sliding window: 40 messages max
- Facts capped at 20, trace file rotates at 500KB
- /api/send + /api/clear endpoints for programmatic testing
- test_cog.py test suite
- Listener context: physical/social/security awareness
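A sketch of the fit_context() trimming described above. The 4-characters-per-token estimate is an assumption for illustration, not the runtime's actual tokenizer; only the drop-oldest-until-under-budget behavior is taken from the commit.

```python
def estimate_tokens(msg: dict) -> int:
    # Crude heuristic: ~4 characters per token (assumption, see lead-in).
    return max(1, len(msg.get("content", "")) // 4)

def fit_context(messages: list[dict], budget: int) -> list[dict]:
    # Trim oldest messages until the estimated total fits the node's
    # token budget (e.g. Input 2K, Output 4K, Memorizer 3K).
    kept = list(messages)
    while kept and sum(estimate_tokens(m) for m in kept) > budget:
        kept.pop(0)
    return kept
```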

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-28 00:42:02 +01:00
Nico
569a6022fe cognitive agent runtime v0.4.6: 3-node graph + Zitadel auth + K3s deploy
- Input/Output/Memorizer nodes with OpenRouter (Gemini Flash)
- Zitadel OIDC auth with PKCE flow, service token for Titan
- SSE event stream + poll endpoint for external observers
- Identity from Zitadel userinfo, listener context in Input prompt
- Trace logging to file + SSE broadcast
- K3s deployment on IONOS with Let's Encrypt TLS
- Frontend: chat + trace view, OIDC login
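The PKCE flow mentioned above follows RFC 7636: the client keeps a random code_verifier and sends its S256 challenge to the authorization endpoint. This standard-library sketch is independent of the actual Zitadel client code.

```python
import base64
import hashlib
import secrets

def make_pkce_pair() -> tuple[str, str]:
    # code_verifier: 32 random bytes, base64url without padding (43 chars).
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    # code_challenge: base64url(SHA-256(verifier)), sent with method=S256.
    digest = hashlib.sha256(verifier.encode()).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge  # verifier stays secret until the token exchange
```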

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-27 23:21:51 +01:00