Nico a2bc6347fc v0.13.0: Graph engine, versioned nodes, S3* audit, DB tools, Cytoscape
Architecture:
- Graph engine (engine.py) loads graph definitions, instantiates nodes
- Versioned nodes: input_v1, thinker_v1, output_v1, memorizer_v1, director_v1
- NODE_REGISTRY for dynamic node lookup by name
- Graph API: /api/graph/active, /api/graph/list, /api/graph/switch
- Graph definition: graphs/v1_current.py (7 nodes, 13 edges, 3 edge types)
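Dynamic lookup of versioned node classes via NODE_REGISTRY can be sketched roughly like this (decorator and class names are illustrative, not the actual engine.py API):

```python
# Minimal sketch of a node registry with dynamic lookup by name.
# register/instantiate and the node class shown are hypothetical.

NODE_REGISTRY: dict[str, type] = {}

def register(name: str):
    """Class decorator that adds a node class to the registry."""
    def wrap(cls):
        NODE_REGISTRY[name] = cls
        return cls
    return wrap

@register("thinker_v1")
class ThinkerV1:
    def run(self, payload: dict) -> dict:
        return {"text": f"thought about {payload.get('input', '')}"}

def instantiate(name: str):
    """Look up a node class by its registered name and build an instance."""
    return NODE_REGISTRY[name]()
```

The graph engine can then build a pipeline purely from the node names listed in a graph definition.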

S3* Audit system:
- Workspace mismatch detection (server vs browser controls)
- Code-without-tools retry (Thinker wrote code but no tool calls)
- Intent-without-action retry (the request implied an action but the Thinker only produced text)
- Dashboard feedback: browser sends workspace state on every message
- Sensor compares server and browser state continuously on a 5 s tick
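The server-side half of the workspace mismatch check might look like this minimal sketch (control-state shape assumed):

```python
# Hypothetical sketch of workspace mismatch detection: the browser sends
# its view of workspace controls with each message; the server compares
# that against its own state and flags any drift for the audit trace.

def detect_mismatch(server_state: dict, browser_state: dict) -> list[str]:
    """Return ids of controls whose browser value differs from the server's."""
    mismatched = []
    for control_id, server_val in server_state.items():
        if browser_state.get(control_id) != server_val:
            mismatched.append(control_id)
    return mismatched
```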

State machines:
- create_machine / add_state / reset_machine / destroy_machine via function calling
- Local transitions (go:) resolve without LLM round-trip
- Button persistence across turns
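Resolving a local `go:` transition without an LLM round-trip could be sketched as (the machine dict shape is an assumption, not the actual runtime format):

```python
# Hypothetical local-transition handler: "go:<state>" button actions are
# applied directly to the state machine; anything else falls through to
# a normal LLM turn.

def handle_click(machine: dict, action: str) -> bool:
    """Apply a local go:<state> transition if possible.

    Returns True when handled locally, False when an LLM turn is needed."""
    if action.startswith("go:"):
        target = action[len("go:"):]
        if target in machine["states"]:
            machine["current"] = target
            return True
    return False
```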

Database tools:
- query_db tool via pymysql to MariaDB K3s pod (eras2_production)
- Table rendering in workspace (tab-separated parsing)
- Director pre-planning with Opus for complex data requests
- Error retry with corrected SQL
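The tab-separated result format mentioned above can be sketched as follows; this covers only the formatting and parsing halves (function names illustrative), with the actual pymysql call to the eras2_production pod omitted:

```python
# Hypothetical helpers around the query_db tool's tab-separated format:
# rows_to_tsv turns cursor output into TSV text, parse_tsv turns that
# text back into a table for workspace rendering.

def rows_to_tsv(columns: list[str], rows: list[tuple]) -> str:
    """Format query results as tab-separated text."""
    lines = ["\t".join(columns)]
    for row in rows:
        lines.append("\t".join(str(v) for v in row))
    return "\n".join(lines)

def parse_tsv(text: str) -> list[list[str]]:
    """Parse tab-separated text back into rows of cells."""
    return [line.split("\t") for line in text.splitlines()]
```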

Frontend:
- Cytoscape.js pipeline graph with real-time node animations
- Overlay scrollbars (CSS-only, no reflow)
- Tool call/result trace events
- S3* audit events in trace

Testing:
- 167 integration tests (11 test suites)
- 22 node-level unit tests (test_nodes/)
- Three test levels: node unit, graph integration, scenario
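A node-level unit test, the lowest of the three levels, might look like this (the node shown is a stand-in, not one of the real versioned nodes):

```python
# Hypothetical node unit test in the style implied by test_nodes/:
# exercise one node in isolation, with no graph and no scenario.

class EchoNode:
    """Stand-in node; real nodes live in the versioned node modules."""
    def run(self, payload: dict) -> dict:
        return {"out": payload["in"].upper()}

def test_echo_node_runs_in_isolation():
    node = EchoNode()
    assert node.run({"in": "hi"}) == {"out": "HI"}
```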

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-03-29 00:18:45 +01:00


"""LLM helper: OpenRouter calls, token estimation, context fitting."""
import json
import logging
import os
from typing import Any
import httpx
log = logging.getLogger("runtime")
API_KEY = os.environ.get("OPENROUTER_API_KEY", "")
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"
async def llm_call(model: str, messages: list[dict], stream: bool = False,
tools: list[dict] = None) -> Any:
"""Single LLM call via OpenRouter.
Returns full text, (client, response) for streaming, or (text, tool_calls) when tools are used."""
headers = {"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"}
body = {"model": model, "messages": messages, "stream": stream}
if tools:
body["tools"] = tools
client = httpx.AsyncClient(timeout=60)
if stream:
resp = await client.send(client.build_request("POST", OPENROUTER_URL, headers=headers, json=body), stream=True)
return client, resp
resp = await client.post(OPENROUTER_URL, headers=headers, json=body)
await client.aclose()
data = resp.json()
if "choices" not in data:
log.error(f"LLM error: {data}")
error_msg = f"[LLM error: {data.get('error', {}).get('message', 'unknown')}]"
return (error_msg, []) if tools else error_msg
msg = data["choices"][0]["message"]
content = msg.get("content", "") or ""
tool_calls = msg.get("tool_calls", [])
if tools:
return content, tool_calls
return content
def estimate_tokens(text: str) -> int:
"""Rough token estimate: 1 token ~ 4 chars."""
return len(text) // 4
def fit_context(messages: list[dict], max_tokens: int, protect_last: int = 4) -> list[dict]:
"""Trim oldest messages (after system prompt) to fit token budget.
Always keeps: system prompt(s) at start + last `protect_last` messages."""
if not messages:
return messages
system_msgs = []
rest = []
for m in messages:
if not rest and m["role"] == "system":
system_msgs.append(m)
else:
rest.append(m)
protected = rest[-protect_last:] if len(rest) > protect_last else rest
middle = rest[:-protect_last] if len(rest) > protect_last else []
fixed_tokens = sum(estimate_tokens(m["content"]) for m in system_msgs + protected)
if fixed_tokens >= max_tokens:
result = system_msgs + protected
total = sum(estimate_tokens(m["content"]) for m in result)
while total > max_tokens and len(result) > 2:
removed = result.pop(1)
total -= estimate_tokens(removed["content"])
return result
remaining = max_tokens - fixed_tokens
kept_middle = []
for m in reversed(middle):
t = estimate_tokens(m["content"])
if remaining - t < 0:
break
kept_middle.insert(0, m)
remaining -= t
return system_msgs + kept_middle + protected
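When llm_call is invoked with tools, the second element of its return value is the raw tool_calls list. A small helper for unpacking it could look like this (the OpenAI-style shape, with `function.name` and `function.arguments` as a JSON string, is what OpenRouter relays for most models):

```python
# Hypothetical helper: unpack tool_calls returned by llm_call(..., tools=[...])
# into (name, parsed_arguments) pairs, tolerating malformed argument JSON.
import json

def parse_tool_calls(tool_calls: list[dict]) -> list[tuple[str, dict]]:
    """Return (tool name, parsed arguments) pairs from raw tool_call entries."""
    parsed = []
    for tc in tool_calls:
        fn = tc.get("function", {})
        try:
            args = json.loads(fn.get("arguments") or "{}")
        except json.JSONDecodeError:
            args = {}
        parsed.append((fn.get("name", ""), args))
    return parsed
```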