Python client generation and AI agent execution mode for all services #18
Reference: lhumina_code/hero_router#18
Summary
Add two new capabilities to hero_router:
hero_router already discovers all services, caches their OpenRPC specs, and proxies RPC/MCP/admin traffic. These two features make it the central gateway for programmatic and AI-driven service interaction.
Feature 1: Python Client Generation
How It Works
When hero_router discovers a service and fetches its OpenRPC spec, it also generates a typed Python client. This is deterministic (template-based, no LLM needed).
Generation Pipeline
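In rough terms, the pipeline walks the cached OpenRPC spec and renders client source via string templates. A minimal sketch (function, class, and template names are illustrative, not the actual hero_router code):

```python
def generate_client(spec: dict) -> str:
    """Render a typed Python client from an OpenRPC spec (illustrative templates)."""
    type_map = {"string": "str", "integer": "int", "boolean": "bool",
                "array": "list", "object": "dict"}
    title = spec["info"]["title"]
    class_name = title.title().replace("_", "") + "Client"
    lines = [
        f'"""Generated client for {title} v{spec["info"]["version"]}."""',
        "from typing import Any",  # so Any annotations resolve at runtime
        f"class {class_name}:",
        "    def __init__(self, socket_path: str):",
        "        self.socket_path = socket_path",
        "    def _call(self, method: str, params: list) -> Any:",
        "        ...  # JSON-RPC over the service's Unix socket",
    ]
    for m in spec.get("methods", []):
        names = [p["name"] for p in m.get("params", [])]
        sig = ", ".join(
            f"{p['name']}: {type_map.get(p.get('schema', {}).get('type'), 'Any')}"
            for p in m.get("params", [])
        )
        lines.append(f"    def {m['name']}(self{', ' + sig if sig else ''}) -> Any:")
        lines.append(f"        return self._call({m['name']!r}, [{', '.join(names)}])")
    return "\n".join(lines)
```

Because the spec is the only input, the output is fully deterministic, which is what makes Option A viable without an LLM.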
What Gets Generated
For each service, two files:

1. `{service}_client.py` — Full typed Python client
2. `{service}_interface.py` — Lightweight stub for LLM consumption

Serving

- `GET /:service/python/client.py`
- `GET /:service/python/interface.py`
- `router.python_client {service}` → returns client source
- `router.python_interface {service}` → returns interface source
- Files cached at `~/.hero/var/router/python/{service}_client.py`

Auto-Update

Cached clients are regenerated when a service's spec hash changes, detected during scanner refresh.
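Hash-based invalidation is a natural fit for the auto-update step. A minimal sketch, assuming a sidecar hash file per service (the file layout and function names are illustrative):

```python
import hashlib
import json
from pathlib import Path

def spec_hash(spec: dict) -> str:
    """Stable digest of an OpenRPC spec, independent of key order."""
    return hashlib.sha256(json.dumps(spec, sort_keys=True).encode()).hexdigest()

def needs_regen(spec: dict, hash_file: Path) -> bool:
    """True when the cached client is stale; records the new hash as a side effect."""
    digest = spec_hash(spec)
    if hash_file.exists() and hash_file.read_text() == digest:
        return False  # spec unchanged since last generation
    hash_file.write_text(digest)
    return True
```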
Implementation Options
Option A: Pure template-based (recommended for client)
Option B: LLM-assisted generation
Option C: Runtime translation (no pre-generation)
`client = HeroClient("hero_books")` → fetches spec → creates methods dynamically

Recommendation: Option A for the static client + Option C as a convenience wrapper. The static client is reliable and works offline; the dynamic wrapper is nice for interactive/notebook use.
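A minimal sketch of the Option C wrapper. The spec-fetch URL, the router port, and the `spec` injection parameter are assumptions for illustration, and the transport is stubbed out:

```python
import json
import urllib.request

class HeroClient:
    """Runtime-translation client: turns OpenRPC methods into callables on the fly."""
    def __init__(self, service, router_url="http://localhost:8080", spec=None):
        if spec is None:
            # Hypothetical spec endpoint on the router; adjust to the real route.
            with urllib.request.urlopen(f"{router_url}/{service}/openrpc.json") as resp:
                spec = json.load(resp)
        self._service = service
        self._methods = {m["name"] for m in spec.get("methods", [])}

    def _call(self, method, params):
        # A real wrapper would POST a JSON-RPC request to the router here.
        raise NotImplementedError

    def __getattr__(self, name):
        # Guard private names to avoid recursion during __init__.
        if name.startswith("_") or name not in self._methods:
            raise AttributeError(name)
        return lambda *params: self._call(name, list(params))
```

The cost of this convenience is a network round-trip at construction time and no offline use, which is why the static client remains the primary artifact.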
Feature 2: AI Agent Execution Mode
How It Works
User sends a prompt describing what they want to do with a service. hero_router then:

1. Builds context from the service's Python interface stub
2. Asks the LLM to generate a Python script for the task
3. Executes the script with `uv` (Python runner)
4. On failure, retries with the error fed back to the LLM

Endpoint

`POST /:service/agent`

Or via JSON-RPC:

`router.agent.run {service, prompt, max_retries}`
Response

`{success, result, script, attempts, duration_ms, error}`
Execution Pipeline
LLM Integration
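One plausible shape for the LLM request: the interface stub goes into the system context, and the retry path appends the previous failure. The message layout follows common chat-completion conventions; the exact `ai.chat` payload used by hero_aibroker is an assumption here:

```python
def build_agent_messages(interface_stub: str, user_prompt: str, error=None) -> list:
    """Assemble chat messages for script generation; retries include the
    previous traceback so the model can correct itself."""
    system = (
        "You write standalone Python scripts that call this service client.\n"
        "Available methods:\n" + interface_stub + "\n"
        "Return only runnable Python code."
    )
    messages = [
        {"role": "system", "content": system},
        {"role": "user", "content": user_prompt},
    ]
    if error:
        messages.append({
            "role": "user",
            "content": f"The previous script failed with:\n{error}\nFix it and return the full script.",
        })
    return messages
```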
Prior Art
This pattern is already proven in `hero_aibroker/crates/mcp/mcp_hero/`:

- `codegen.rs: generate_client_from_spec()` — Python client from OpenRPC
- `codegen.rs: generate_code()` — LLM script generation with error context for retries
- `executor.rs: execute_with_retry()` — Python execution with LLM-driven error recovery
- `executor.rs` — manages Python venv at `~/.hero/var/aibroker/python`

The implementation should port and adapt this logic into hero_router.
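The execute-with-retry loop to port can be sketched as follows. This is a simplification of the `executor.rs` pattern: the `generate_script` callback stands in for the LLM call, and `sys.executable` stands in for the `uv`-managed venv interpreter:

```python
import subprocess
import sys
import tempfile

def execute_with_retry(generate_script, prompt: str, max_retries: int = 3) -> dict:
    """Generate a script, run it, and on failure feed stderr back to the generator."""
    error_context = None
    for attempt in range(1, max_retries + 1):
        # LLM call in the real system; gets the previous error for self-correction.
        script = generate_script(prompt, error_context)
        with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
            f.write(script)
            path = f.name
        proc = subprocess.run([sys.executable, path],
                              capture_output=True, text=True, timeout=60)
        if proc.returncode == 0:
            return {"success": True, "result": proc.stdout,
                    "script": script, "attempts": attempt}
        error_context = proc.stderr  # retry context for the next attempt
    return {"success": False, "error": error_context, "attempts": max_retries}
```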
Implementation Plan
Phase 1: Python Client Generator
- Add `python_codegen` module to herolib_router
- Cache generated files at `~/.hero/var/router/python/`

Phase 2: Agent Executor

- Add `agent` module to herolib_router
- `POST /:service/agent` endpoint
- `router.agent.run` RPC method

Phase 3: Admin UI Integration
Related
- `mcp_hero` crate — prior art for Python codegen + execution

Implementation Strategy

Starting work on branch `development_18`.

Phase 1: Python Client Generator (template-based, no LLM)
Key design decision: Pure template-based generation in Rust (Option A from the issue). This is deterministic, fast, zero-dependency, and doesn't require hero_aibroker to be running. The mcp_hero prior art uses LLM for client generation, but since OpenRPC specs are highly structured, template-based is the right approach here.
Module: `crates/hero_router/src/python_codegen.rs`

Client generation pipeline:

- Parse spec `info` (title, version, description) and the `methods` array
- Emit `__init__(socket_path)` with a default path to the service socket
- Emit `_call(method, params)` using HTTP-over-Unix-socket (stdlib only: `http.client` + `socket`)
- Map OpenRPC types to Python (`string`→`str`, `integer`→`int`, `boolean`→`bool`, `array`→`list`, `object`→`dict`)

Interface generation: a lightweight comment-based stub file, one line per method — designed for LLM context windows.
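A sketch of what the generated `_call` transport might look like — stdlib-only HTTP over a Unix domain socket. Class and method names are illustrative; the real client is templated in Rust:

```python
import http.client
import json
import socket
import uuid

class UnixHTTPConnection(http.client.HTTPConnection):
    """http.client connection that dials a Unix domain socket instead of TCP."""
    def __init__(self, socket_path: str):
        super().__init__("localhost")
        self.socket_path = socket_path

    def connect(self):
        sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        sock.connect(self.socket_path)
        self.sock = sock

class GeneratedClient:
    """Shape of the generated client's transport layer."""
    def __init__(self, socket_path: str):
        self.socket_path = socket_path

    def _call(self, method: str, params: list):
        payload = json.dumps({"jsonrpc": "2.0", "id": str(uuid.uuid4()),
                              "method": method, "params": params})
        conn = UnixHTTPConnection(self.socket_path)
        try:
            conn.request("POST", "/", payload,
                         {"Content-Type": "application/json"})
            reply = json.loads(conn.getresponse().read())
        finally:
            conn.close()
        if "error" in reply:
            raise RuntimeError(reply["error"])
        return reply["result"]
```

Keeping the transport on `http.client` + `socket` means the generated file runs on any stock Python install, with no pip dependencies.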
Caching: files are cached at `~/.hero/var/router/python/{service}_client.py` and `{service}_interface.py`, and regenerated when the spec hash changes (detected during scanner refresh).

Endpoints:

- `GET /:service/python/client.py` → serve cached client
- `GET /:service/python/interface.py` → serve cached interface
- `router.python_client {service}` and `router.python_interface {service}`

Phase 2: AI Agent Executor
Module: `crates/hero_router/src/server/agent.rs`

Ported from mcp_hero patterns:

- Python venv at `~/.hero/var/router/python/venv/`, created using `uv`
- LLM calls over the hero_aibroker socket (`$HERO_SOCKET_DIR/hero_aibroker/rpc.sock`) using the `ai.chat` method

Endpoints:

- `POST /:service/agent` with `{prompt, max_retries}`
- `router.agent.run {service, prompt, max_retries}`

Response:
`{success, result, script, attempts, duration_ms, error}`

Implementation Order

1. `python_codegen.rs` — core generation logic
2. `agent.rs` — executor + LLM integration
3. Update `openrpc.json` with the new methods

Implementation Complete — Phase 1 & Phase 2
Branch: `development_18` — pushed to remote.

Phase 1: Python Client Generator
New file: `crates/hero_router/src/python_codegen.rs` (540 lines)

Template-based generation (no LLM required):

- `generate_python_client()` — full typed Python client class
  - HTTP-over-Unix-socket transport (stdlib only: `http.client` + `socket`)
  - OpenRPC→Python type mapping (`string`→`str`, `integer`→`int`, etc.)
- `generate_python_interface()` — lightweight stub for LLM context
- Cached at `~/.hero/var/router/python/` with hash-based invalidation

Endpoints:
- `GET /:service/python/client.py` → serve typed client
- `GET /:service/python/interface.py` → serve interface stub
- `router.python_client {service}` → return client source
- `router.python_interface {service}` → return interface source

Phase 2: AI Agent Executor
New file: `crates/hero_router/src/server/agent.rs` (370 lines)

Ported from mcp_hero patterns:

- Python venv management (`uv` preferred, `python3 -m venv` fallback)
- LLM integration via hero_aibroker (`ai.chat` method)

Endpoints:
- `POST /:service/agent` with body `{"prompt": "...", "max_retries": 3}`
- `router.agent.run {service, prompt, max_retries}`
- Response: `{success, result, script, attempts, duration_ms, error}`

Modified files

- `lib.rs` — registered `python_codegen` module
- `server/mod.rs` — registered `agent` module
- `scanner.rs` — added `refresh_python_cache()` after each scan
- `server/routes.rs` — added `python` and `agent` webname handlers
- `server/rpc.rs` — added 3 new RPC methods
- `static/openrpc.json` — added method specs for all 3 new methods

Build status
Status Update: End-to-End Working
All three features are now fully functional:
1. Python Client Generation
- `from typing import Any` import fix applied — was causing a `NameError` at runtime
- Served at `/:service/python/client.py` and `/:service/python/interface.py`

2. AI Agent Execution
3. UI Tabs
Configuration Required
hero_aibroker needs an OpenRouter API key:
Commit: `e651ddb` on `development_18`