# Hero Researcher
AI-powered person research tool that orchestrates multiple LLM models and web search providers to generate comprehensive background reports. Built as a Bun workspace with a Unix-socket-based JSON-RPC server, a web UI, a CLI, and a TypeScript SDK.
## Architecture

Hero Researcher is a five-package Bun workspace:

| Package | Purpose |
|---|---|
| `hero_researcher` | Core research engine (LLM clients, search providers, scrapers, report formatting) |
| `hero_researcher_server` | JSON-RPC 2.0 server over a Unix domain socket; manages research jobs |
| `hero_researcher_sdk` | Typed client library for communicating with the server |
| `hero_researcher_ui` | Hono-based web UI that proxies requests to the server via the SDK |
| `hero_researcher_cli` | Terminal interface built with Commander.js and the SDK |

See `docs/ARCHITECTURE.md` for a detailed architecture diagram and data-flow description.
## Quick Start

### Prerequisites

- Bun v1.1+
- An OpenRouter API key

### Install

```bash
git clone <repo-url> && cd hero_researcher
bun install
```
### Set Environment Variables

```bash
export OPENROUTER_API_KEY=sk-or-...
```
### Start the Server

```bash
# Terminal 1 - start the JSON-RPC server
make server
# or: cd packages/hero_researcher_server && bun run src/main.ts serve
```
### Start the Web UI (optional)

```bash
# Terminal 2
make ui
```
### Use the CLI

```bash
# Start a research job and wait for the result
make cli ARGS="research 'Jane Doe' --location 'San Francisco' --wait"

# List jobs
make cli ARGS="list"

# Get status
make cli ARGS="status <jobId>"

# Get result
make cli ARGS="result <jobId>"

# Cancel a job
make cli ARGS="cancel <jobId>"
```
### Use the SDK Programmatically

```ts
import { HeroResearcherClient } from "hero_researcher_sdk";

const client = new HeroResearcherClient();
const job = await client.researchStart(
  { name: "Jane Doe", location: "San Francisco" },
  { tier: "standard", format: "markdown" },
);
```
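`researchStart` returns as soon as the job is accepted; to block until the report is ready, a small polling loop around the SDK's status method works. The following is a minimal sketch, assuming `researchStatus` resolves to an object with a `status` field matching the server's documented lifecycle states (pending, running, completed, failed, cancelled) -- the exact field name may differ, so check `docs/RPC_API.md`:

```ts
// Sketch: poll a job until it reaches a terminal state.
// The JobStatus values mirror the server's job lifecycle; the shape
// returned by the status callback is an assumption, not the SDK's
// guaranteed type.

type JobStatus = "pending" | "running" | "completed" | "failed" | "cancelled";

const TERMINAL: ReadonlySet<JobStatus> = new Set(["completed", "failed", "cancelled"]);

async function pollUntilDone(
  getStatus: () => Promise<{ status: JobStatus }>,
  intervalMs = 2000,
  maxAttempts = 150,
): Promise<JobStatus> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const { status } = await getStatus();
    if (TERMINAL.has(status)) return status;
    // Wait before asking again so we don't hammer the socket.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("research job did not finish in time");
}

// Against the SDK this might be used as (hypothetical field names):
//   const final = await pollUntilDone(() => client.researchStatus(job.jobId));
//   if (final === "completed") {
//     const result = await client.researchGet(job.jobId);
//   }
```

Taking the status fetcher as a function keeps the helper decoupled from the SDK's concrete client type, so it works unchanged if the SDK's method names differ.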
## Package Descriptions

`hero_researcher` -- The core research engine. Contains LLM clients (OpenRouter, AiBroker), search providers (Brave, SearXNG, DuckDuckGo, Exa, Serper, SerpAPI), platform-specific scrapers (LinkedIn, GitHub, Twitter, Reddit, etc.), report formatting (Markdown, JSON, HTML), and the `run-job.ts` entry point that the server spawns as a child process for each research job.

`hero_researcher_server` -- A JSON-RPC 2.0 server that listens on a Unix domain socket (`~/hero/var/sockets/hero_researcher_server.sock`). It accepts newline-delimited JSON-RPC requests, manages the job lifecycle (pending, running, completed, failed, cancelled), spawns researcher child processes, and persists job state to disk as JSON files.
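Because the wire protocol is plain newline-delimited JSON-RPC 2.0, a client in any language only needs to write one JSON object per line and read one line back. A minimal framing sketch follows; the envelope fields are standard JSON-RPC 2.0, while the Bun socket call in the trailing comment is illustrative only:

```ts
// Sketch: frame and parse newline-delimited JSON-RPC 2.0 messages.

interface RpcRequest {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params?: unknown;
}

// Serialize one request per line, as the server expects.
function frameRequest(id: number, method: string, params?: unknown): string {
  const req: RpcRequest = { jsonrpc: "2.0", id, method, params };
  return JSON.stringify(req) + "\n";
}

// Each response arrives as a single JSON line.
function parseResponseLine(line: string): { id: number; result?: unknown; error?: unknown } {
  return JSON.parse(line.trim());
}

// With Bun's Unix-socket client this could be wired up roughly as:
//   const sock = await Bun.connect({
//     unix: socketPath,
//     socket: { data(_s, buf) { /* buffer and split on "\n" */ } },
//   });
//   sock.write(frameRequest(1, "rpc.health"));
```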
`hero_researcher_sdk` -- A typed TypeScript client that connects to the server over the Unix socket. Exports `HeroResearcherClient` with methods like `researchStart`, `researchStatus`, `researchGet`, `researchList`, and `researchCancel`. Also exports all shared domain types (`PersonInfo`, `ResearchJob`, `ResearchResult`, etc.).

`hero_researcher_ui` -- A Hono web application that serves a browser-based UI and exposes REST endpoints (`/api/research`, `/api/jobs`, etc.) plus a raw JSON-RPC proxy at `/rpc`. Listens on its own Unix socket and communicates with the server through the SDK.

`hero_researcher_cli` -- A Commander.js-based terminal interface that uses the SDK to interact with the server. Supports the commands `research`, `status`, `result`, `list`, and `cancel`, and includes a polling spinner for `--wait` mode.
## Configuration

### Environment Variables

| Variable | Description | Required |
|---|---|---|
| `OPENROUTER_API_KEY` | OpenRouter API key for LLM queries | Yes (unless using AiBroker) |
| `OPENROUTER_BASE_URL` | Override the OpenRouter base URL | No |
| `OPENROUTER_TIMEOUT` | Request timeout in milliseconds | No |
| `USE_AIBROKER` | Use AiBroker instead of OpenRouter (`true`/`1`) | No |
| `AIBROKER_URL` | AiBroker service URL | No |
| `BRAVE_API_KEYS` | Comma-separated Brave Search API keys | No |
| `SEARXNG_URL` | SearXNG instance URL | No |
| `EXA_API_KEYS` | Comma-separated Exa API keys | No |
| `SERPER_API_KEYS` | Comma-separated Serper API keys | No |
| `SERPAPI_API_KEYS` | Comma-separated SerpAPI keys | No |
| `SCRAPERAPI_API_KEYS` | Comma-separated ScraperAPI keys | No |
| `SCRAPFLY_API_KEYS` | Comma-separated Scrapfly API keys | No |
| `MODELS` | Comma-separated list of LLM model identifiers | No |
| `LOG_LEVEL` | Logging level (`debug`, `info`, `warn`, `error`) | No |
### Socket Paths

| Service | Default Path |
|---|---|
| Server | `~/hero/var/sockets/hero_researcher_server.sock` |
| UI | `~/hero/var/sockets/hero_researcher_ui.sock` |
### Job Storage

Job files are persisted as JSON at `~/hero/var/hero_researcher/jobs/<jobId>.json`.
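Because each job is a plain JSON file, completed jobs can be inspected without going through the server. A hedged sketch of resolving and loading a job file (the parsed shape is whatever the server wrote, so it is left untyped here):

```ts
import { homedir } from "node:os";
import { join } from "node:path";
import { readFile } from "node:fs/promises";

// Default jobs directory, matching the path documented above.
const DEFAULT_JOBS_DIR = join(homedir(), "hero/var/hero_researcher/jobs");

// Resolve the on-disk path for a job ID.
function jobPath(jobId: string, baseDir: string = DEFAULT_JOBS_DIR): string {
  return join(baseDir, `${jobId}.json`);
}

// Read and parse a persisted job file; validate the fields yourself
// before trusting them, since the schema belongs to the server.
async function loadJob(
  jobId: string,
  baseDir: string = DEFAULT_JOBS_DIR,
): Promise<Record<string, unknown>> {
  return JSON.parse(await readFile(jobPath(jobId, baseDir), "utf8"));
}
```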
## API Reference

The server exposes a JSON-RPC 2.0 API over a Unix domain socket. See `docs/RPC_API.md` for the full reference.
| Method | Description |
|---|---|
| `rpc.discover` | OpenRPC service discovery |
| `rpc.health` | Health check (status, uptime) |
| `research.start` | Start a new research job |
| `research.status` | Get job status by ID |
| `research.get` | Get completed research result |
| `research.list` | List jobs (with optional filters) |
| `research.cancel` | Cancel a pending/running job |
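As an illustration, a `research.start` exchange might look like the following. The envelope is standard JSON-RPC 2.0 and the person/options values mirror the SDK example; the exact `params` key names and the response shape are assumptions here -- consult `docs/RPC_API.md` for the authoritative schema.

Request (one line on the wire):

```json
{"jsonrpc": "2.0", "id": 1, "method": "research.start", "params": {"person": {"name": "Jane Doe", "location": "San Francisco"}, "options": {"tier": "standard", "format": "markdown"}}}
```

Response (hypothetical job ID):

```json
{"jsonrpc": "2.0", "id": 1, "result": {"jobId": "abc123"}}
```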
## Development

```bash
# Install dependencies
make install

# Run all tests
make test

# Lint
make lint

# Format code
make format

# Build all packages
make build

# Start server + UI together
make dev

# Clean build artifacts
make clean
```
## Disclaimer
This tool uses AI models to search for publicly available information. Results may be inaccurate, outdated, or hallucinated. This is not an official background check. Always verify information from authoritative sources. Respect privacy and applicable laws.
## License
MIT