# Hero Researcher

An AI-powered research assistant that uses OpenRouter to query multiple LLMs for publicly available information about people.

## Features
- Multi-model queries - Query multiple AI models simultaneously for comprehensive results
- Beast Mode - 64 queries across 16 models and 4 investigation strategies
- Online Mode - Real-time web search for up-to-date information
- Verbose Mode - Real-time progress visibility with response previews
- Markdown Reports - Clean, structured output
- Cost-optimized - Models ordered by price (cheapest first)
## Installation

```bash
# Clone and enter directory
cd hero-researcher

# Install dependencies
bun install
```
## Quick Start

```bash
# Set your OpenRouter API key
export OPENROUTER_API_KEY=your_key_here

# Basic search (4 models, ~$0.10/M tokens)
bun run src/cli.ts "John Doe" --location "San Francisco"

# Save to file
bun run src/cli.ts "John Doe" -o report.md

# With aliases
bun run src/cli.ts "John Doe" --aliases "J. Doe, Johnny Doe"
```
## Usage

### Basic Search

```bash
bun run src/cli.ts "John Doe" --location "New York"
```

Uses 4 cost-effective models:

- qwen/qwen3-32b - $0.08/M tokens
- meta-llama/llama-4-scout - $0.08/M tokens
- meta-llama/llama-3.3-70b-instruct - $0.10/M tokens
- deepseek/deepseek-v3.2 - $0.25/M tokens
### Beast Mode

```bash
bun run src/cli.ts "John Doe" --beast
```

16 models x 4 strategies = 64 queries

Strategies:

- basic - General background research
- deep - OSINT investigation (social media, court records, patents)
- social - Social media focused search
- professional - Employment, education, business connections
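The 64 Beast Mode queries are simply the cross product of the model and strategy lists. A minimal sketch of how such a query matrix could be built (the `QuerySpec` shape and `buildQueryMatrix` name are illustrative, not the actual source):

```typescript
// Illustrative sketch: build the model x strategy query matrix.
// QuerySpec and buildQueryMatrix are hypothetical names, not the real API.
type QuerySpec = { model: string; strategy: string };

function buildQueryMatrix(models: string[], strategies: string[]): QuerySpec[] {
  // One query per (model, strategy) pair: 16 models x 4 strategies = 64.
  return models.flatMap((model) =>
    strategies.map((strategy) => ({ model, strategy })),
  );
}

const strategies = ["basic", "deep", "social", "professional"];
const models = ["qwen/qwen3-32b", "meta-llama/llama-4-scout"]; // ...plus 14 more in Beast Mode
const matrix = buildQueryMatrix(models, strategies);
// With the full 16-model list this yields the 64 query specs.
```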
Models (ordered by cost):
| Model | Cost/M tokens |
|---|---|
| qwen/qwen3-32b | $0.08 |
| meta-llama/llama-4-scout | $0.08 |
| meta-llama/llama-3.3-70b-instruct | $0.10 |
| deepseek/deepseek-v3.2 | $0.25 |
| google/gemini-2.5-flash | $0.30 |
| qwen/qwen3-235b-a22b | $0.30 |
| z-ai/glm-4.7 | $0.40 |
| mistralai/mistral-medium-3.1 | $0.40 |
| deepseek/deepseek-r1-0528 | $0.40 |
| google/gemini-3-flash-preview | $0.50 |
| mistralai/mistral-large-2512 | $0.50 |
| z-ai/glm-5 | $0.75 |
| qwen/qwen3-max | $1.20 |
| openai/gpt-5.2 | $1.75 |
| google/gemini-3-pro-preview | $2.00 |
| anthropic/claude-sonnet-4.5 | $3.00 |
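Since the table prices are USD per million tokens, per-query cost is easy to estimate. A quick sketch (the ~2,000-token figure per query is an assumption for illustration, not a measured value):

```typescript
// Illustrative cost arithmetic; prices are USD per million tokens (table above).
function estimateCostUSD(pricePerMTokens: number, tokens: number): number {
  return (pricePerMTokens * tokens) / 1_000_000;
}

// Example: a single ~2,000-token query against anthropic/claude-sonnet-4.5 ($3.00/M)
const perQuery = estimateCostUSD(3.0, 2000); // roughly $0.006
```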
### Online Mode

```bash
bun run src/cli.ts "John Doe" --online
```

Appends the :online suffix to each model name, enabling OpenRouter's real-time web search capability.
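The suffix handling could be sketched as follows (the helper name is hypothetical; the `:online` model-name suffix itself is OpenRouter's convention for web-search-enabled variants):

```typescript
// Illustrative helper: append OpenRouter's ":online" suffix so the model
// gains web-search capability, without doubling an existing suffix.
function withOnline(model: string): string {
  return model.endsWith(":online") ? model : `${model}:online`;
}
```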
### Verbose Mode

```bash
bun run src/cli.ts "John Doe" -v
```

Shows detailed progress:
- Status for each query
- Progress counter (1/4, 2/4, etc.)
- Model name and strategy
- First 200 characters of each response
- Error messages for failed queries
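The 200-character response preview could be produced with a small helper like this (hypothetical name, not the actual source):

```typescript
// Illustrative helper: collapse whitespace and trim a model response
// down to a 200-character preview line for verbose output.
function preview(text: string, limit = 200): string {
  const oneLine = text.replace(/\s+/g, " ").trim();
  return oneLine.length <= limit ? oneLine : `${oneLine.slice(0, limit)}...`;
}
```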
### Custom Models

```bash
bun run src/cli.ts "John Doe" --models "anthropic/claude-sonnet-4.5,google/gemini-3-pro-preview"
```
### Full Beast Mode

```bash
bun run src/cli.ts "John Doe" --beast --online -v -o report.md
```

Maximum intelligence gathering with real-time search and detailed progress.
## CLI Options

| Option | Description |
|---|---|
| NAME | Name of the person to research (required) |
| --location | Known location to narrow the search |
| --aliases | Comma-separated list of known aliases |
| --models | Comma-separated list of custom models |
| --beast | Beast mode: 16 models x 4 strategies = 64 queries |
| --online | Enable real-time web search |
| -v, --verbose | Show detailed progress with response previews |
| -o, --output | Save report to file |
## Project Structure

```
hero-researcher/
├── src/
│   ├── cli.ts          # CLI interface (Commander)
│   ├── client.ts       # OpenRouter API client
│   ├── researcher.ts   # Research logic and prompts
│   └── formatter.ts    # Markdown report formatter
├── tests/
│   └── backgroundchecker.test.ts
├── package.json
├── tsconfig.json
├── Makefile
├── README.md
└── .env.example
```
## API Reference

### OpenRouterClient

```typescript
import { OpenRouterClient } from "./src/client";

const client = new OpenRouterClient("your-api-key");

const response = await client.chat(
  "qwen/qwen3-32b",
  [
    { role: "system", content: "You are a researcher." },
    { role: "user", content: "Search for..." },
  ],
);
```
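Under the hood, a chat call is a POST to OpenRouter's chat-completions endpoint (`https://openrouter.ai/api/v1/chat/completions`). A sketch of the request body such a client might send (`buildChatRequest` is a hypothetical helper, not part of the actual client):

```typescript
// Hypothetical sketch of the JSON body sent to OpenRouter's
// POST https://openrouter.ai/api/v1/chat/completions endpoint.
// buildChatRequest is an illustrative helper, not the real client API.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

function buildChatRequest(model: string, messages: ChatMessage[]) {
  return { model, messages };
}

const body = buildChatRequest("qwen/qwen3-32b", [
  { role: "system", content: "You are a researcher." },
  { role: "user", content: "Search for..." },
]);
// The client would POST JSON.stringify(body) with an
// "Authorization: Bearer <key>" header.
```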
### Researcher

```typescript
import { Researcher } from "./src/researcher";
import type { Person } from "./src/researcher";

const person: Person = {
  name: "John Doe",
  location: "New York",
  aliases: ["J. Doe"],
};

const researcher = new Researcher(client, { beastMode: true, online: true });

// Stream results
for await (const result of researcher.researchStream(person)) {
  console.log(`${result.model}: ${result.success}`);
}

// Or get all at once
const results = await researcher.research(person);
```
### formatMarkdownReport

```typescript
import { formatMarkdownReport } from "./src/formatter";

const report = formatMarkdownReport("John Doe", results);
console.log(report);
```
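A simplified sketch of what such a formatter might do. The `ResearchResult` shape here is an assumption based on the streaming example above; the real formatter may differ:

```typescript
// Simplified sketch of a report formatter: one markdown section per
// successful model result. ResearchResult fields are assumed, not confirmed.
type ResearchResult = { model: string; success: boolean; content: string };

function sketchMarkdownReport(name: string, results: ResearchResult[]): string {
  const sections = results
    .filter((r) => r.success)
    .map((r) => `## ${r.model}\n\n${r.content}`);
  return [`# Research Report: ${name}`, ...sections].join("\n\n");
}

const sample = sketchMarkdownReport("John Doe", [
  { model: "qwen/qwen3-32b", success: true, content: "Findings..." },
  { model: "z-ai/glm-5", success: false, content: "" }, // failed queries are dropped
]);
```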
## Environment Variables

| Variable | Description | Required |
|---|---|---|
| OPENROUTER_API_KEY | Your OpenRouter API key | Yes |

Copy .env.example to .env and add your key:

```bash
cp .env.example .env
# Edit .env with your API key
```
### Getting an OpenRouter API Key

- Go to https://openrouter.ai
- Create an account
- Navigate to API Keys
- Generate a new key
- Set it as the OPENROUTER_API_KEY environment variable
## Development

### Running Tests

```bash
bun test
```

### Linting

```bash
make lint
```

### Formatting

```bash
make format
```
## Disclaimer

This tool uses AI models to search for publicly available information.
- Results may be inaccurate, outdated, or hallucinated
- This is NOT an official background check
- Always verify information from official sources
- Respect privacy and applicable laws
- Do not use for illegal purposes
## License

MIT