# Lance Vector Backend (RESP + JSON-RPC)
This document explains how to use HeroDB's Lance-backed vector store. It is text-first: users provide text, and HeroDB computes embeddings server-side (no manual vectors). It includes copy-pasteable RESP (redis-cli) and JSON-RPC examples for:
- Creating a Lance database
- Embedding provider configuration (OpenAI, Azure OpenAI, or deterministic test provider)
- Dataset lifecycle: CREATE, LIST, INFO, DROP
- Ingestion: STORE text (+ optional metadata)
- Search: QUERY with K, optional FILTER and RETURN
- Delete by id
- Index creation (currently a placeholder/no-op)
References:
- Implementation: [src/lance_store.rs](src/lance_store.rs), [src/cmd.rs](src/cmd.rs), [src/rpc.rs](src/rpc.rs), [src/server.rs](src/server.rs), [src/embedding.rs](src/embedding.rs)
Notes:
- Admin DB 0 cannot be Lance (or Tantivy). Only databases with id >= 1 can use Lance.
- Permissions:
  - Read operations (SEARCH, LIST, INFO) require read permission.
  - Mutating operations (CREATE, STORE, CREATEINDEX, DEL, DROP, EMBEDDING CONFIG SET) require readwrite permission.
- Backend gating:
  - If a DB is Lance, only LANCE.* and basic control commands (PING, ECHO, SELECT, INFO, CLIENT, etc.) are permitted.
  - If a DB is not Lance, LANCE.* commands return an error.
Storage layout and schema:
- Files live at: `<base_dir>/lance/<db_id>/<dataset>.lance` (see the example after this list)
- Records schema:
  - id: Utf8 (non-null)
  - vector: `FixedSizeList<Float32, dim>` (non-null)
  - text: Utf8 (nullable)
  - meta: Utf8 JSON (nullable)
- Search is an L2 KNN brute-force scan for now (lower score = better). Index creation is a no-op placeholder to be implemented later.
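For example, with the flags and names used later in this document (a sketch; adjust the base directory, db id, and dataset name to your deployment), the dataset would land at:
```bash
# Assumes --dir /tmp/herodb, db_id 1, and a dataset named "myset" as in the examples below
ls /tmp/herodb/lance/1/myset.lance
```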
Prerequisites:
- Start HeroDB with RPC enabled (for management calls):
  - See [docs/basics.md](./basics.md) for flags. Example:
```bash
./target/release/herodb --dir /tmp/herodb --admin-secret mysecret --port 6379 --enable-rpc
```
## 0) Create a Lance-backed database (JSON-RPC)
Use the management API to create a database with backend "Lance". DB 0 is reserved for admin and cannot be Lance.
Request:
```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "herodb_createDatabase",
  "params": [
    "Lance",
    { "name": "vectors-db", "storage_path": null, "max_size": null, "redis_version": null },
    null
  ]
}
```
- Response contains the allocated db_id (>= 1). Use that id below (replace 1 with your actual id).
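The exact result encoding is defined in [src/rpc.rs](src/rpc.rs); purely as a hypothetical illustration, a successful response carrying the allocated id could look roughly like the commented line below.
```bash
# Hypothetical response shape (the real encoding is defined in src/rpc.rs):
# {"jsonrpc":"2.0","id":1,"result":1}
```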
Select the database over RESP:
```bash
redis-cli -p 6379 SELECT 1
# → OK
```
## 1) Configure embedding provider (server-side embeddings)
HeroDB embeds text internally at STORE/SEARCH time using a per-dataset EmbeddingConfig sidecar. Configure the provider before creating a dataset, since the provider and its parameters determine the embedding dimension the dataset must use.
Supported providers:
- openai (standard OpenAI or Azure OpenAI)
- testhash (deterministic, CI-friendly; no network)
Environment variables for OpenAI:
- Standard OpenAI: export OPENAI_API_KEY=sk-...
- Azure OpenAI: export AZURE_OPENAI_API_KEY=...
RESP examples:
```bash
# Standard OpenAI with default dims (model-dependent, e.g. 1536)
redis-cli -p 6379 LANCE.EMBEDDING CONFIG SET myset PROVIDER openai MODEL text-embedding-3-small

# OpenAI with reduced output dimension (e.g., 512) when supported
redis-cli -p 6379 LANCE.EMBEDDING CONFIG SET myset PROVIDER openai MODEL text-embedding-3-small PARAM dim 512

# Azure OpenAI (set env: AZURE_OPENAI_API_KEY)
redis-cli -p 6379 LANCE.EMBEDDING CONFIG SET myset PROVIDER openai MODEL text-embedding-3-small \
  PARAM use_azure true \
  PARAM azure_endpoint https://myresource.openai.azure.com \
  PARAM azure_deployment my-embed-deploy \
  PARAM azure_api_version 2024-02-15 \
  PARAM dim 512

# Deterministic test provider (no network, stable vectors)
redis-cli -p 6379 LANCE.EMBEDDING CONFIG SET myset PROVIDER testhash MODEL any
```
Read config:
```bash
redis-cli -p 6379 LANCE.EMBEDDING CONFIG GET myset
# → JSON blob describing provider/model/params
```
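The exact fields of that blob are defined by the implementation (see [src/embedding.rs](src/embedding.rs)); the commented line below is only a hypothetical sketch of the kind of information it carries.
```bash
# Hypothetical shape only; field names are assumptions, not the actual output:
# {"provider":"openai","model":"text-embedding-3-small","params":{"dim":"512"}}
```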
JSON-RPC examples:
```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "herodb_lanceSetEmbeddingConfig",
  "params": [
    1,
    "myset",
    "openai",
    "text-embedding-3-small",
    { "dim": "512" }
  ]
}
```
```json
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "herodb_lanceGetEmbeddingConfig",
  "params": [1, "myset"]
}
```
## 2) Create a dataset
Choose a dimension that matches your embedding configuration. For OpenAI text-embedding-3-small without a dimension override, the default is 1536; when `dim` is set (e.g., 512), use that value. The current API requires an explicit DIM.
RESP:
```bash
redis-cli -p 6379 LANCE.CREATE myset DIM 512
# → OK
```
JSON-RPC:
```json
{
  "jsonrpc": "2.0",
  "id": 4,
  "method": "herodb_lanceCreate",
  "params": [1, "myset", 512]
}
```
## 3) Store text documents (server-side embedding)
Provide an id, the text to embed, and optional META key/value pairs. The server computes the embedding using the configured provider and stores id/vector/text/meta in the Lance dataset. Upserts by id are supported via delete-then-append semantics.
RESP:
```bash
redis-cli -p 6379 LANCE.STORE myset ID doc-1 TEXT "Hello vector world" META title "Hello" category "demo"
# → OK
```
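Because upserts use delete-then-append semantics, re-running LANCE.STORE with an existing ID replaces that document:
```bash
# Same ID as above: the previous row for doc-1 is deleted and the new one appended
redis-cli -p 6379 LANCE.STORE myset ID doc-1 TEXT "Hello vector world, revised" META title "Hello v2" category "demo"
# → OK
```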
JSON-RPC:
```json
{
  "jsonrpc": "2.0",
  "id": 5,
  "method": "herodb_lanceStoreText",
  "params": [
    1,
    "myset",
    "doc-1",
    "Hello vector world",
    { "title": "Hello", "category": "demo" }
  ]
}
```
## 4) Search with a text query
Provide a query string; the server embeds it and performs a KNN search. Optionally add a FILTER expression over meta fields and a RETURN clause to select a subset of fields.
RESP:
```bash
# K nearest neighbors for the query text
redis-cli -p 6379 LANCE.SEARCH myset K 5 QUERY "greetings to vectors"
# → Array of hits: [id, score, [k,v, ...]] pairs, lower score = closer
# With a filter on meta fields and return only title
redis-cli -p 6379 LANCE.SEARCH myset K 3 QUERY "greetings to vectors" FILTER "category = 'demo'" RETURN 1 title
```
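Replies come back as nested RESP arrays. The commented lines below sketch how redis-cli might render a single hit; the values are invented and the exact formatting depends on the implementation (see [src/cmd.rs](src/cmd.rs)).
```bash
# Hypothetical rendering of one hit (id, score, returned fields); lower score = closer
# 1) 1) "doc-1"
#    2) "0.4321"
#    3) 1) "title"
#       2) "Hello"
```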
JSON-RPC:
```json
{
  "jsonrpc": "2.0",
  "id": 6,
  "method": "herodb_lanceSearchText",
  "params": [1, "myset", "greetings to vectors", 5, null, null]
}
```
With filter and selected fields:
```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "herodb_lanceSearchText",
  "params": [1, "myset", "greetings to vectors", 3, "category = 'demo'", ["title"]]
}
```
Response shape:
- RESP over redis-cli: an array of hits [id, score, [k, v, ...]].
- JSON-RPC: the result either wraps the RESP-encoded wire-format string or is a structured value, depending on the implementation; see [src/rpc.rs](src/rpc.rs) for details.
## 5) Create an index (placeholder)
Index creation currently returns OK but is a no-op. It will integrate Lance vector indices in a future update.
RESP:
```bash
redis-cli -p 6379 LANCE.CREATEINDEX myset TYPE "ivf_pq" PARAM nlist 100 PARAM pq_m 16
# → OK (no-op for now)
```
JSON-RPC:
```json
{
  "jsonrpc": "2.0",
  "id": 8,
  "method": "herodb_lanceCreateIndex",
  "params": [1, "myset", "ivf_pq", { "nlist": "100", "pq_m": "16" }]
}
```
## 6) Inspect datasets
RESP:
```bash
# List datasets in current Lance DB
redis-cli -p 6379 LANCE.LIST
# Get dataset info
redis-cli -p 6379 LANCE.INFO myset
```
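As a rough, hypothetical illustration (the actual reply layout is defined in [src/cmd.rs](src/cmd.rs)), LANCE.LIST returns the dataset names known to the current Lance DB:
```bash
# Hypothetical output after creating "myset"; exact formatting may differ
# 1) "myset"
```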
JSON-RPC:
```json
{
  "jsonrpc": "2.0",
  "id": 9,
  "method": "herodb_lanceList",
  "params": [1]
}
```
```json
{
  "jsonrpc": "2.0",
  "id": 10,
  "method": "herodb_lanceInfo",
  "params": [1, "myset"]
}
```
## 7) Delete and drop
RESP:
```bash
# Delete by id
redis-cli -p 6379 LANCE.DEL myset doc-1
# → OK
# Drop the entire dataset
redis-cli -p 6379 LANCE.DROP myset
# → OK
```
JSON-RPC:
```json
{
  "jsonrpc": "2.0",
  "id": 11,
  "method": "herodb_lanceDel",
  "params": [1, "myset", "doc-1"]
}
```
```json
{
  "jsonrpc": "2.0",
  "id": 12,
  "method": "herodb_lanceDrop",
  "params": [1, "myset"]
}
```
## 8) End-to-end example (RESP)
```bash
# 1. Select Lance DB (assume db_id=1 created via RPC)
redis-cli -p 6379 SELECT 1

# 2. Configure embedding provider (OpenAI small model at 512 dims)
redis-cli -p 6379 LANCE.EMBEDDING CONFIG SET myset PROVIDER openai MODEL text-embedding-3-small PARAM dim 512

# 3. Create dataset
redis-cli -p 6379 LANCE.CREATE myset DIM 512

# 4. Store documents
redis-cli -p 6379 LANCE.STORE myset ID doc-1 TEXT "The quick brown fox jumps over the lazy dog" META title "Fox" category "animal"
redis-cli -p 6379 LANCE.STORE myset ID doc-2 TEXT "A fast auburn fox vaulted a sleepy canine" META title "Fox paraphrase" category "animal"

# 5. Search
redis-cli -p 6379 LANCE.SEARCH myset K 2 QUERY "quick brown fox" RETURN 1 title

# 6. Dataset info and listing
redis-cli -p 6379 LANCE.INFO myset
redis-cli -p 6379 LANCE.LIST

# 7. Delete and drop
redis-cli -p 6379 LANCE.DEL myset doc-2
redis-cli -p 6379 LANCE.DROP myset
```
## 9) End-to-end example (JSON-RPC)
This walkthrough assumes the JSON-RPC server listens on port 8080. Replace database ids and ports as needed.
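Each request below can be sent with curl, assuming the RPC endpoint accepts JSON-RPC 2.0 over HTTP POST at the server root (a sketch only; adjust the URL, path, and transport to your deployment):
```bash
# Sketch: transport and path are assumptions; substitute any request body from this section
curl -s -X POST http://127.0.0.1:8080 \
  -H 'Content-Type: application/json' \
  -d '{"jsonrpc":"2.0","id":100,"method":"herodb_createDatabase","params":["Lance",{"name":"vectors-db","storage_path":null,"max_size":null,"redis_version":null},null]}'
```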
1) Create Lance DB:
```json
{
  "jsonrpc": "2.0",
  "id": 100,
  "method": "herodb_createDatabase",
  "params": ["Lance", { "name": "vectors-db", "storage_path": null, "max_size": null, "redis_version": null }, null]
}
```
2) Set embedding config:
```json
{
  "jsonrpc": "2.0",
  "id": 101,
  "method": "herodb_lanceSetEmbeddingConfig",
  "params": [1, "myset", "openai", "text-embedding-3-small", { "dim": "512" }]
}
```
3) Create dataset:
```json
{
  "jsonrpc": "2.0",
  "id": 102,
  "method": "herodb_lanceCreate",
  "params": [1, "myset", 512]
}
```
4) Store text:
```json
{
  "jsonrpc": "2.0",
  "id": 103,
  "method": "herodb_lanceStoreText",
  "params": [1, "myset", "doc-1", "The quick brown fox jumps over the lazy dog", { "title": "Fox", "category": "animal" }]
}
```
5) Search text:
```json
{
  "jsonrpc": "2.0",
  "id": 104,
  "method": "herodb_lanceSearchText",
  "params": [1, "myset", "quick brown fox", 2, null, ["title"]]
}
```
6) Info/list:
```json
{
  "jsonrpc": "2.0",
  "id": 105,
  "method": "herodb_lanceInfo",
  "params": [1, "myset"]
}
```
```json
{
  "jsonrpc": "2.0",
  "id": 106,
  "method": "herodb_lanceList",
  "params": [1]
}
```
7) Delete/drop:
```json
{
  "jsonrpc": "2.0",
  "id": 107,
  "method": "herodb_lanceDel",
  "params": [1, "myset", "doc-1"]
}
```
```json
{
  "jsonrpc": "2.0",
  "id": 108,
  "method": "herodb_lanceDrop",
  "params": [1, "myset"]
}
```
## 10) Operational notes and troubleshooting
- If using OpenAI and you see “missing API key env”, set:
  - Standard: `export OPENAI_API_KEY=sk-...`
  - Azure: `export AZURE_OPENAI_API_KEY=...` and pass `use_azure true`, `azure_endpoint`, `azure_deployment`, `azure_api_version`.
- Dimension mismatch:
  - Ensure the dataset DIM equals the provider's embedding dimension. For OpenAI text-embedding-3 models, set `PARAM dim 512` (or another supported size) and use that same DIM for `LANCE.CREATE`.
- DB 0 restriction:
  - Lance is not allowed on DB 0. Use db_id >= 1.
- Permissions:
  - Read operations (SEARCH, LIST, INFO) require read permission.
  - Mutations (CREATE, STORE, CREATEINDEX, DEL, DROP, EMBEDDING CONFIG SET) require readwrite permission.
- Backend gating:
  - On Lance DBs, only LANCE.* commands are accepted (plus basic control).
- Current index behavior:
  - `LANCE.CREATEINDEX` returns OK but is a no-op. Future versions will integrate Lance vector indices.
- Implementation files for reference:
  - [src/lance_store.rs](src/lance_store.rs), [src/cmd.rs](src/cmd.rs), [src/rpc.rs](src/rpc.rs), [src/server.rs](src/server.rs), [src/embedding.rs](src/embedding.rs)