# rhailib: Distributed Rhai Scripting for HeroModels

`rhailib` provides a robust infrastructure for executing Rhai scripts in a distributed manner, primarily designed to integrate with and extend the HeroModels ecosystem. It enables dynamic scripting, computation offloading, and flexible automation.
## Overview
The `rhailib` system is composed of three main components working together, leveraging Redis for task queuing and state management:

- **Rhai Engine (`src/engine`)**: The core of the scripting capability. This crate provides a Rhai engine pre-configured with various HeroModels modules (e.g., Calendar, Flow, Legal), so scripts executed within it can interact directly with HeroModels data and logic. The engine is used by the `rhai_worker` to process tasks.
- **Rhai Client (`src/client`)**: An interface for applications to submit Rhai scripts as tasks to the distributed execution system. Clients can send scripts to named Redis queues (referred to as "circles"), optionally wait for results, and handle timeouts.
- **Rhai Worker (`src/worker`)**: An executable that listens on one or more Redis queues ("circles") for incoming tasks. When a task (a Rhai script) arrives, the worker fetches its details, uses the `rhai_engine` to execute the script, and then writes the task's status and result back to Redis. Multiple worker instances can be deployed to scale script execution.
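For orientation, here is a minimal sketch of the underlying `rhai` crate workflow that `rhai_engine` builds on: create an engine, register host functions, and evaluate a script. The `calendar_event_count` function is a hypothetical placeholder standing in for the real HeroModels modules.

```rust
use rhai::{Engine, EvalAltResult};

fn main() -> Result<(), Box<EvalAltResult>> {
    // A bare upstream Rhai engine; `rhai_engine` would additionally register
    // the HeroModels modules (Calendar, Flow, Legal, ...).
    let mut engine = Engine::new();

    // Hypothetical host function standing in for a HeroModels module call.
    engine.register_fn("calendar_event_count", || 3_i64);

    // Scripts can call the registered function like any native Rhai function.
    let result: i64 = engine.eval("let n = calendar_event_count(); n * 2")?;
    println!("script returned {result}");
    Ok(())
}
```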
## Architecture & Workflow
The typical workflow is as follows:
- **Task Submission**: An application using `rhai_client` submits a Rhai script to a specific Redis list (e.g., `rhai:queue:my_circle`). Task details, including the script and its status, are stored in a Redis hash (see the client-side sketch after this list).
- **Task Consumption**: A `rhai_worker` instance, configured to listen to `rhai:queue:my_circle`, picks up the task ID from the queue using a blocking pop operation.
- **Script Execution**: The worker retrieves the script from Redis and executes it using an instance of the `rhai_engine`, which provides the necessary HeroModels context for the script.
- **Result Storage**: Upon completion (or error), the worker updates the task's status (e.g., `completed`, `failed`) and stores any return value or error message in the corresponding Redis hash.
- **Result Retrieval (optional)**: The `rhai_client` can poll the Redis hash for the task's status and retrieve the result once available.
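The client-side half of this flow can be sketched directly against the `redis` crate. The hash key pattern (`rhai:task:<id>`) and the field names (`script`, `status`, `output`) are illustrative assumptions; the real `rhai_client` hides the exact layout behind its own API.

```rust
use redis::Commands;

fn main() -> redis::RedisResult<()> {
    let client = redis::Client::open("redis://127.0.0.1/")?;
    let mut con = client.get_connection()?;

    // Normally a freshly generated unique ID (e.g. a UUID).
    let task_id = "task-123";
    let task_key = format!("rhai:task:{task_id}"); // assumed key layout

    // 1. Store the task details in a Redis hash.
    let _: () = con.hset_multiple(&task_key, &[("script", "40 + 2"), ("status", "pending")])?;

    // 2. Push the task ID onto the circle's queue for a worker to pick up.
    let _: () = con.lpush("rhai:queue:my_circle", task_id)?;

    // 3. Optionally poll the hash until a worker marks the task finished.
    loop {
        let status: Option<String> = con.hget(&task_key, "status")?;
        match status.as_deref() {
            Some("completed") | Some("failed") => {
                let output: Option<String> = con.hget(&task_key, "output")?;
                println!("task {task_id}: {status:?} -> {output:?}");
                break;
            }
            _ => std::thread::sleep(std::time::Duration::from_millis(200)),
        }
    }
    Ok(())
}
```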
This architecture allows for:
- Asynchronous script execution.
- Scalable processing of Rhai scripts by running multiple workers (a worker-loop sketch follows this list).
- Decoupling of script submission from execution.
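A correspondingly minimal worker loop, reusing the same assumed key and field names and substituting a bare `rhai::Engine` for the HeroModels-aware `rhai_engine`:

```rust
use redis::Commands;
use rhai::Engine;

fn main() -> redis::RedisResult<()> {
    let client = redis::Client::open("redis://127.0.0.1/")?;
    let mut con = client.get_connection()?;
    let engine = Engine::new(); // stand-in for the HeroModels-aware rhai_engine

    loop {
        // Block until a task ID arrives on the circle's queue (timeout 0 = wait forever).
        let (_queue, task_id): (String, String) = redis::cmd("BLPOP")
            .arg("rhai:queue:my_circle")
            .arg(0)
            .query(&mut con)?;

        let task_key = format!("rhai:task:{task_id}"); // assumed key layout
        let script: String = con.hget(&task_key, "script")?;

        // Execute the script and record either its result or the error message.
        let (status, output) = match engine.eval::<rhai::Dynamic>(&script) {
            Ok(value) => ("completed", value.to_string()),
            Err(err) => ("failed", err.to_string()),
        };
        let _: () = con.hset_multiple(&task_key, &[("status", status), ("output", output.as_str())])?;
    }
}
```

Because workers only compete for task IDs via `BLPOP`, additional worker processes increase throughput without any extra coordination.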
## Project Structure
The core components are organized as separate crates within the `src/` directory:

- `src/client/`: contains the `rhai_client` library.
- `src/engine/`: contains the `rhai_engine` library.
- `src/worker/`: contains the `rhai_worker` library and its executable.
Each of these directories contains its own `README.md` file with more detailed information about its specific functionality, setup, and usage.
## Getting Started
To work with this project:
- Ensure you have Rust and Cargo installed.
- A running Redis instance is required for the `client` and `worker` components to communicate (a quick connectivity check is sketched after this list).
- Explore the individual README files in `src/client/`, `src/worker/`, and `src/engine/` for detailed instructions on building, configuring, and running each component.
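As a quick sanity check of the Redis prerequisite, the following snippet (assuming Redis on the default local address) issues a `PING` using the `redis` crate:

```rust
fn main() -> redis::RedisResult<()> {
    // Adjust the URL if your Redis instance is not on the default local port.
    let client = redis::Client::open("redis://127.0.0.1:6379/")?;
    let mut con = client.get_connection()?;

    // A healthy server answers PING with "PONG".
    let pong: String = redis::cmd("PING").query(&mut con)?;
    println!("Redis responded: {pong}");
    Ok(())
}
```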
You can typically build all components using:
```bash
cargo build --workspace
```
Or build and run specific examples or binaries as detailed in their respective READMEs.
## Purpose
`rhailib` aims to provide a flexible and powerful way to extend applications with custom logic written in Rhai, executed in a controlled and scalable environment. This is particularly useful for tasks such as:
- Implementing dynamic business rules.
- Automating processes.
- Running background computations.
- Customizing application behavior without recompilation.